* [PATCH v5 01/23] dts: code adjustments for doc generation
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-08 13:35 ` Yoan Picchi
2023-11-06 17:15 ` [PATCH v5 02/23] dts: add docstring checker Juraj Linkeš
` (22 subsequent siblings)
23 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
The standard Python tool for generating API documentation, Sphinx,
imports modules one-by-one when generating the documentation. This
requires code changes:
* argument parsing is now properly guarded in the if __name__ == '__main__'
  block,
* the logger used by the DTS runner underwent the same treatment so that it
  doesn't create log files outside of a DTS run,
* DTS uses the parsed arguments to construct an object holding global
  variables; the defaults for those global variables were moved out of
  argument parsing into the settings object,
* importing the remote_session module from framework resulted in circular
  imports because one module tried to import another; this is fixed by
  reorganizing the code,
* some further code reorganization was done because the resulting structure
  makes more sense, improving documentation clarity.
There are some other documentation-related changes:
* added missing type annotations so they appear in the generated docs,
* reordered arguments in some methods,
* removed superfluous arguments and attributes,
* changed some functions/methods/attributes from public to private and vice-versa.
All of the above appear in the generated documentation and improve it.
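The annotation and visibility tweaks matter because Sphinx autodoc renders
annotations in generated signatures and skips underscore-prefixed members by
default. A hedged sketch (hypothetical, simplified classes, not the actual
DTS code):

```python
from dataclasses import dataclass


@dataclass
class NodeInfo:
    """Hypothetical stand-in for a DTS config class."""

    os_name: str
    os_version: str


class ExecutionResult:
    """Sphinx autodoc documents public members; the leading underscore on
    _errors keeps it out of the generated docs by default."""

    def __init__(self) -> None:
        self.sut_os_name = ""
        self.sut_os_version = ""
        self._errors: list[Exception] = []  # private: hidden from docs

    def add_sut_info(self, sut_info: NodeInfo) -> None:
        # The explicit "-> None" shows up in the rendered signature.
        self.sut_os_name = sut_info.os_name
        self.sut_os_version = sut_info.os_version

    def add_error(self, error: Exception) -> None:
        # Annotating "error: Exception" (previously untyped) documents
        # what callers may pass.
        self._errors.append(error)
```

Untyped parameters and missing return annotations simply render as blank in
autodoc output, which is why the patch adds them throughout.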
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 10 ++-
dts/framework/dts.py | 33 +++++--
dts/framework/exception.py | 54 +++++-------
dts/framework/remote_session/__init__.py | 41 ++++-----
.../interactive_remote_session.py | 0
.../{remote => }/interactive_shell.py | 0
.../{remote => }/python_shell.py | 0
.../remote_session/remote/__init__.py | 27 ------
.../{remote => }/remote_session.py | 0
.../{remote => }/ssh_session.py | 12 +--
.../{remote => }/testpmd_shell.py | 0
dts/framework/settings.py | 87 +++++++++++--------
dts/framework/test_result.py | 4 +-
dts/framework/test_suite.py | 7 +-
dts/framework/testbed_model/__init__.py | 12 +--
dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
dts/framework/testbed_model/hw/__init__.py | 27 ------
.../linux_session.py | 6 +-
dts/framework/testbed_model/node.py | 26 ++++--
.../os_session.py | 22 ++---
dts/framework/testbed_model/{hw => }/port.py | 0
.../posix_session.py | 4 +-
dts/framework/testbed_model/sut_node.py | 8 +-
dts/framework/testbed_model/tg_node.py | 30 +------
.../traffic_generator/__init__.py | 24 +++++
.../capturing_traffic_generator.py | 6 +-
.../{ => traffic_generator}/scapy.py | 23 ++---
.../traffic_generator.py | 16 +++-
.../testbed_model/{hw => }/virtual_device.py | 0
dts/framework/utils.py | 46 +++-------
dts/main.py | 9 +-
31 files changed, 259 insertions(+), 288 deletions(-)
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
delete mode 100644 dts/framework/remote_session/remote/__init__.py
rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
rename dts/framework/testbed_model/{hw => }/port.py (100%)
rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index cb7e00ba34..2044c82611 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -17,6 +17,7 @@
import warlock # type: ignore[import]
import yaml
+from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict):
+ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
# This looks useless now, but is designed to allow expansion to traffic
# generators that require more configuration later.
match TrafficGeneratorType(d["type"]):
@@ -97,6 +98,10 @@ def from_dict(d: dict):
return ScapyTrafficGeneratorConfig(
traffic_generator_type=TrafficGeneratorType.SCAPY
)
+ case _:
+ raise ConfigurationError(
+ f'Unknown traffic generator type "{d["type"]}".'
+ )
@dataclass(slots=True, frozen=True)
@@ -324,6 +329,3 @@ def load_config() -> Configuration:
config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
config_obj: Configuration = Configuration.from_dict(dict(config))
return config_obj
-
-
-CONFIGURATION = load_config()
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index f773f0c38d..4c7fb0c40a 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -6,19 +6,19 @@
import sys
from .config import (
- CONFIGURATION,
BuildTargetConfiguration,
ExecutionConfiguration,
TestSuiteConfig,
+ load_config,
)
from .exception import BlockingTestSuiteError
from .logger import DTSLOG, getLogger
from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
from .test_suite import get_test_suites
from .testbed_model import SutNode, TGNode
-from .utils import check_dts_python_version
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG = None # type: ignore[assignment]
result: DTSResult = DTSResult(dts_logger)
@@ -30,14 +30,18 @@ def run_all() -> None:
global dts_logger
global result
+ # create a regular DTS logger and create a new result with it
+ dts_logger = getLogger("DTSRunner")
+ result = DTSResult(dts_logger)
+
# check the python version of the server that run dts
- check_dts_python_version()
+ _check_dts_python_version()
sut_nodes: dict[str, SutNode] = {}
tg_nodes: dict[str, TGNode] = {}
try:
# for all Execution sections
- for execution in CONFIGURATION.executions:
+ for execution in load_config().executions:
sut_node = sut_nodes.get(execution.system_under_test_node.name)
tg_node = tg_nodes.get(execution.traffic_generator_node.name)
@@ -82,6 +86,25 @@ def run_all() -> None:
_exit_dts()
+def _check_dts_python_version() -> None:
+ def RED(text: str) -> str:
+ return f"\u001B[31;1m{str(text)}\u001B[0m"
+
+ if sys.version_info.major < 3 or (
+ sys.version_info.major == 3 and sys.version_info.minor < 10
+ ):
+ print(
+ RED(
+ (
+ "WARNING: DTS execution node's python version is lower than"
+ "python 3.10, is deprecated and will not work in future releases."
+ )
+ ),
+ file=sys.stderr,
+ )
+ print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+
+
def _run_execution(
sut_node: SutNode,
tg_node: TGNode,
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 001a5a5496..7489c03570 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
Command execution timeout.
"""
- command: str
- output: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _command: str
- def __init__(self, command: str, output: str):
- self.command = command
- self.output = output
+ def __init__(self, command: str):
+ self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self.command}"
-
- def get_output(self) -> str:
- return self.output
+ return f"TIMEOUT on {self._command}"
class SSHConnectionError(DTSError):
@@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
SSH connection error.
"""
- host: str
- errors: list[str]
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
+ _errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
- self.host = host
- self.errors = [] if errors is None else errors
+ self._host = host
+ self._errors = [] if errors is None else errors
def __str__(self) -> str:
- message = f"Error trying to connect with {self.host}."
- if self.errors:
- message += f" Errors encountered while retrying: {', '.join(self.errors)}"
+ message = f"Error trying to connect with {self._host}."
+ if self._errors:
+ message += f" Errors encountered while retrying: {', '.join(self._errors)}"
return message
@@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
It can no longer be used.
"""
- host: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
def __init__(self, host: str):
- self.host = host
+ self._host = host
def __str__(self) -> str:
- return f"SSH session with {self.host} has died"
+ return f"SSH session with {self._host} has died"
class ConfigurationError(DTSError):
@@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
Raised when a command executed on a Node returns a non-zero exit status.
"""
- command: str
- command_return_code: int
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ command: str
+ _command_return_code: int
def __init__(self, command: str, command_return_code: int):
self.command = command
- self.command_return_code = command_return_code
+ self._command_return_code = command_return_code
def __str__(self) -> str:
return (
f"Command {self.command} returned a non-zero exit code: "
- f"{self.command_return_code}"
+ f"{self._command_return_code}"
)
@@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
Used in test cases to verify the expected behavior.
"""
- value: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
- def __init__(self, value: str):
- self.value = value
-
- def __str__(self) -> str:
- return repr(self.value)
-
class BlockingTestSuiteError(DTSError):
- suite_name: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
+ _suite_name: str
def __init__(self, suite_name: str) -> None:
- self.suite_name = suite_name
+ self._suite_name = suite_name
def __str__(self) -> str:
- return f"Blocking suite {self.suite_name} failed."
+ return f"Blocking suite {self._suite_name} failed."
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 00b6d1f03a..5e7ddb2b05 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -12,29 +12,24 @@
# pylama:ignore=W0611
-from framework.config import OS, NodeConfiguration
-from framework.exception import ConfigurationError
+from framework.config import NodeConfiguration
from framework.logger import DTSLOG
-from .linux_session import LinuxSession
-from .os_session import InteractiveShellType, OSSession
-from .remote import (
- CommandResult,
- InteractiveRemoteSession,
- InteractiveShell,
- PythonShell,
- RemoteSession,
- SSHSession,
- TestPmdDevice,
- TestPmdShell,
-)
-
-
-def create_session(
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
+from .python_shell import PythonShell
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
+
+
+def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> OSSession:
- match node_config.os:
- case OS.linux:
- return LinuxSession(node_config, name, logger)
- case _:
- raise ConfigurationError(f"Unsupported OS {node_config.os}")
+) -> RemoteSession:
+ return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+ node_config: NodeConfiguration, logger: DTSLOG
+) -> InteractiveRemoteSession:
+ return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_remote_session.py
rename to dts/framework/remote_session/interactive_remote_session.py
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_shell.py
rename to dts/framework/remote_session/interactive_shell.py
diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/python_shell.py
rename to dts/framework/remote_session/python_shell.py
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 06403691a5..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-# Copyright(c) 2023 University of New Hampshire
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .interactive_remote_session import InteractiveRemoteSession
-from .interactive_shell import InteractiveShell
-from .python_shell import PythonShell
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-from .testpmd_shell import TestPmdDevice, TestPmdShell
-
-
-def create_remote_session(
- node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
- return SSHSession(node_config, name, logger)
-
-
-def create_interactive_session(
- node_config: NodeConfiguration, logger: DTSLOG
-) -> InteractiveRemoteSession:
- return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 91%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
index 8d127f1601..cee11d14d6 100644
--- a/dts/framework/remote_session/remote/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -18,9 +18,7 @@
SSHException,
)
-from framework.config import NodeConfiguration
from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
-from framework.logger import DTSLOG
from .remote_session import CommandResult, RemoteSession
@@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
session: Connection
- def __init__(
- self,
- node_config: NodeConfiguration,
- session_name: str,
- logger: DTSLOG,
- ):
- super(SSHSession, self).__init__(node_config, session_name, logger)
-
def _connect(self) -> None:
errors = []
retry_attempts = 10
@@ -117,7 +107,7 @@ def _send_command(
except CommandTimedOut as e:
self._logger.exception(e)
- raise SSHTimeoutError(command, e.result.stderr) from e
+ raise SSHTimeoutError(command) from e
return CommandResult(
self.name, command, output.stdout, output.stderr, output.return_code
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/testpmd_shell.py
rename to dts/framework/remote_session/testpmd_shell.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index cfa39d011b..7f5841d073 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, TypeVar
@@ -22,8 +22,8 @@ def __init__(
option_strings: Sequence[str],
dest: str,
nargs: str | int | None = None,
- const: str | None = None,
- default: str = None,
+ const: bool | None = None,
+ default: Any = None,
type: Callable[[str], _T | argparse.FileType | None] = None,
choices: Iterable[_T] | None = None,
required: bool = False,
@@ -32,6 +32,12 @@ def __init__(
) -> None:
env_var_value = os.environ.get(env_var)
default = env_var_value or default
+ if const is not None:
+ nargs = 0
+ default = const if env_var_value else default
+ type = None
+ choices = None
+ metavar = None
super(_EnvironmentArgument, self).__init__(
option_strings,
dest,
@@ -52,22 +58,28 @@ def __call__(
values: Any,
option_string: str = None,
) -> None:
- setattr(namespace, self.dest, values)
+ if self.const is not None:
+ setattr(namespace, self.dest, self.const)
+ else:
+ setattr(namespace, self.dest, values)
return _EnvironmentArgument
-@dataclass(slots=True, frozen=True)
-class _Settings:
- config_file_path: str
- output_dir: str
- timeout: float
- verbose: bool
- skip_setup: bool
- dpdk_tarball_path: Path
- compile_timeout: float
- test_cases: list
- re_run: int
+@dataclass(slots=True)
+class Settings:
+ config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ output_dir: str = "output"
+ timeout: float = 15
+ verbose: bool = False
+ skip_setup: bool = False
+ dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ compile_timeout: float = 1200
+ test_cases: list[str] = field(default_factory=list)
+ re_run: int = 0
+
+
+SETTINGS: Settings = Settings()
def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--config-file",
action=_env_arg("DTS_CFG_FILE"),
- default="conf.yaml",
+ default=SETTINGS.config_file_path,
+ type=Path,
help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
"and targets.",
)
@@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--output-dir",
"--output",
action=_env_arg("DTS_OUTPUT_DIR"),
- default="output",
+ default=SETTINGS.output_dir,
help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
)
@@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
"-t",
"--timeout",
action=_env_arg("DTS_TIMEOUT"),
- default=15,
+ default=SETTINGS.timeout,
type=float,
help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
"compiling DPDK.",
@@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
"-v",
"--verbose",
action=_env_arg("DTS_VERBOSE"),
- default="N",
- help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
+ default=SETTINGS.verbose,
+ const=True,
+ help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
"to the console.",
)
@@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
"-s",
"--skip-setup",
action=_env_arg("DTS_SKIP_SETUP"),
- default="N",
- help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
+ const=True,
+ help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
)
parser.add_argument(
@@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--snapshot",
"--git-ref",
action=_env_arg("DTS_DPDK_TARBALL"),
- default="dpdk.tar.xz",
+ default=SETTINGS.dpdk_tarball_path,
type=Path,
help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
"tag ID or tree ID to test. To test local changes, first commit them, "
@@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--compile-timeout",
action=_env_arg("DTS_COMPILE_TIMEOUT"),
- default=1200,
+ default=SETTINGS.compile_timeout,
type=float,
help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
)
@@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--re-run",
"--re_run",
action=_env_arg("DTS_RERUN"),
- default=0,
+ default=SETTINGS.re_run,
type=int,
help="[DTS_RERUN] Re-run each test case the specified amount of times "
"if a test failure occurs",
@@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
return parser
-def _get_settings() -> _Settings:
+def get_settings() -> Settings:
parsed_args = _get_parser().parse_args()
- return _Settings(
+ return Settings(
config_file_path=parsed_args.config_file,
output_dir=parsed_args.output_dir,
timeout=parsed_args.timeout,
- verbose=(parsed_args.verbose == "Y"),
- skip_setup=(parsed_args.skip_setup == "Y"),
+ verbose=parsed_args.verbose,
+ skip_setup=parsed_args.skip_setup,
dpdk_tarball_path=Path(
- DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
- )
- if not os.path.exists(parsed_args.tarball)
- else Path(parsed_args.tarball),
+ Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
+ if not os.path.exists(parsed_args.tarball)
+ else Path(parsed_args.tarball)
+ ),
compile_timeout=parsed_args.compile_timeout,
- test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+ test_cases=(
+ parsed_args.test_cases.split(",") if parsed_args.test_cases else []
+ ),
re_run=parsed_args.re_run,
)
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index f0fbe80f6f..603e18872c 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -254,7 +254,7 @@ def add_build_target(
self._inner_results.append(build_target_result)
return build_target_result
- def add_sut_info(self, sut_info: NodeInfo):
+ def add_sut_info(self, sut_info: NodeInfo) -> None:
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
@@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
self._inner_results.append(execution_result)
return execution_result
- def add_error(self, error) -> None:
+ def add_error(self, error: Exception) -> None:
self._errors.append(error)
def process(self) -> None:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 3b890c0451..d53553bf34 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -11,7 +11,7 @@
import re
from ipaddress import IPv4Interface, IPv6Interface, ip_interface
from types import MethodType
-from typing import Union
+from typing import Any, Union
from scapy.layers.inet import IP # type: ignore[import]
from scapy.layers.l2 import Ether # type: ignore[import]
@@ -26,8 +26,7 @@
from .logger import DTSLOG, getLogger
from .settings import SETTINGS
from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
-from .testbed_model import SutNode, TGNode
-from .testbed_model.hw.port import Port, PortLink
+from .testbed_model import Port, PortLink, SutNode, TGNode
from .utils import get_packet_summaries
@@ -453,7 +452,7 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
- def is_test_suite(object) -> bool:
+ def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
return True
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 5cbb859e47..8ced05653b 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,15 +9,9 @@
# pylama:ignore=W0611
-from .hw import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
- VirtualDevice,
- lcore_filter,
-)
+from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
from .node import Node
+from .port import Port, PortLink
from .sut_node import SutNode
from .tg_node import TGNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/testbed_model/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
)
return filtered_lcores
+
+
+def lcore_filter(
+ core_list: list[LogicalCore],
+ filter_specifier: LogicalCoreCount | LogicalCoreList,
+ ascending: bool,
+) -> LogicalCoreFilter:
+ if isinstance(filter_specifier, LogicalCoreList):
+ return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+ elif isinstance(filter_specifier, LogicalCoreCount):
+ return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+ else:
+ raise ValueError(f"Unsupported filter {filter_specifier!r}")
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
- core_list: list[LogicalCore],
- filter_specifier: LogicalCoreCount | LogicalCoreList,
- ascending: bool,
-) -> LogicalCoreFilter:
- if isinstance(filter_specifier, LogicalCoreList):
- return LogicalCoreListFilter(core_list, filter_specifier, ascending)
- elif isinstance(filter_specifier, LogicalCoreCount):
- return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
- else:
- raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
similarity index 97%
rename from dts/framework/remote_session/linux_session.py
rename to dts/framework/testbed_model/linux_session.py
index a3f1a6bf3b..f472bb8f0f 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -9,10 +9,10 @@
from typing_extensions import NotRequired
from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
from framework.utils import expand_range
+from .cpu import LogicalCore
+from .port import Port
from .posix_session import PosixSession
@@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
lcores.append(LogicalCore(lcore, core, socket, node))
return lcores
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return dpdk_prefix
def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index fc01e0bf8e..7571e7b98d 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,23 +12,26 @@
from typing import Any, Callable, Type, Union
from framework.config import (
+ OS,
BuildTargetConfiguration,
ExecutionConfiguration,
NodeConfiguration,
)
+from framework.exception import ConfigurationError
from framework.logger import DTSLOG, getLogger
-from framework.remote_session import InteractiveShellType, OSSession, create_session
from framework.settings import SETTINGS
-from .hw import (
+from .cpu import (
LogicalCore,
LogicalCoreCount,
LogicalCoreList,
LogicalCoreListFilter,
- VirtualDevice,
lcore_filter,
)
-from .hw.port import Port
+from .linux_session import LinuxSession
+from .os_session import InteractiveShellType, OSSession
+from .port import Port
+from .virtual_device import VirtualDevice
class Node(ABC):
@@ -69,6 +72,7 @@ def __init__(self, node_config: NodeConfiguration):
def _init_ports(self) -> None:
self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
self.main_session.update_ports(self.ports)
+
for port in self.ports:
self.configure_port_state(port)
@@ -172,9 +176,9 @@ def create_interactive_shell(
return self.main_session.create_interactive_shell(
shell_cls,
- app_args,
timeout,
privileged,
+ app_args,
)
def filter_lcores(
@@ -205,7 +209,7 @@ def _get_remote_cpus(self) -> None:
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
- def _setup_hugepages(self):
+ def _setup_hugepages(self) -> None:
"""
Setup hugepages on the Node. Different architectures can supply different
amounts of memory for hugepages and numa-based hugepage allocation may need
@@ -249,3 +253,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
return lambda *args: None
else:
return func
+
+
+def create_session(
+ node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> OSSession:
+ match node_config.os:
+ case OS.linux:
+ return LinuxSession(node_config, name, logger)
+ case _:
+ raise ConfigurationError(f"Unsupported OS {node_config.os}")
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
similarity index 95%
rename from dts/framework/remote_session/os_session.py
rename to dts/framework/testbed_model/os_session.py
index 8a709eac1c..76e595a518 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -10,19 +10,19 @@
from framework.config import Architecture, NodeConfiguration, NodeInfo
from framework.logger import DTSLOG
-from framework.remote_session.remote import InteractiveShell
-from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
-from framework.utils import MesonArgs
-
-from .remote import (
+from framework.remote_session import (
CommandResult,
InteractiveRemoteSession,
+ InteractiveShell,
RemoteSession,
create_interactive_session,
create_remote_session,
)
+from framework.settings import SETTINGS
+from framework.utils import MesonArgs
+
+from .cpu import LogicalCore
+from .port import Port
InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
@@ -85,9 +85,9 @@ def send_command(
def create_interactive_shell(
self,
shell_cls: Type[InteractiveShellType],
- eal_parameters: str,
timeout: float,
privileged: bool,
+ app_args: str,
) -> InteractiveShellType:
"""
See "create_interactive_shell" in SutNode
@@ -96,7 +96,7 @@ def create_interactive_shell(
self.interactive_session.session,
self._logger,
self._get_privileged_command if privileged else None,
- eal_parameters,
+ app_args,
timeout,
)
@@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
"""
@abstractmethod
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
"""
Try to find DPDK remote dir in remote_dir.
"""
@@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
"""
@abstractmethod
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
"""
Get the DPDK file prefix that will be used when running DPDK apps.
"""
diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
similarity index 100%
rename from dts/framework/testbed_model/hw/port.py
rename to dts/framework/testbed_model/port.py
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
similarity index 98%
rename from dts/framework/remote_session/posix_session.py
rename to dts/framework/testbed_model/posix_session.py
index 5da0516e05..1d1d5b1b26 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
@@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
for dpdk_runtime_dir in dpdk_runtime_dirs:
self.remove_remote_dir(dpdk_runtime_dir)
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return ""
def get_compiler_version(self, compiler_name: str) -> str:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 202aebfd06..4e33cf02ea 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,12 +15,14 @@
NodeInfo,
SutNodeConfiguration,
)
-from framework.remote_session import CommandResult, InteractiveShellType, OSSession
+from framework.remote_session import CommandResult
from framework.settings import SETTINGS
from framework.utils import MesonArgs
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
+from .cpu import LogicalCoreCount, LogicalCoreList
from .node import Node
+from .os_session import InteractiveShellType, OSSession
+from .virtual_device import VirtualDevice
class EalParameters(object):
@@ -289,7 +291,7 @@ def create_eal_parameters(
prefix: str = "dpdk",
append_prefix_timestamp: bool = True,
no_pci: bool = False,
- vdevs: list[VirtualDevice] = None,
+ vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
"""
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 27025cfa31..166eb8430e 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -16,16 +16,11 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.config import (
- ScapyTrafficGeneratorConfig,
- TGNodeConfiguration,
- TrafficGeneratorType,
-)
-from framework.exception import ConfigurationError
-
-from .capturing_traffic_generator import CapturingTrafficGenerator
-from .hw.port import Port
+from framework.config import TGNodeConfiguration
+
from .node import Node
+from .port import Port
+from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
class TGNode(Node):
@@ -80,20 +75,3 @@ def close(self) -> None:
"""Free all resources used by the node"""
self.traffic_generator.close()
super(TGNode, self).close()
-
-
-def create_traffic_generator(
- tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
-) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
-
- from .scapy import ScapyTrafficGenerator
-
- match traffic_generator_config.traffic_generator_type:
- case TrafficGeneratorType.SCAPY:
- return ScapyTrafficGenerator(tg_node, traffic_generator_config)
- case _:
- raise ConfigurationError(
- "Unknown traffic generator: "
- f"{traffic_generator_config.traffic_generator_type}"
- )
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
new file mode 100644
index 0000000000..11bfa1ee0f
--- /dev/null
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -0,0 +1,24 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
+from framework.exception import ConfigurationError
+from framework.testbed_model.node import Node
+
+from .capturing_traffic_generator import CapturingTrafficGenerator
+from .scapy import ScapyTrafficGenerator
+
+
+def create_traffic_generator(
+ tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
+) -> CapturingTrafficGenerator:
+ """A factory function for creating traffic generator object from user config."""
+
+ match traffic_generator_config.traffic_generator_type:
+ case TrafficGeneratorType.SCAPY:
+ return ScapyTrafficGenerator(tg_node, traffic_generator_config)
+ case _:
+ raise ConfigurationError(
+ "Unknown traffic generator: "
+ f"{traffic_generator_config.traffic_generator_type}"
+ )
diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
similarity index 96%
rename from dts/framework/testbed_model/capturing_traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index ab98987f8e..e521211ef0 100644
--- a/dts/framework/testbed_model/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -16,9 +16,9 @@
from scapy.packet import Packet # type: ignore[import]
from framework.settings import SETTINGS
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
from .traffic_generator import TrafficGenerator
@@ -130,7 +130,9 @@ def _send_packets_and_capture(
for the specified duration. It must be able to handle no received packets.
"""
- def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
+ def _write_capture_from_packets(
+ self, capture_name: str, packets: list[Packet]
+ ) -> None:
file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
self._logger.debug(f"Writing packets to {file_name}.")
scapy.utils.wrpcap(file_name, packets)
diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
similarity index 95%
rename from dts/framework/testbed_model/scapy.py
rename to dts/framework/testbed_model/traffic_generator/scapy.py
index af0d4dbb25..51864b6e6b 100644
--- a/dts/framework/testbed_model/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -24,16 +24,15 @@
from scapy.packet import Packet # type: ignore[import]
from framework.config import OS, ScapyTrafficGeneratorConfig
-from framework.logger import DTSLOG, getLogger
from framework.remote_session import PythonShell
from framework.settings import SETTINGS
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from .capturing_traffic_generator import (
CapturingTrafficGenerator,
_get_default_capture_name,
)
-from .hw.port import Port
-from .tg_node import TGNode
"""
========= BEGIN RPC FUNCTIONS =========
@@ -146,7 +145,7 @@ def quit(self) -> None:
self._BaseServer__shutdown_request = True
return None
- def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
+ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
"""Add a function to the server.
This is meant to be executed remotely.
@@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
session: PythonShell
rpc_server_proxy: xmlrpc.client.ServerProxy
_config: ScapyTrafficGeneratorConfig
- _tg_node: TGNode
- _logger: DTSLOG
-
- def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
- self._config = config
- self._tg_node = tg_node
- self._logger = getLogger(
- f"{self._tg_node.name} {self._config.traffic_generator_type}"
- )
+
+ def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ super().__init__(tg_node, config)
assert (
self._tg_node.config.os == OS.linux
@@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
function_bytes = marshal.dumps(function.__code__)
self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
- def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
+ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
# load the source of the function
src = inspect.getsource(QuittableXMLRPCServer)
# Lines with only whitespace break the repl if in the middle of a function
@@ -280,7 +273,7 @@ def _send_packets_and_capture(
scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
return scapy_packets
- def close(self):
+ def close(self) -> None:
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
similarity index 80%
rename from dts/framework/testbed_model/traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 28c35d3ce4..ea7c3963da 100644
--- a/dts/framework/testbed_model/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -12,11 +12,12 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.logger import DTSLOG
+from framework.config import TrafficGeneratorConfig
+from framework.logger import DTSLOG, getLogger
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
-
class TrafficGenerator(ABC):
"""The base traffic generator.
@@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
Defines the few basic methods that each traffic generator must implement.
"""
+ _config: TrafficGeneratorConfig
+ _tg_node: Node
_logger: DTSLOG
+ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ self._config = config
+ self._tg_node = tg_node
+ self._logger = getLogger(
+ f"{self._tg_node.name} {self._config.traffic_generator_type}"
+ )
+
def send_packet(self, packet: Packet, port: Port) -> None:
"""Send a packet and block until it is fully sent.
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index d27c2c5b5f..f0c916471c 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -7,7 +7,6 @@
import json
import os
import subprocess
-import sys
from enum import Enum
from pathlib import Path
from subprocess import SubprocessError
@@ -16,35 +15,7 @@
from .exception import ConfigurationError
-
-class StrEnum(Enum):
- @staticmethod
- def _generate_next_value_(
- name: str, start: int, count: int, last_values: object
- ) -> str:
- return name
-
- def __str__(self) -> str:
- return self.name
-
-
-REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
-
-
-def check_dts_python_version() -> None:
- if sys.version_info.major < 3 or (
- sys.version_info.major == 3 and sys.version_info.minor < 10
- ):
- print(
- RED(
- (
- "WARNING: DTS execution node's python version is lower than"
- "python 3.10, is deprecated and will not work in future releases."
- )
- ),
- file=sys.stderr,
- )
- print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
def expand_range(range_str: str) -> list[int]:
@@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
return expanded_range
-def get_packet_summaries(packets: list[Packet]):
+def get_packet_summaries(packets: list[Packet]) -> str:
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
return f"Packet contents: \n{packet_summaries}"
-def RED(text: str) -> str:
- return f"\u001B[31;1m{str(text)}\u001B[0m"
+class StrEnum(Enum):
+ @staticmethod
+ def _generate_next_value_(
+ name: str, start: int, count: int, last_values: object
+ ) -> str:
+ return name
+
+ def __str__(self) -> str:
+ return self.name
class MesonArgs(object):
@@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
if self._tarball_path and os.path.exists(self._tarball_path):
os.remove(self._tarball_path)
- def __fspath__(self):
+ def __fspath__(self) -> str:
return str(self._tarball_path)
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..5d4714b0c3 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,17 @@
import logging
-from framework import dts
+from framework import settings
def main() -> None:
+ """Set DTS settings, then run DTS.
+
+ The DTS settings are taken from the command line arguments and the environment variables.
+ """
+ settings.SETTINGS = settings.get_settings()
+ from framework import dts
+
dts.run_all()
--
2.34.1
* Re: [PATCH v5 01/23] dts: code adjustments for doc generation
2023-11-06 17:15 ` [PATCH v5 01/23] dts: code adjustments for doc generation Juraj Linkeš
@ 2023-11-08 13:35 ` Yoan Picchi
2023-11-15 7:46 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-08 13:35 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek
Cc: dev
On 11/6/23 17:15, Juraj Linkeš wrote:
> The standard Python tool for generating API documentation, Sphinx,
> imports modules one-by-one when generating the documentation. This
> requires code changes:
> * properly guarding argument parsing in the if __name__ == '__main__'
> block,
> * the logger used by DTS runner underwent the same treatment so that it
> doesn't create log files outside of a DTS run,
> * however, DTS uses the arguments to construct an object holding global
> variables. The defaults for the global variables needed to be moved
> out of argument parsing,
> * importing the remote_session module from framework resulted in
> circular imports because of one module trying to import another
> module. This is fixed by reorganizing the code,
> * some code reorganization was done because the resulting structure
> makes more sense, improving documentation clarity.
>
> There are some other changes which are documentation-related:
> * added missing type annotations so they appear in the generated docs,
> * reordered arguments in some methods,
> * removed superfluous arguments and attributes,
> * changed public functions/methods/attributes to private and vice-versa.
>
> The above all appear in the generated documentation and with them,
> the documentation is improved.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/config/__init__.py | 10 ++-
> dts/framework/dts.py | 33 +++++--
> dts/framework/exception.py | 54 +++++-------
> dts/framework/remote_session/__init__.py | 41 ++++-----
> .../interactive_remote_session.py | 0
> .../{remote => }/interactive_shell.py | 0
> .../{remote => }/python_shell.py | 0
> .../remote_session/remote/__init__.py | 27 ------
> .../{remote => }/remote_session.py | 0
> .../{remote => }/ssh_session.py | 12 +--
> .../{remote => }/testpmd_shell.py | 0
> dts/framework/settings.py | 87 +++++++++++--------
> dts/framework/test_result.py | 4 +-
> dts/framework/test_suite.py | 7 +-
> dts/framework/testbed_model/__init__.py | 12 +--
> dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
> dts/framework/testbed_model/hw/__init__.py | 27 ------
> .../linux_session.py | 6 +-
> dts/framework/testbed_model/node.py | 26 ++++--
> .../os_session.py | 22 ++---
> dts/framework/testbed_model/{hw => }/port.py | 0
> .../posix_session.py | 4 +-
> dts/framework/testbed_model/sut_node.py | 8 +-
> dts/framework/testbed_model/tg_node.py | 30 +------
> .../traffic_generator/__init__.py | 24 +++++
> .../capturing_traffic_generator.py | 6 +-
> .../{ => traffic_generator}/scapy.py | 23 ++---
> .../traffic_generator.py | 16 +++-
> .../testbed_model/{hw => }/virtual_device.py | 0
> dts/framework/utils.py | 46 +++-------
> dts/main.py | 9 +-
> 31 files changed, 259 insertions(+), 288 deletions(-)
> rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
> rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
> rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
> delete mode 100644 dts/framework/remote_session/remote/__init__.py
> rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
> rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
> rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
> rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
> delete mode 100644 dts/framework/testbed_model/hw/__init__.py
> rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
> rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
> rename dts/framework/testbed_model/{hw => }/port.py (100%)
> rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
> create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
> rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
> rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
> rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
> rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
>
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index cb7e00ba34..2044c82611 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -17,6 +17,7 @@
> import warlock # type: ignore[import]
> import yaml
>
> +from framework.exception import ConfigurationError
> from framework.settings import SETTINGS
> from framework.utils import StrEnum
>
> @@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
> traffic_generator_type: TrafficGeneratorType
>
> @staticmethod
> - def from_dict(d: dict):
> + def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
This function looks to be designed to support more traffic generators than
just Scapy, so setting its return type to the Scapy config specifically
looks wrong. Shouldn't it be a more generic traffic generator type, like
you did in create_traffic_generator()?
> # This looks useless now, but is designed to allow expansion to traffic
> # generators that require more configuration later.
> match TrafficGeneratorType(d["type"]):
> @@ -97,6 +98,10 @@ def from_dict(d: dict):
> return ScapyTrafficGeneratorConfig(
> traffic_generator_type=TrafficGeneratorType.SCAPY
> )
> + case _:
> + raise ConfigurationError(
> + f'Unknown traffic generator type "{d["type"]}".'
> + )
>
>
> @dataclass(slots=True, frozen=True)
> @@ -324,6 +329,3 @@ def load_config() -> Configuration:
> config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
> config_obj: Configuration = Configuration.from_dict(dict(config))
> return config_obj
> -
> -
> -CONFIGURATION = load_config()
> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> index f773f0c38d..4c7fb0c40a 100644
> --- a/dts/framework/dts.py
> +++ b/dts/framework/dts.py
> @@ -6,19 +6,19 @@
> import sys
>
> from .config import (
> - CONFIGURATION,
> BuildTargetConfiguration,
> ExecutionConfiguration,
> TestSuiteConfig,
> + load_config,
> )
> from .exception import BlockingTestSuiteError
> from .logger import DTSLOG, getLogger
> from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
> from .test_suite import get_test_suites
> from .testbed_model import SutNode, TGNode
> -from .utils import check_dts_python_version
>
> -dts_logger: DTSLOG = getLogger("DTSRunner")
> +# dummy defaults to satisfy linters
> +dts_logger: DTSLOG = None # type: ignore[assignment]
> result: DTSResult = DTSResult(dts_logger)
>
>
> @@ -30,14 +30,18 @@ def run_all() -> None:
> global dts_logger
> global result
>
> + # create a regular DTS logger and create a new result with it
> + dts_logger = getLogger("DTSRunner")
> + result = DTSResult(dts_logger)
> +
> # check the python version of the server that run dts
> - check_dts_python_version()
> + _check_dts_python_version()
>
> sut_nodes: dict[str, SutNode] = {}
> tg_nodes: dict[str, TGNode] = {}
> try:
> # for all Execution sections
> - for execution in CONFIGURATION.executions:
> + for execution in load_config().executions:
> sut_node = sut_nodes.get(execution.system_under_test_node.name)
> tg_node = tg_nodes.get(execution.traffic_generator_node.name)
>
> @@ -82,6 +86,25 @@ def run_all() -> None:
> _exit_dts()
>
>
> +def _check_dts_python_version() -> None:
> + def RED(text: str) -> str:
> + return f"\u001B[31;1m{str(text)}\u001B[0m"
> +
> + if sys.version_info.major < 3 or (
> + sys.version_info.major == 3 and sys.version_info.minor < 10
> + ):
> + print(
> + RED(
> + (
> + "WARNING: DTS execution node's python version is lower than"
> + "python 3.10, is deprecated and will not work in future releases."
> + )
> + ),
> + file=sys.stderr,
> + )
> + print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> +
> +
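Two small notes on the added check, as a sketch rather than a patch: the adjacent string literals concatenate without a space (the message prints "...lower thanpython 3.10..."), and since sys.version_info compares elementwise, the nested condition can collapse into one tuple comparison:

```python
import sys


def _check_dts_python_version() -> None:
    # sys.version_info compares elementwise, so a single tuple
    # comparison covers both the "major < 3" and the
    # "3.x with minor < 10" branches of the original condition.
    if sys.version_info < (3, 10):
        print(
            "WARNING: DTS execution node's Python version is lower than "
            "Python 3.10; it is deprecated and will not work in future releases.",
            file=sys.stderr,
        )
```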
> def _run_execution(
> sut_node: SutNode,
> tg_node: TGNode,
> diff --git a/dts/framework/exception.py b/dts/framework/exception.py
> index 001a5a5496..7489c03570 100644
> --- a/dts/framework/exception.py
> +++ b/dts/framework/exception.py
> @@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
> Command execution timeout.
> """
>
> - command: str
> - output: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _command: str
>
> - def __init__(self, command: str, output: str):
> - self.command = command
> - self.output = output
> + def __init__(self, command: str):
> + self._command = command
>
> def __str__(self) -> str:
> - return f"TIMEOUT on {self.command}"
> -
> - def get_output(self) -> str:
> - return self.output
> + return f"TIMEOUT on {self._command}"
>
>
> class SSHConnectionError(DTSError):
> @@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
> SSH connection error.
> """
>
> - host: str
> - errors: list[str]
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _host: str
> + _errors: list[str]
>
> def __init__(self, host: str, errors: list[str] | None = None):
> - self.host = host
> - self.errors = [] if errors is None else errors
> + self._host = host
> + self._errors = [] if errors is None else errors
>
> def __str__(self) -> str:
> - message = f"Error trying to connect with {self.host}."
> - if self.errors:
> - message += f" Errors encountered while retrying: {', '.join(self.errors)}"
> + message = f"Error trying to connect with {self._host}."
> + if self._errors:
> + message += f" Errors encountered while retrying: {', '.join(self._errors)}"
>
> return message
>
> @@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
> It can no longer be used.
> """
>
> - host: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _host: str
>
> def __init__(self, host: str):
> - self.host = host
> + self._host = host
>
> def __str__(self) -> str:
> - return f"SSH session with {self.host} has died"
> + return f"SSH session with {self._host} has died"
>
>
> class ConfigurationError(DTSError):
> @@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
> Raised when a command executed on a Node returns a non-zero exit status.
> """
>
> - command: str
> - command_return_code: int
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
> + command: str
did you forget the _ ?
> + _command_return_code: int
>
> def __init__(self, command: str, command_return_code: int):
> self.command = command
> - self.command_return_code = command_return_code
> + self._command_return_code = command_return_code
>
> def __str__(self) -> str:
> return (
> f"Command {self.command} returned a non-zero exit code: "
> - f"{self.command_return_code}"
> + f"{self._command_return_code}"
> )
>
>
> @@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
> Used in test cases to verify the expected behavior.
> """
>
> - value: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
>
> - def __init__(self, value: str):
> - self.value = value
> -
> - def __str__(self) -> str:
> - return repr(self.value)
> -
>
> class BlockingTestSuiteError(DTSError):
> - suite_name: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
> + _suite_name: str
>
> def __init__(self, suite_name: str) -> None:
> - self.suite_name = suite_name
> + self._suite_name = suite_name
>
> def __str__(self) -> str:
> - return f"Blocking suite {self.suite_name} failed."
> + return f"Blocking suite {self._suite_name} failed."
> diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
> index 00b6d1f03a..5e7ddb2b05 100644
> --- a/dts/framework/remote_session/__init__.py
> +++ b/dts/framework/remote_session/__init__.py
> @@ -12,29 +12,24 @@
>
> # pylama:ignore=W0611
>
> -from framework.config import OS, NodeConfiguration
> -from framework.exception import ConfigurationError
> +from framework.config import NodeConfiguration
> from framework.logger import DTSLOG
>
> -from .linux_session import LinuxSession
> -from .os_session import InteractiveShellType, OSSession
> -from .remote import (
> - CommandResult,
> - InteractiveRemoteSession,
> - InteractiveShell,
> - PythonShell,
> - RemoteSession,
> - SSHSession,
> - TestPmdDevice,
> - TestPmdShell,
> -)
> -
> -
> -def create_session(
> +from .interactive_remote_session import InteractiveRemoteSession
> +from .interactive_shell import InteractiveShell
> +from .python_shell import PythonShell
> +from .remote_session import CommandResult, RemoteSession
> +from .ssh_session import SSHSession
> +from .testpmd_shell import TestPmdShell
> +
> +
> +def create_remote_session(
> node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> OSSession:
> - match node_config.os:
> - case OS.linux:
> - return LinuxSession(node_config, name, logger)
> - case _:
> - raise ConfigurationError(f"Unsupported OS {node_config.os}")
> +) -> RemoteSession:
> + return SSHSession(node_config, name, logger)
> +
> +
> +def create_interactive_session(
> + node_config: NodeConfiguration, logger: DTSLOG
> +) -> InteractiveRemoteSession:
> + return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/interactive_remote_session.py
> rename to dts/framework/remote_session/interactive_remote_session.py
> diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/interactive_shell.py
> rename to dts/framework/remote_session/interactive_shell.py
> diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/python_shell.py
> rename to dts/framework/remote_session/python_shell.py
> diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
> deleted file mode 100644
> index 06403691a5..0000000000
> --- a/dts/framework/remote_session/remote/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -# Copyright(c) 2023 University of New Hampshire
> -
> -# pylama:ignore=W0611
> -
> -from framework.config import NodeConfiguration
> -from framework.logger import DTSLOG
> -
> -from .interactive_remote_session import InteractiveRemoteSession
> -from .interactive_shell import InteractiveShell
> -from .python_shell import PythonShell
> -from .remote_session import CommandResult, RemoteSession
> -from .ssh_session import SSHSession
> -from .testpmd_shell import TestPmdDevice, TestPmdShell
> -
> -
> -def create_remote_session(
> - node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> RemoteSession:
> - return SSHSession(node_config, name, logger)
> -
> -
> -def create_interactive_session(
> - node_config: NodeConfiguration, logger: DTSLOG
> -) -> InteractiveRemoteSession:
> - return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/remote_session.py
> rename to dts/framework/remote_session/remote_session.py
> diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
> similarity index 91%
> rename from dts/framework/remote_session/remote/ssh_session.py
> rename to dts/framework/remote_session/ssh_session.py
> index 8d127f1601..cee11d14d6 100644
> --- a/dts/framework/remote_session/remote/ssh_session.py
> +++ b/dts/framework/remote_session/ssh_session.py
> @@ -18,9 +18,7 @@
> SSHException,
> )
>
> -from framework.config import NodeConfiguration
> from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
> -from framework.logger import DTSLOG
>
> from .remote_session import CommandResult, RemoteSession
>
> @@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
>
> session: Connection
>
> - def __init__(
> - self,
> - node_config: NodeConfiguration,
> - session_name: str,
> - logger: DTSLOG,
> - ):
> - super(SSHSession, self).__init__(node_config, session_name, logger)
> -
> def _connect(self) -> None:
> errors = []
> retry_attempts = 10
> @@ -117,7 +107,7 @@ def _send_command(
>
> except CommandTimedOut as e:
> self._logger.exception(e)
> - raise SSHTimeoutError(command, e.result.stderr) from e
> + raise SSHTimeoutError(command) from e
>
> return CommandResult(
> self.name, command, output.stdout, output.stderr, output.return_code
> diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/testpmd_shell.py
> rename to dts/framework/remote_session/testpmd_shell.py
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index cfa39d011b..7f5841d073 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -6,7 +6,7 @@
> import argparse
> import os
> from collections.abc import Callable, Iterable, Sequence
> -from dataclasses import dataclass
> +from dataclasses import dataclass, field
> from pathlib import Path
> from typing import Any, TypeVar
>
> @@ -22,8 +22,8 @@ def __init__(
> option_strings: Sequence[str],
> dest: str,
> nargs: str | int | None = None,
> - const: str | None = None,
> - default: str = None,
> + const: bool | None = None,
> + default: Any = None,
> type: Callable[[str], _T | argparse.FileType | None] = None,
> choices: Iterable[_T] | None = None,
> required: bool = False,
> @@ -32,6 +32,12 @@ def __init__(
> ) -> None:
> env_var_value = os.environ.get(env_var)
> default = env_var_value or default
> + if const is not None:
> + nargs = 0
> + default = const if env_var_value else default
> + type = None
> + choices = None
> + metavar = None
> super(_EnvironmentArgument, self).__init__(
> option_strings,
> dest,
> @@ -52,22 +58,28 @@ def __call__(
> values: Any,
> option_string: str = None,
> ) -> None:
> - setattr(namespace, self.dest, values)
> + if self.const is not None:
> + setattr(namespace, self.dest, self.const)
> + else:
> + setattr(namespace, self.dest, values)
>
> return _EnvironmentArgument
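For readers following along, a self-contained sketch of the env-var-backed action this hunk extends; the real class also handles const for boolean flags, and the demo env var value here is made up:

```python
import argparse
import os


def _env_arg(env_var: str) -> type[argparse.Action]:
    """Build an argparse Action whose default can come from an env var."""

    class _EnvironmentArgument(argparse.Action):
        def __init__(self, option_strings, dest, default=None, **kwargs):
            # A set environment variable overrides the coded default.
            # argparse applies `type` to string defaults at parse time,
            # so the env var string still ends up converted.
            default = os.environ.get(env_var) or default
            super().__init__(option_strings, dest, default=default, **kwargs)

        def __call__(self, parser, namespace, values, option_string=None):
            setattr(namespace, self.dest, values)

    return _EnvironmentArgument


os.environ["DTS_TIMEOUT"] = "30"  # illustrative value
parser = argparse.ArgumentParser()
parser.add_argument(
    "--timeout", action=_env_arg("DTS_TIMEOUT"), default="15", type=float
)
```

Precedence falls out naturally: a command-line flag beats the environment variable, which beats the coded default.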
>
>
> -@dataclass(slots=True, frozen=True)
> -class _Settings:
> - config_file_path: str
> - output_dir: str
> - timeout: float
> - verbose: bool
> - skip_setup: bool
> - dpdk_tarball_path: Path
> - compile_timeout: float
> - test_cases: list
> - re_run: int
> +@dataclass(slots=True)
> +class Settings:
> + config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
> + output_dir: str = "output"
> + timeout: float = 15
> + verbose: bool = False
> + skip_setup: bool = False
> + dpdk_tarball_path: Path | str = "dpdk.tar.xz"
> + compile_timeout: float = 1200
> + test_cases: list[str] = field(default_factory=list)
> + re_run: int = 0
> +
> +
> +SETTINGS: Settings = Settings()
>
>
> def _get_parser() -> argparse.ArgumentParser:
> @@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
> parser.add_argument(
> "--config-file",
> action=_env_arg("DTS_CFG_FILE"),
> - default="conf.yaml",
> + default=SETTINGS.config_file_path,
> + type=Path,
> help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
> "and targets.",
> )
> @@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--output-dir",
> "--output",
> action=_env_arg("DTS_OUTPUT_DIR"),
> - default="output",
> + default=SETTINGS.output_dir,
> help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
> )
>
> @@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "-t",
> "--timeout",
> action=_env_arg("DTS_TIMEOUT"),
> - default=15,
> + default=SETTINGS.timeout,
> type=float,
> help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
> "compiling DPDK.",
> @@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
> "-v",
> "--verbose",
> action=_env_arg("DTS_VERBOSE"),
> - default="N",
> - help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
> + default=SETTINGS.verbose,
> + const=True,
> + help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
> "to the console.",
> )
>
> @@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
> "-s",
> "--skip-setup",
> action=_env_arg("DTS_SKIP_SETUP"),
> - default="N",
> - help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
> + const=True,
> + help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
> )
>
> parser.add_argument(
> @@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--snapshot",
> "--git-ref",
> action=_env_arg("DTS_DPDK_TARBALL"),
> - default="dpdk.tar.xz",
> + default=SETTINGS.dpdk_tarball_path,
> type=Path,
> help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
> "tag ID or tree ID to test. To test local changes, first commit them, "
> @@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
> parser.add_argument(
> "--compile-timeout",
> action=_env_arg("DTS_COMPILE_TIMEOUT"),
> - default=1200,
> + default=SETTINGS.compile_timeout,
> type=float,
> help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
> )
> @@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--re-run",
> "--re_run",
> action=_env_arg("DTS_RERUN"),
> - default=0,
> + default=SETTINGS.re_run,
> type=int,
> help="[DTS_RERUN] Re-run each test case the specified amount of times "
> "if a test failure occurs",
> @@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
> return parser
>
>
> -def _get_settings() -> _Settings:
> +def get_settings() -> Settings:
> parsed_args = _get_parser().parse_args()
> - return _Settings(
> + return Settings(
Does that mean we're parsing the arguments and creating a new Settings
object every time we read the settings? Shouldn't we just save it once
and return a copy? That seems to have been the old behavior; is there
any reason to change it?
Related to this, it does mean that the previously created settings
variable is only used to set up the parser defaults, so it might need
to be renamed to default_setting if it doesn't get reused.
> config_file_path=parsed_args.config_file,
> output_dir=parsed_args.output_dir,
> timeout=parsed_args.timeout,
> - verbose=(parsed_args.verbose == "Y"),
> - skip_setup=(parsed_args.skip_setup == "Y"),
> + verbose=parsed_args.verbose,
> + skip_setup=parsed_args.skip_setup,
> dpdk_tarball_path=Path(
> - DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
> - )
> - if not os.path.exists(parsed_args.tarball)
> - else Path(parsed_args.tarball),
> + Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
> + if not os.path.exists(parsed_args.tarball)
> + else Path(parsed_args.tarball)
> + ),
> compile_timeout=parsed_args.compile_timeout,
> - test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
> + test_cases=(
> + parsed_args.test_cases.split(",") if parsed_args.test_cases else []
> + ),
> re_run=parsed_args.re_run,
> )
> -
> -
> -SETTINGS: _Settings = _get_settings()
> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> index f0fbe80f6f..603e18872c 100644
> --- a/dts/framework/test_result.py
> +++ b/dts/framework/test_result.py
> @@ -254,7 +254,7 @@ def add_build_target(
> self._inner_results.append(build_target_result)
> return build_target_result
>
> - def add_sut_info(self, sut_info: NodeInfo):
> + def add_sut_info(self, sut_info: NodeInfo) -> None:
> self.sut_os_name = sut_info.os_name
> self.sut_os_version = sut_info.os_version
> self.sut_kernel_version = sut_info.kernel_version
> @@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
> self._inner_results.append(execution_result)
> return execution_result
>
> - def add_error(self, error) -> None:
> + def add_error(self, error: Exception) -> None:
> self._errors.append(error)
>
> def process(self) -> None:
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index 3b890c0451..d53553bf34 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -11,7 +11,7 @@
> import re
> from ipaddress import IPv4Interface, IPv6Interface, ip_interface
> from types import MethodType
> -from typing import Union
> +from typing import Any, Union
>
> from scapy.layers.inet import IP # type: ignore[import]
> from scapy.layers.l2 import Ether # type: ignore[import]
> @@ -26,8 +26,7 @@
> from .logger import DTSLOG, getLogger
> from .settings import SETTINGS
> from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
> -from .testbed_model import SutNode, TGNode
> -from .testbed_model.hw.port import Port, PortLink
> +from .testbed_model import Port, PortLink, SutNode, TGNode
> from .utils import get_packet_summaries
>
>
> @@ -453,7 +452,7 @@ def _execute_test_case(
>
>
> def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
> - def is_test_suite(object) -> bool:
> + def is_test_suite(object: Any) -> bool:
> try:
> if issubclass(object, TestSuite) and object is not TestSuite:
> return True
> diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
> index 5cbb859e47..8ced05653b 100644
> --- a/dts/framework/testbed_model/__init__.py
> +++ b/dts/framework/testbed_model/__init__.py
> @@ -9,15 +9,9 @@
>
> # pylama:ignore=W0611
>
> -from .hw import (
> - LogicalCore,
> - LogicalCoreCount,
> - LogicalCoreCountFilter,
> - LogicalCoreList,
> - LogicalCoreListFilter,
> - VirtualDevice,
> - lcore_filter,
> -)
> +from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
> from .node import Node
> +from .port import Port, PortLink
> from .sut_node import SutNode
> from .tg_node import TGNode
> +from .virtual_device import VirtualDevice
> diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
> similarity index 95%
> rename from dts/framework/testbed_model/hw/cpu.py
> rename to dts/framework/testbed_model/cpu.py
> index d1918a12dc..8fe785dfe4 100644
> --- a/dts/framework/testbed_model/hw/cpu.py
> +++ b/dts/framework/testbed_model/cpu.py
> @@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
> )
>
> return filtered_lcores
> +
> +
> +def lcore_filter(
> + core_list: list[LogicalCore],
> + filter_specifier: LogicalCoreCount | LogicalCoreList,
> + ascending: bool,
> +) -> LogicalCoreFilter:
> + if isinstance(filter_specifier, LogicalCoreList):
> + return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> + elif isinstance(filter_specifier, LogicalCoreCount):
> + return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> + else:
> + raise ValueError(f"Unsupported filter r{filter_specifier}")
> diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
> deleted file mode 100644
> index 88ccac0b0e..0000000000
> --- a/dts/framework/testbed_model/hw/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -
> -# pylama:ignore=W0611
> -
> -from .cpu import (
> - LogicalCore,
> - LogicalCoreCount,
> - LogicalCoreCountFilter,
> - LogicalCoreFilter,
> - LogicalCoreList,
> - LogicalCoreListFilter,
> -)
> -from .virtual_device import VirtualDevice
> -
> -
> -def lcore_filter(
> - core_list: list[LogicalCore],
> - filter_specifier: LogicalCoreCount | LogicalCoreList,
> - ascending: bool,
> -) -> LogicalCoreFilter:
> - if isinstance(filter_specifier, LogicalCoreList):
> - return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> - elif isinstance(filter_specifier, LogicalCoreCount):
> - return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> - else:
> - raise ValueError(f"Unsupported filter r{filter_specifier}")
> diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
> similarity index 97%
> rename from dts/framework/remote_session/linux_session.py
> rename to dts/framework/testbed_model/linux_session.py
> index a3f1a6bf3b..f472bb8f0f 100644
> --- a/dts/framework/remote_session/linux_session.py
> +++ b/dts/framework/testbed_model/linux_session.py
> @@ -9,10 +9,10 @@
> from typing_extensions import NotRequired
>
> from framework.exception import RemoteCommandExecutionError
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
> from framework.utils import expand_range
>
> +from .cpu import LogicalCore
> +from .port import Port
> from .posix_session import PosixSession
>
>
> @@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> lcores.append(LogicalCore(lcore, core, socket, node))
> return lcores
>
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> return dpdk_prefix
>
> def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index fc01e0bf8e..7571e7b98d 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -12,23 +12,26 @@
> from typing import Any, Callable, Type, Union
>
> from framework.config import (
> + OS,
> BuildTargetConfiguration,
> ExecutionConfiguration,
> NodeConfiguration,
> )
> +from framework.exception import ConfigurationError
> from framework.logger import DTSLOG, getLogger
> -from framework.remote_session import InteractiveShellType, OSSession, create_session
> from framework.settings import SETTINGS
>
> -from .hw import (
> +from .cpu import (
> LogicalCore,
> LogicalCoreCount,
> LogicalCoreList,
> LogicalCoreListFilter,
> - VirtualDevice,
> lcore_filter,
> )
> -from .hw.port import Port
> +from .linux_session import LinuxSession
> +from .os_session import InteractiveShellType, OSSession
> +from .port import Port
> +from .virtual_device import VirtualDevice
>
>
> class Node(ABC):
> @@ -69,6 +72,7 @@ def __init__(self, node_config: NodeConfiguration):
> def _init_ports(self) -> None:
> self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
> self.main_session.update_ports(self.ports)
> +
Is the newline intended?
> for port in self.ports:
> self.configure_port_state(port)
>
> @@ -172,9 +176,9 @@ def create_interactive_shell(
>
> return self.main_session.create_interactive_shell(
> shell_cls,
> - app_args,
> timeout,
> privileged,
> + app_args,
> )
>
> def filter_lcores(
> @@ -205,7 +209,7 @@ def _get_remote_cpus(self) -> None:
> self._logger.info("Getting CPU information.")
> self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
>
> - def _setup_hugepages(self):
> + def _setup_hugepages(self) -> None:
> """
> Setup hugepages on the Node. Different architectures can supply different
> amounts of memory for hugepages and numa-based hugepage allocation may need
> @@ -249,3 +253,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
> return lambda *args: None
> else:
> return func
> +
> +
> +def create_session(
> + node_config: NodeConfiguration, name: str, logger: DTSLOG
> +) -> OSSession:
> + match node_config.os:
> + case OS.linux:
> + return LinuxSession(node_config, name, logger)
> + case _:
> + raise ConfigurationError(f"Unsupported OS {node_config.os}")
> diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
> similarity index 95%
> rename from dts/framework/remote_session/os_session.py
> rename to dts/framework/testbed_model/os_session.py
> index 8a709eac1c..76e595a518 100644
> --- a/dts/framework/remote_session/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -10,19 +10,19 @@
>
> from framework.config import Architecture, NodeConfiguration, NodeInfo
> from framework.logger import DTSLOG
> -from framework.remote_session.remote import InteractiveShell
> -from framework.settings import SETTINGS
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
> -from framework.utils import MesonArgs
> -
> -from .remote import (
> +from framework.remote_session import (
> CommandResult,
> InteractiveRemoteSession,
> + InteractiveShell,
> RemoteSession,
> create_interactive_session,
> create_remote_session,
> )
> +from framework.settings import SETTINGS
> +from framework.utils import MesonArgs
> +
> +from .cpu import LogicalCore
> +from .port import Port
>
> InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
>
> @@ -85,9 +85,9 @@ def send_command(
> def create_interactive_shell(
> self,
> shell_cls: Type[InteractiveShellType],
> - eal_parameters: str,
> timeout: float,
> privileged: bool,
> + app_args: str,
Is there a reason why the argument position got changed? I'd guess it's
because it's more idiomatic to have the extra arg at the end, but I just
want to make sure it's intended.
> ) -> InteractiveShellType:
> """
> See "create_interactive_shell" in SutNode
> @@ -96,7 +96,7 @@ def create_interactive_shell(
> self.interactive_session.session,
> self._logger,
> self._get_privileged_command if privileged else None,
> - eal_parameters,
> + app_args,
> timeout,
> )
>
> @@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
> """
>
> @abstractmethod
> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
> """
> Try to find DPDK remote dir in remote_dir.
> """
> @@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> """
>
> @abstractmethod
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> """
> Get the DPDK file prefix that will be used when running DPDK apps.
> """
> diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/port.py
> rename to dts/framework/testbed_model/port.py
> diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
> similarity index 98%
> rename from dts/framework/remote_session/posix_session.py
> rename to dts/framework/testbed_model/posix_session.py
> index 5da0516e05..1d1d5b1b26 100644
> --- a/dts/framework/remote_session/posix_session.py
> +++ b/dts/framework/testbed_model/posix_session.py
> @@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
>
> return ret_opts
>
> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
> remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
> result = self.send_command(f"ls -d {remote_guess} | tail -1")
> return PurePosixPath(result.stdout)
> @@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
> for dpdk_runtime_dir in dpdk_runtime_dirs:
> self.remove_remote_dir(dpdk_runtime_dir)
>
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> return ""
>
> def get_compiler_version(self, compiler_name: str) -> str:
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index 202aebfd06..4e33cf02ea 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -15,12 +15,14 @@
> NodeInfo,
> SutNodeConfiguration,
> )
> -from framework.remote_session import CommandResult, InteractiveShellType, OSSession
> +from framework.remote_session import CommandResult
> from framework.settings import SETTINGS
> from framework.utils import MesonArgs
>
> -from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
> +from .cpu import LogicalCoreCount, LogicalCoreList
> from .node import Node
> +from .os_session import InteractiveShellType, OSSession
> +from .virtual_device import VirtualDevice
>
>
> class EalParameters(object):
> @@ -289,7 +291,7 @@ def create_eal_parameters(
> prefix: str = "dpdk",
> append_prefix_timestamp: bool = True,
> no_pci: bool = False,
> - vdevs: list[VirtualDevice] = None,
> + vdevs: list[VirtualDevice] | None = None,
> other_eal_param: str = "",
> ) -> "EalParameters":
> """
> diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
> index 27025cfa31..166eb8430e 100644
> --- a/dts/framework/testbed_model/tg_node.py
> +++ b/dts/framework/testbed_model/tg_node.py
> @@ -16,16 +16,11 @@
>
> from scapy.packet import Packet # type: ignore[import]
>
> -from framework.config import (
> - ScapyTrafficGeneratorConfig,
> - TGNodeConfiguration,
> - TrafficGeneratorType,
> -)
> -from framework.exception import ConfigurationError
> -
> -from .capturing_traffic_generator import CapturingTrafficGenerator
> -from .hw.port import Port
> +from framework.config import TGNodeConfiguration
> +
> from .node import Node
> +from .port import Port
> +from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
>
>
> class TGNode(Node):
> @@ -80,20 +75,3 @@ def close(self) -> None:
> """Free all resources used by the node"""
> self.traffic_generator.close()
> super(TGNode, self).close()
> -
> -
> -def create_traffic_generator(
> - tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
> -) -> CapturingTrafficGenerator:
> - """A factory function for creating traffic generator object from user config."""
> -
> - from .scapy import ScapyTrafficGenerator
> -
> - match traffic_generator_config.traffic_generator_type:
> - case TrafficGeneratorType.SCAPY:
> - return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> - case _:
> - raise ConfigurationError(
> - "Unknown traffic generator: "
> - f"{traffic_generator_config.traffic_generator_type}"
> - )
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> new file mode 100644
> index 0000000000..11bfa1ee0f
> --- /dev/null
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -0,0 +1,24 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
> +from framework.exception import ConfigurationError
> +from framework.testbed_model.node import Node
> +
> +from .capturing_traffic_generator import CapturingTrafficGenerator
> +from .scapy import ScapyTrafficGenerator
> +
> +
> +def create_traffic_generator(
> + tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
> +) -> CapturingTrafficGenerator:
> + """A factory function for creating traffic generator object from user config."""
> +
> + match traffic_generator_config.traffic_generator_type:
> + case TrafficGeneratorType.SCAPY:
> + return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> + case _:
> + raise ConfigurationError(
> + "Unknown traffic generator: "
> + f"{traffic_generator_config.traffic_generator_type}"
> + )
> diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> similarity index 96%
> rename from dts/framework/testbed_model/capturing_traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> index ab98987f8e..e521211ef0 100644
> --- a/dts/framework/testbed_model/capturing_traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> @@ -16,9 +16,9 @@
> from scapy.packet import Packet # type: ignore[import]
>
> from framework.settings import SETTINGS
> +from framework.testbed_model.port import Port
> from framework.utils import get_packet_summaries
>
> -from .hw.port import Port
> from .traffic_generator import TrafficGenerator
>
>
> @@ -130,7 +130,9 @@ def _send_packets_and_capture(
> for the specified duration. It must be able to handle no received packets.
> """
>
> - def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
> + def _write_capture_from_packets(
> + self, capture_name: str, packets: list[Packet]
> + ) -> None:
> file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
> self._logger.debug(f"Writing packets to {file_name}.")
> scapy.utils.wrpcap(file_name, packets)
> diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
> similarity index 95%
> rename from dts/framework/testbed_model/scapy.py
> rename to dts/framework/testbed_model/traffic_generator/scapy.py
> index af0d4dbb25..51864b6e6b 100644
> --- a/dts/framework/testbed_model/scapy.py
> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> @@ -24,16 +24,15 @@
> from scapy.packet import Packet # type: ignore[import]
>
> from framework.config import OS, ScapyTrafficGeneratorConfig
> -from framework.logger import DTSLOG, getLogger
> from framework.remote_session import PythonShell
> from framework.settings import SETTINGS
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
>
> from .capturing_traffic_generator import (
> CapturingTrafficGenerator,
> _get_default_capture_name,
> )
> -from .hw.port import Port
> -from .tg_node import TGNode
>
> """
> ========= BEGIN RPC FUNCTIONS =========
> @@ -146,7 +145,7 @@ def quit(self) -> None:
> self._BaseServer__shutdown_request = True
> return None
>
> - def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
> + def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
> """Add a function to the server.
>
> This is meant to be executed remotely.
> @@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
> session: PythonShell
> rpc_server_proxy: xmlrpc.client.ServerProxy
> _config: ScapyTrafficGeneratorConfig
> - _tg_node: TGNode
> - _logger: DTSLOG
> -
> - def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
> - self._config = config
> - self._tg_node = tg_node
> - self._logger = getLogger(
> - f"{self._tg_node.name} {self._config.traffic_generator_type}"
> - )
> +
> + def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
> + super().__init__(tg_node, config)
>
> assert (
> self._tg_node.config.os == OS.linux
> @@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
> function_bytes = marshal.dumps(function.__code__)
> self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
>
> - def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
> + def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
> # load the source of the function
> src = inspect.getsource(QuittableXMLRPCServer)
> # Lines with only whitespace break the repl if in the middle of a function
> @@ -280,7 +273,7 @@ def _send_packets_and_capture(
> scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
> return scapy_packets
>
> - def close(self):
> + def close(self) -> None:
> try:
> self.rpc_server_proxy.quit()
> except ConnectionRefusedError:
> diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> similarity index 80%
> rename from dts/framework/testbed_model/traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index 28c35d3ce4..ea7c3963da 100644
> --- a/dts/framework/testbed_model/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -12,11 +12,12 @@
>
> from scapy.packet import Packet # type: ignore[import]
>
> -from framework.logger import DTSLOG
> +from framework.config import TrafficGeneratorConfig
> +from framework.logger import DTSLOG, getLogger
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
> from framework.utils import get_packet_summaries
>
> -from .hw.port import Port
> -
>
> class TrafficGenerator(ABC):
> """The base traffic generator.
> @@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
> Defines the few basic methods that each traffic generator must implement.
> """
>
> + _config: TrafficGeneratorConfig
> + _tg_node: Node
> _logger: DTSLOG
>
> + def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> + self._config = config
> + self._tg_node = tg_node
> + self._logger = getLogger(
> + f"{self._tg_node.name} {self._config.traffic_generator_type}"
> + )
> +
> def send_packet(self, packet: Packet, port: Port) -> None:
> """Send a packet and block until it is fully sent.
>
> diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/virtual_device.py
> rename to dts/framework/testbed_model/virtual_device.py
> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> index d27c2c5b5f..f0c916471c 100644
> --- a/dts/framework/utils.py
> +++ b/dts/framework/utils.py
> @@ -7,7 +7,6 @@
> import json
> import os
> import subprocess
> -import sys
> from enum import Enum
> from pathlib import Path
> from subprocess import SubprocessError
> @@ -16,35 +15,7 @@
>
> from .exception import ConfigurationError
>
> -
> -class StrEnum(Enum):
> - @staticmethod
> - def _generate_next_value_(
> - name: str, start: int, count: int, last_values: object
> - ) -> str:
> - return name
> -
> - def __str__(self) -> str:
> - return self.name
> -
> -
> -REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> -
> -
> -def check_dts_python_version() -> None:
> - if sys.version_info.major < 3 or (
> - sys.version_info.major == 3 and sys.version_info.minor < 10
> - ):
> - print(
> - RED(
> - (
> - "WARNING: DTS execution node's python version is lower than"
> - "python 3.10, is deprecated and will not work in future releases."
> - )
> - ),
> - file=sys.stderr,
> - )
> - print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> +REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
>
>
> def expand_range(range_str: str) -> list[int]:
> @@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
> return expanded_range
>
>
> -def get_packet_summaries(packets: list[Packet]):
> +def get_packet_summaries(packets: list[Packet]) -> str:
> if len(packets) == 1:
> packet_summaries = packets[0].summary()
> else:
> @@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
> return f"Packet contents: \n{packet_summaries}"
>
>
> -def RED(text: str) -> str:
> - return f"\u001B[31;1m{str(text)}\u001B[0m"
> +class StrEnum(Enum):
> + @staticmethod
> + def _generate_next_value_(
> + name: str, start: int, count: int, last_values: object
> + ) -> str:
> + return name
I don't understand this function: I don't see it called anywhere, and
its parameters appear unused?
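For context, `_generate_next_value_` is the hook that `enum.auto()` invokes implicitly to compute each member's value, which is why it never appears as an explicit call site; the signature is fixed by the enum machinery, so the unused parameters are required. A self-contained example (the `Compiler` subclass is purely illustrative, not taken from the patch):

```python
from enum import Enum, auto


class StrEnum(Enum):
    # Called by auto(): with this override, a member's value becomes its
    # name instead of the default sequential integer.
    @staticmethod
    def _generate_next_value_(
        name: str, start: int, count: int, last_values: object
    ) -> str:
        return name

    def __str__(self) -> str:
        return self.name


# Illustrative subclass to show the effect of the override.
class Compiler(StrEnum):
    gcc = auto()
    clang = auto()


print(Compiler.gcc.value)   # "gcc" rather than 1
print(str(Compiler.clang))  # "clang"
```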
> +
> + def __str__(self) -> str:
> + return self.name
>
>
> class MesonArgs(object):
> @@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
> if self._tarball_path and os.path.exists(self._tarball_path):
> os.remove(self._tarball_path)
>
> - def __fspath__(self):
> + def __fspath__(self) -> str:
> return str(self._tarball_path)
> diff --git a/dts/main.py b/dts/main.py
> index 43311fa847..5d4714b0c3 100755
> --- a/dts/main.py
> +++ b/dts/main.py
> @@ -10,10 +10,17 @@
>
> import logging
>
> -from framework import dts
> +from framework import settings
>
>
> def main() -> None:
> + """Set DTS settings, then run DTS.
> +
> + The DTS settings are taken from the command line arguments and the environment variables.
> + """
> + settings.SETTINGS = settings.get_settings()
> + from framework import dts
Why is the import *inside* main()?
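My guess at the rationale (an assumption, not confirmed by the patch): deferring `from framework import dts` until after `settings.SETTINGS` is reassigned matters if any module under `framework` does `from .settings import SETTINGS` at import time, because that form binds the object, not the name. A minimal simulation of that binding behavior:

```python
import types

# Stand-in for the settings module with its module-level default instance.
settings = types.ModuleType("settings")
settings.SETTINGS = "defaults"

# Simulates `from settings import SETTINGS` executed during an early import:
# it binds the object SETTINGS pointed to at that moment.
SETTINGS = settings.SETTINGS

# main() later replaces the module attribute with the parsed settings.
settings.SETTINGS = "parsed-from-cli"

print(SETTINGS)           # "defaults" -- the early binding is stale
print(settings.SETTINGS)  # "parsed-from-cli"
```

So importing `dts` only after the reassignment would ensure its transitive imports see the parsed settings rather than the defaults.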
> +
> dts.run_all()
>
>
* Re: [PATCH v5 01/23] dts: code adjustments for doc generation
2023-11-08 13:35 ` Yoan Picchi
@ 2023-11-15 7:46 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 7:46 UTC (permalink / raw)
To: Yoan Picchi
Cc: Thomas Monjalon, Honnappa Nagarahalli, Bruce Richardson,
Jeremy Spewock, Patrick Robb, Paul Szczepanek, dev
On Wed, Nov 8, 2023 at 2:35 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/6/23 17:15, Juraj Linkeš wrote:
> > The standard Python tool for generating API documentation, Sphinx,
> > imports modules one-by-one when generating the documentation. This
> > requires code changes:
> > * properly guarding argument parsing in the if __name__ == '__main__'
> > block,
> > * the logger used by DTS runner underwent the same treatment so that it
> > doesn't create log files outside of a DTS run,
> > * however, DTS uses the arguments to construct an object holding global
> > variables. The defaults for the global variables needed to be moved
> > from argument parsing elsewhere,
> > * importing the remote_session module from framework resulted in
> > circular imports because of one module trying to import another
> > module. This is fixed by reorganizing the code,
> > * some code reorganization was done because the resulting structure
> > makes more sense, improving documentation clarity.
> >
> > The are some other changes which are documentation related:
> > * added missing type annotation so they appear in the generated docs,
> > * reordered arguments in some methods,
> > * removed superfluous arguments and attributes,
> > * change private functions/methods/attributes to private and vice-versa.
> >
> > The above all appear in the generated documentation and the with them,
> > the documentation is improved.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/config/__init__.py | 10 ++-
> > dts/framework/dts.py | 33 +++++--
> > dts/framework/exception.py | 54 +++++-------
> > dts/framework/remote_session/__init__.py | 41 ++++-----
> > .../interactive_remote_session.py | 0
> > .../{remote => }/interactive_shell.py | 0
> > .../{remote => }/python_shell.py | 0
> > .../remote_session/remote/__init__.py | 27 ------
> > .../{remote => }/remote_session.py | 0
> > .../{remote => }/ssh_session.py | 12 +--
> > .../{remote => }/testpmd_shell.py | 0
> > dts/framework/settings.py | 87 +++++++++++--------
> > dts/framework/test_result.py | 4 +-
> > dts/framework/test_suite.py | 7 +-
> > dts/framework/testbed_model/__init__.py | 12 +--
> > dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
> > dts/framework/testbed_model/hw/__init__.py | 27 ------
> > .../linux_session.py | 6 +-
> > dts/framework/testbed_model/node.py | 26 ++++--
> > .../os_session.py | 22 ++---
> > dts/framework/testbed_model/{hw => }/port.py | 0
> > .../posix_session.py | 4 +-
> > dts/framework/testbed_model/sut_node.py | 8 +-
> > dts/framework/testbed_model/tg_node.py | 30 +------
> > .../traffic_generator/__init__.py | 24 +++++
> > .../capturing_traffic_generator.py | 6 +-
> > .../{ => traffic_generator}/scapy.py | 23 ++---
> > .../traffic_generator.py | 16 +++-
> > .../testbed_model/{hw => }/virtual_device.py | 0
> > dts/framework/utils.py | 46 +++-------
> > dts/main.py | 9 +-
> > 31 files changed, 259 insertions(+), 288 deletions(-)
> > rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
> > rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
> > rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
> > delete mode 100644 dts/framework/remote_session/remote/__init__.py
> > rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
> > rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
> > rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
> > rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
> > delete mode 100644 dts/framework/testbed_model/hw/__init__.py
> > rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
> > rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
> > rename dts/framework/testbed_model/{hw => }/port.py (100%)
> > rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
> > create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
> > rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
> > rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
> > rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
> > rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
> >
> > diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> > index cb7e00ba34..2044c82611 100644
> > --- a/dts/framework/config/__init__.py
> > +++ b/dts/framework/config/__init__.py
> > @@ -17,6 +17,7 @@
> > import warlock # type: ignore[import]
> > import yaml
> >
> > +from framework.exception import ConfigurationError
> > from framework.settings import SETTINGS
> > from framework.utils import StrEnum
> >
> > @@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
> > traffic_generator_type: TrafficGeneratorType
> >
> > @staticmethod
> > - def from_dict(d: dict):
> > + def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
>
> This function looks to be designed to support more traffic generators than
> just scapy, so setting its return type to scapy specifically looks
> wrong. Shouldn't it be a more generic traffic generator type? Like you
> did in create_traffic_generator()
>
The reason is the type in the constructor of the scapy traffic
generator - the type there should be ScapyTrafficGeneratorConfig and
if I change it anywhere in the chain, mypy reports an error. I don't
want to do any extra refactoring in this patch if we don't have to, so
we need to rethink this when adding a new traffic generator.
> > # This looks useless now, but is designed to allow expansion to traffic
> > # generators that require more configuration later.
> > match TrafficGeneratorType(d["type"]):
> > @@ -97,6 +98,10 @@ def from_dict(d: dict):
> > return ScapyTrafficGeneratorConfig(
> > traffic_generator_type=TrafficGeneratorType.SCAPY
> > )
> > + case _:
> > + raise ConfigurationError(
> > + f'Unknown traffic generator type "{d["type"]}".'
> > + )
> >
> >
> > @dataclass(slots=True, frozen=True)
<snip>
> > --- a/dts/framework/settings.py
> > +++ b/dts/framework/settings.py
<small snip>
> > @@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
> > return parser
> >
> >
> > -def _get_settings() -> _Settings:
> > +def get_settings() -> Settings:
> > parsed_args = _get_parser().parse_args()
> > - return _Settings(
> > + return Settings(
>
> That means we're parsing and creating a new settings object every time
> we're trying to read the settings? Shouldn't we just save it and return a
> copy? That seems to be the old behavior, any reason to change it?
>
By old behavior, do you mean the behavior from the previous version?
I want the Settings object to be immutable, as much as it can be in
Python (that's why the dataclass is frozen), so that it's clear it
shouldn't be changed during runtime, as the object represents user
choices (any modifications would violate that). More below.
> Related to this, this does mean that the previously created setting
> variable is only used to set up the parser, so it might need to be
> renamed to default_setting if it doesn't get reused.
>
It is used. The reason the SETTINGS variable is implemented this way
is mostly because of Sphinx. Sphinx imports everything file by file:
When it imports a module that uses the SETTINGS variable (such as
node.py), the variable needs to be defined. On top of that, when
Sphinx accesses command line arguments, it sees its own command line
arguments (which are incompatible with DTS), so we need to guard the
command line parsing against imports (we have it in the if __name__ ==
"__main__" block in main.py). This is why the defaults are split from the
command line parsing - when Sphinx imports the module, it uses the
object with defaults and during runtime we replace the object with
user-defined values.
There are other ways to do this, but I didn't find a better one with
all the constraints and requirements outlined above.
> > config_file_path=parsed_args.config_file,
> > output_dir=parsed_args.output_dir,
> > timeout=parsed_args.timeout,
> > - verbose=(parsed_args.verbose == "Y"),
> > - skip_setup=(parsed_args.skip_setup == "Y"),
> > + verbose=parsed_args.verbose,
> > + skip_setup=parsed_args.skip_setup,
> > dpdk_tarball_path=Path(
> > - DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
> > - )
> > - if not os.path.exists(parsed_args.tarball)
> > - else Path(parsed_args.tarball),
> > + Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
> > + if not os.path.exists(parsed_args.tarball)
> > + else Path(parsed_args.tarball)
> > + ),
> > compile_timeout=parsed_args.compile_timeout,
> > - test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
> > + test_cases=(
> > + parsed_args.test_cases.split(",") if parsed_args.test_cases else []
> > + ),
> > re_run=parsed_args.re_run,
> > )
> > -
> > -
> > -SETTINGS: _Settings = _get_settings()
<snip>
> > diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> > index fc01e0bf8e..7571e7b98d 100644
> > --- a/dts/framework/testbed_model/node.py
> > +++ b/dts/framework/testbed_model/node.py
> > @@ -12,23 +12,26 @@
> > from typing import Any, Callable, Type, Union
> >
> > from framework.config import (
> > + OS,
> > BuildTargetConfiguration,
> > ExecutionConfiguration,
> > NodeConfiguration,
> > )
> > +from framework.exception import ConfigurationError
> > from framework.logger import DTSLOG, getLogger
> > -from framework.remote_session import InteractiveShellType, OSSession, create_session
> > from framework.settings import SETTINGS
> >
> > -from .hw import (
> > +from .cpu import (
> > LogicalCore,
> > LogicalCoreCount,
> > LogicalCoreList,
> > LogicalCoreListFilter,
> > - VirtualDevice,
> > lcore_filter,
> > )
> > -from .hw.port import Port
> > +from .linux_session import LinuxSession
> > +from .os_session import InteractiveShellType, OSSession
> > +from .port import Port
> > +from .virtual_device import VirtualDevice
> >
> >
> > class Node(ABC):
> > @@ -69,6 +72,7 @@ def __init__(self, node_config: NodeConfiguration):
> > def _init_ports(self) -> None:
> > self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
> > self.main_session.update_ports(self.ports)
> > +
>
> Is the newline intended?
>
Hm, I don't really remember or see a reason for it. I can remove it.
> > for port in self.ports:
> > self.configure_port_state(port)
> >
> > @@ -172,9 +176,9 @@ def create_interactive_shell(
> >
> > return self.main_session.create_interactive_shell(
> > shell_cls,
> > - app_args,
> > timeout,
> > privileged,
> > + app_args,
> > )
> >
> > def filter_lcores(
> > @@ -205,7 +209,7 @@ def _get_remote_cpus(self) -> None:
> > self._logger.info("Getting CPU information.")
> > self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
> >
> > - def _setup_hugepages(self):
> > + def _setup_hugepages(self) -> None:
> > """
> > Setup hugepages on the Node. Different architectures can supply different
> > amounts of memory for hugepages and numa-based hugepage allocation may need
> > @@ -249,3 +253,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
> > return lambda *args: None
> > else:
> > return func
> > +
> > +
> > +def create_session(
> > + node_config: NodeConfiguration, name: str, logger: DTSLOG
> > +) -> OSSession:
> > + match node_config.os:
> > + case OS.linux:
> > + return LinuxSession(node_config, name, logger)
> > + case _:
> > + raise ConfigurationError(f"Unsupported OS {node_config.os}")
> > diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
> > similarity index 95%
> > rename from dts/framework/remote_session/os_session.py
> > rename to dts/framework/testbed_model/os_session.py
> > index 8a709eac1c..76e595a518 100644
> > --- a/dts/framework/remote_session/os_session.py
> > +++ b/dts/framework/testbed_model/os_session.py
> > @@ -10,19 +10,19 @@
> >
> > from framework.config import Architecture, NodeConfiguration, NodeInfo
> > from framework.logger import DTSLOG
> > -from framework.remote_session.remote import InteractiveShell
> > -from framework.settings import SETTINGS
> > -from framework.testbed_model import LogicalCore
> > -from framework.testbed_model.hw.port import Port
> > -from framework.utils import MesonArgs
> > -
> > -from .remote import (
> > +from framework.remote_session import (
> > CommandResult,
> > InteractiveRemoteSession,
> > + InteractiveShell,
> > RemoteSession,
> > create_interactive_session,
> > create_remote_session,
> > )
> > +from framework.settings import SETTINGS
> > +from framework.utils import MesonArgs
> > +
> > +from .cpu import LogicalCore
> > +from .port import Port
> >
> > InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
> >
> > @@ -85,9 +85,9 @@ def send_command(
> > def create_interactive_shell(
> > self,
> > shell_cls: Type[InteractiveShellType],
> > - eal_parameters: str,
> > timeout: float,
> > privileged: bool,
> > + app_args: str,
>
> Is there a reason why the argument position got changed? I'd guess
> because it's more idiomatic to have the extra arg at the end, but I just
> want to make sure it's intended.
>
Yes, this is very much intended. It's here to unify the method
signature with the signatures of the rest of the methods called down
the line.
I made this API change during API documentation as the different
signatures of basically the same methods would look terrible in the
docs.
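A minimal sketch of the unified ordering (class bodies and names heavily
abbreviated; these are hypothetical stand-ins, not the real implementations):

```python
class OSSession:
    # Stand-in for the real OSSession; only the signature shape matters here.
    def create_interactive_shell(
        self, shell_cls: type, timeout: float, privileged: bool, app_args: str
    ) -> str:
        return (
            f"{shell_cls.__name__}({app_args}, "
            f"timeout={timeout}, privileged={privileged})"
        )


class Node:
    def __init__(self) -> None:
        self.main_session = OSSession()

    def create_interactive_shell(
        self, shell_cls: type, timeout: float, privileged: bool, app_args: str
    ) -> str:
        # Same parameter order as OSSession.create_interactive_shell,
        # so the delegation is a straight pass-through and both methods
        # render identically in the generated docs.
        return self.main_session.create_interactive_shell(
            shell_cls, timeout, privileged, app_args
        )
```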
> > ) -> InteractiveShellType:
> > """
> > See "create_interactive_shell" in SutNode
<snip>
> > diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> > index d27c2c5b5f..f0c916471c 100644
> > --- a/dts/framework/utils.py
> > +++ b/dts/framework/utils.py
> > @@ -7,7 +7,6 @@
> > import json
> > import os
> > import subprocess
> > -import sys
> > from enum import Enum
> > from pathlib import Path
> > from subprocess import SubprocessError
> > @@ -16,35 +15,7 @@
> >
> > from .exception import ConfigurationError
> >
> > -
> > -class StrEnum(Enum):
> > - @staticmethod
> > - def _generate_next_value_(
> > - name: str, start: int, count: int, last_values: object
> > - ) -> str:
> > - return name
> > -
> > - def __str__(self) -> str:
> > - return self.name
> > -
> > -
> > -REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> > -
> > -
> > -def check_dts_python_version() -> None:
> > - if sys.version_info.major < 3 or (
> > - sys.version_info.major == 3 and sys.version_info.minor < 10
> > - ):
> > - print(
> > - RED(
> > - (
> > - "WARNING: DTS execution node's python version is lower than"
> > - "python 3.10, is deprecated and will not work in future releases."
> > - )
> > - ),
> > - file=sys.stderr,
> > - )
> > - print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> > +REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> >
> >
> > def expand_range(range_str: str) -> list[int]:
> > @@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
> > return expanded_range
> >
> >
> > -def get_packet_summaries(packets: list[Packet]):
> > +def get_packet_summaries(packets: list[Packet]) -> str:
> > if len(packets) == 1:
> > packet_summaries = packets[0].summary()
> > else:
> > @@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
> > return f"Packet contents: \n{packet_summaries}"
> >
> >
> > -def RED(text: str) -> str:
> > - return f"\u001B[31;1m{str(text)}\u001B[0m"
> > +class StrEnum(Enum):
> > + @staticmethod
> > + def _generate_next_value_(
> > + name: str, start: int, count: int, last_values: object
> > + ) -> str:
> > + return name
>
> I don't understand this function? I don't see it used anywhere. And the
> parameters are unused?
>
This is an internal method of Enum that defines what value auto()
assigns (and auto() is used plenty).
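A runnable sketch of the hook in action (the Architecture members here are
just an example, not the real enum):

```python
from enum import Enum, auto


class StrEnum(Enum):
    @staticmethod
    def _generate_next_value_(
        name: str, start: int, count: int, last_values: object
    ) -> str:
        # Called by Enum's machinery for every auto() member; returning
        # the member's name makes the value equal to the name.
        return name

    def __str__(self) -> str:
        return self.name


class Architecture(StrEnum):
    # With plain Enum, auto() would assign 1 and 2 here; with the hook
    # above, the values become the strings "i686" and "x86_64".
    i686 = auto()
    x86_64 = auto()
```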
> > +
> > + def __str__(self) -> str:
> > + return self.name
> >
> >
> > class MesonArgs(object):
> > @@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
> > if self._tarball_path and os.path.exists(self._tarball_path):
> > os.remove(self._tarball_path)
> >
> > - def __fspath__(self):
> > + def __fspath__(self) -> str:
> > return str(self._tarball_path)
> > diff --git a/dts/main.py b/dts/main.py
> > index 43311fa847..5d4714b0c3 100755
> > --- a/dts/main.py
> > +++ b/dts/main.py
> > @@ -10,10 +10,17 @@
> >
> > import logging
> >
> > -from framework import dts
> > +from framework import settings
> >
> >
> > def main() -> None:
> > + """Set DTS settings, then run DTS.
> > +
> > + The DTS settings are taken from the command line arguments and the environment variables.
> > + """
> > + settings.SETTINGS = settings.get_settings()
> > + from framework import dts
>
> Why the import *inside* the main?
>
This is actually explained in the docstring added in one of the later
patches, so let me copy-paste it here:

    The DTS settings are taken from the command line arguments and the
    environment variables. The settings object is stored in the
    module-level variable settings.SETTINGS which the entire framework
    uses. After importing the module (or the variable), any changes to
    the variable are not going to be reflected without a re-import. This
    means that the SETTINGS variable must be modified before the settings
    module is imported anywhere else in the framework.
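The binding rule can be shown in isolation; this sketch uses in-memory
modules as stand-ins for the real settings module and its consumers:

```python
import sys
import types

# "settings" module with a module-level SETTINGS default.
settings = types.ModuleType("settings")
settings.SETTINGS = "defaults"
sys.modules["settings"] = settings

# A consumer module that does `from settings import SETTINGS` at import
# time: it copies the *current* binding into its own namespace.
node = types.ModuleType("node")
exec("from settings import SETTINGS", node.__dict__)

# Rebinding after the import is not seen through the copied name...
settings.SETTINGS = "user values"
print(node.SETTINGS)      # still "defaults"
# ...but is seen by code that accesses the attribute via the module.
print(settings.SETTINGS)  # "user values"
```

Hence the runner must assign settings.SETTINGS before any other framework
module gets a chance to import the name.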
> > +
> > dts.run_all()
> >
> >
>
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 02/23] dts: add docstring checker
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 01/23] dts: code adjustments for doc generation Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-07 17:38 ` Yoan Picchi
2023-11-06 17:15 ` [PATCH v5 03/23] dts: add basic developer docs Juraj Linkeš
` (21 subsequent siblings)
23 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Python docstrings are the in-code way to document the code. The
docstring checker of choice is pydocstyle which we're executing from
Pylama, but the current latest versions are not compatible due to [0],
so pin the pydocstyle version to the latest working version.
[0] https://github.com/klen/pylama/issues/232
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 12 ++++++------
dts/pyproject.toml | 6 +++++-
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index f7b3b6d602..a734fa71f0 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -489,20 +489,20 @@ files = [
[[package]]
name = "pydocstyle"
-version = "6.3.0"
+version = "6.1.1"
description = "Python docstring style checker"
optional = false
python-versions = ">=3.6"
files = [
- {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
- {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+ {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
+ {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
]
[package.dependencies]
-snowballstemmer = ">=2.2.0"
+snowballstemmer = "*"
[package.extras]
-toml = ["tomli (>=1.2.3)"]
+toml = ["toml"]
[[package]]
name = "pyflakes"
@@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
+content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6762edfa6b..3943c87c87 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -25,6 +25,7 @@ PyYAML = "^6.0"
types-PyYAML = "^6.0.8"
fabric = "^2.7.1"
scapy = "^2.5.0"
+pydocstyle = "6.1.1"
[tool.poetry.group.dev.dependencies]
mypy = "^0.961"
@@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
[tool.pylama]
-linters = "mccabe,pycodestyle,pyflakes"
+linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
format = "pylint"
max_line_length = 88 # https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html#line-length
+[tool.pylama.linter.pydocstyle]
+convention = "google"
+
[tool.mypy]
python_version = "3.10"
enable_error_code = ["ignore-without-code"]
--
2.34.1
* Re: [PATCH v5 02/23] dts: add docstring checker
2023-11-06 17:15 ` [PATCH v5 02/23] dts: add docstring checker Juraj Linkeš
@ 2023-11-07 17:38 ` Yoan Picchi
0 siblings, 0 replies; 393+ messages in thread
From: Yoan Picchi @ 2023-11-07 17:38 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek
Cc: dev
On 11/6/23 17:15, Juraj Linkeš wrote:
> Python docstrings are the in-code way to document the code. The
> docstring checker of choice is pydocstyle which we're executing from
> Pylama, but the current latest versions are not compatible due to [0],
> so pin the pydocstyle version to the latest working version.
>
> [0] https://github.com/klen/pylama/issues/232
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/poetry.lock | 12 ++++++------
> dts/pyproject.toml | 6 +++++-
> 2 files changed, 11 insertions(+), 7 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index f7b3b6d602..a734fa71f0 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -489,20 +489,20 @@ files = [
>
> [[package]]
> name = "pydocstyle"
> -version = "6.3.0"
> +version = "6.1.1"
> description = "Python docstring style checker"
> optional = false
> python-versions = ">=3.6"
> files = [
> - {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
> - {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
> + {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
> + {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
> ]
>
> [package.dependencies]
> -snowballstemmer = ">=2.2.0"
> +snowballstemmer = "*"
>
> [package.extras]
> -toml = ["tomli (>=1.2.3)"]
> +toml = ["toml"]
>
> [[package]]
> name = "pyflakes"
> @@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
> [metadata]
> lock-version = "2.0"
> python-versions = "^3.10"
> -content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
> +content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 6762edfa6b..3943c87c87 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -25,6 +25,7 @@ PyYAML = "^6.0"
> types-PyYAML = "^6.0.8"
> fabric = "^2.7.1"
> scapy = "^2.5.0"
> +pydocstyle = "6.1.1"
>
> [tool.poetry.group.dev.dependencies]
> mypy = "^0.961"
> @@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
> build-backend = "poetry.core.masonry.api"
>
> [tool.pylama]
> -linters = "mccabe,pycodestyle,pyflakes"
> +linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
> format = "pylint"
> max_line_length = 88 # https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html#line-length
>
> +[tool.pylama.linter.pydocstyle]
> +convention = "google"
> +
> [tool.mypy]
> python_version = "3.10"
> enable_error_code = ["ignore-without-code"]
Reviewed-by: Yoan Picchi <yoan.picchi@arm.com>
* [PATCH v5 03/23] dts: add basic developer docs
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 01/23] dts: code adjustments for doc generation Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 02/23] dts: add docstring checker Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-07 14:39 ` Yoan Picchi
2023-11-06 17:15 ` [PATCH v5 04/23] dts: exceptions docstring update Juraj Linkeš
` (20 subsequent siblings)
23 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Expand the framework contribution guidelines and add how to document the
code with Python docstrings.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
1 file changed, 73 insertions(+)
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 32c18ee472..b1e99107c3 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
The results contain basic statistics of passed/failed test cases and DPDK version.
+Contributing to DTS
+-------------------
+
+There are two areas of contribution: The DTS framework and DTS test suites.
+
+The framework contains the logic needed to run test cases, such as connecting to nodes,
+running DPDK apps and collecting results.
+
+The test cases call APIs from the framework to test their scenarios. Adding test cases may
+require adding code to the framework as well.
+
+
+Framework Coding Guidelines
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When adding code to the DTS framework, pay attention to the rest of the code
+and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
+warnings when some of the basics are not met.
+
+The code must be properly documented with docstrings. The style must conform to
+the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
+See an example of the style
+`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
+For cases which are not covered by the Google style, refer
+to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
+the two style guides, where we deviate or where some additional clarification is helpful:
+
+ * The __init__() methods of classes are documented separately from the docstring of the class
+ itself.
+ * The docstrigs of implemented abstract methods should refer to the superclass's definition
+ if there's no deviation.
+ * Instance variables/attributes should be documented in the docstring of the class
+ in the ``Attributes:`` section.
+ * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
+ attributes which result in instance variables/attributes should also be recorded
+ in the ``Attributes:`` section.
+ * Class variables/attributes, on the other hand, should be documented with ``#:`` above
+ the type annotated line. The description may be omitted if the meaning is obvious.
+ * The Enum and TypedDict also process the attributes in particular ways and should be documented
+ with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
+ * When referencing a parameter of a function or a method in their docstring, don't use
+ any articles and put the parameter into single backticks. This mimics the style of
+ `Python's documentation <https://docs.python.org/3/index.html>`_.
+ * When specifying a value, use double backticks::
+
+ def foo(greet: bool) -> None:
+ """Demonstration of single and double backticks.
+
+ `greet` controls whether ``Hello World`` is printed.
+
+ Args:
+ greet: Whether to print the ``Hello World`` message.
+ """
+ if greet:
+ print(f"Hello World")
+
+ * The docstring maximum line length is the same as the code maximum line length.
+
+
How To Write a Test Suite
-------------------------
@@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
| These methods don't need to be implemented if there's no need for them in a test suite.
In that case, nothing will happen when they're is executed.
+#. **Configuration, traffic and other logic**
+
+ The ``TestSuite`` class contains a variety of methods for anything that
+ a test suite setup or teardown or a test case may need to do.
+
+ The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
+ and use the interactive shell instances directly.
+
+ These are the two main ways to call the framework logic in test suites. If there's any
+ functionality or logic missing from the framework, it should be implemented so that
+ the test suites can use one of these two ways.
+
#. **Test case verification**
Test case verification should be done with the ``verify`` method, which records the result.
@@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
and used by the test suite via the ``sut_node`` field.
+.. _dts_dev_tools:
+
DTS Developer Tools
-------------------
--
2.34.1
* Re: [PATCH v5 03/23] dts: add basic developer docs
2023-11-06 17:15 ` [PATCH v5 03/23] dts: add basic developer docs Juraj Linkeš
@ 2023-11-07 14:39 ` Yoan Picchi
2023-11-08 9:01 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-07 14:39 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek
Cc: dev
On 11/6/23 17:15, Juraj Linkeš wrote:
> Expand the framework contribution guidelines and add how to document the
> code with Python docstrings.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
> 1 file changed, 73 insertions(+)
>
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> index 32c18ee472..b1e99107c3 100644
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
> The results contain basic statistics of passed/failed test cases and DPDK version.
>
>
> +Contributing to DTS
> +-------------------
> +
> +There are two areas of contribution: The DTS framework and DTS test suites.
> +
> +The framework contains the logic needed to run test cases, such as connecting to nodes,
> +running DPDK apps and collecting results.
> +
> +The test cases call APIs from the framework to test their scenarios. Adding test cases may
> +require adding code to the framework as well.
> +
> +
> +Framework Coding Guidelines
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +When adding code to the DTS framework, pay attention to the rest of the code
> +and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
> +warnings when some of the basics are not met.
> +
> +The code must be properly documented with docstrings. The style must conform to
> +the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
> +See an example of the style
> +`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
> +For cases which are not covered by the Google style, refer
> +to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
> +the two style guides, where we deviate or where some additional clarification is helpful:
> +
> + * The __init__() methods of classes are documented separately from the docstring of the class
> + itself.
> + * The docstrigs of implemented abstract methods should refer to the superclass's definition
> + if there's no deviation.
> + * Instance variables/attributes should be documented in the docstring of the class
> + in the ``Attributes:`` section.
> + * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
> + attributes which result in instance variables/attributes should also be recorded
> + in the ``Attributes:`` section.
> + * Class variables/attributes, on the other hand, should be documented with ``#:`` above
> + the type annotated line. The description may be omitted if the meaning is obvious.
> + * The Enum and TypedDict also process the attributes in particular ways and should be documented
> + with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
> + * When referencing a parameter of a function or a method in their docstring, don't use
> + any articles and put the parameter into single backticks. This mimics the style of
> + `Python's documentation <https://docs.python.org/3/index.html>`_.
> + * When specifying a value, use double backticks::
> +
> + def foo(greet: bool) -> None:
> + """Demonstration of single and double backticks.
> +
> + `greet` controls whether ``Hello World`` is printed.
> +
> + Args:
> + greet: Whether to print the ``Hello World`` message.
> + """
> + if greet:
> + print(f"Hello World")
> +
> + * The docstring maximum line length is the same as the code maximum line length.
> +
> +
> How To Write a Test Suite
> -------------------------
>
> @@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
> | These methods don't need to be implemented if there's no need for them in a test suite.
> In that case, nothing will happen when they're is executed.
Not your change, but it does highlight a previous mistake: "they're is".
>
> +#. **Configuration, traffic and other logic**
> +
> + The ``TestSuite`` class contains a variety of methods for anything that
> + a test suite setup or teardown or a test case may need to do.
A three-way "or". There's a need for an Oxford comma: setup, teardown, or a
test case.
> +
> + The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
> + and use the interactive shell instances directly.
> +
> + These are the two main ways to call the framework logic in test suites. If there's any
> + functionality or logic missing from the framework, it should be implemented so that
> + the test suites can use one of these two ways.
> +
> #. **Test case verification**
>
> Test case verification should be done with the ``verify`` method, which records the result.
> @@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
> and used by the test suite via the ``sut_node`` field.
>
>
> +.. _dts_dev_tools:
> +
> DTS Developer Tools
> -------------------
>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v5 03/23] dts: add basic developer docs
2023-11-07 14:39 ` Yoan Picchi
@ 2023-11-08 9:01 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 9:01 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, dev
On Tue, Nov 7, 2023 at 3:40 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/6/23 17:15, Juraj Linkeš wrote:
> > Expand the framework contribution guidelines and add how to document the
> > code with Python docstrings.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
> > 1 file changed, 73 insertions(+)
> >
> > diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> > index 32c18ee472..b1e99107c3 100644
> > --- a/doc/guides/tools/dts.rst
> > +++ b/doc/guides/tools/dts.rst
> > @@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
> > The results contain basic statistics of passed/failed test cases and DPDK version.
> >
> >
> > +Contributing to DTS
> > +-------------------
> > +
> > +There are two areas of contribution: The DTS framework and DTS test suites.
> > +
> > +The framework contains the logic needed to run test cases, such as connecting to nodes,
> > +running DPDK apps and collecting results.
> > +
> > +The test cases call APIs from the framework to test their scenarios. Adding test cases may
> > +require adding code to the framework as well.
> > +
> > +
> > +Framework Coding Guidelines
> > +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> > +
> > +When adding code to the DTS framework, pay attention to the rest of the code
> > +and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
> > +warnings when some of the basics are not met.
> > +
> > +The code must be properly documented with docstrings. The style must conform to
> > +the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
> > +See an example of the style
> > +`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
> > +For cases which are not covered by the Google style, refer
> > +to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
> > +the two style guides, where we deviate or where some additional clarification is helpful:
> > +
> > + * The __init__() methods of classes are documented separately from the docstring of the class
> > + itself.
> > + * The docstrings of implemented abstract methods should refer to the superclass's definition
> > + if there's no deviation.
> > + * Instance variables/attributes should be documented in the docstring of the class
> > + in the ``Attributes:`` section.
> > + * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
> > + attributes which result in instance variables/attributes should also be recorded
> > + in the ``Attributes:`` section.
> > + * Class variables/attributes, on the other hand, should be documented with ``#:`` above
> > + the type annotated line. The description may be omitted if the meaning is obvious.
> > + * The Enum and TypedDict also process the attributes in particular ways and should be documented
> > + with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
> > + * When referencing a parameter of a function or a method in their docstring, don't use
> > + any articles and put the parameter into single backticks. This mimics the style of
> > + `Python's documentation <https://docs.python.org/3/index.html>`_.
> > + * When specifying a value, use double backticks::
> > +
> > + def foo(greet: bool) -> None:
> > + """Demonstration of single and double backticks.
> > +
> > + `greet` controls whether ``Hello World`` is printed.
> > +
> > + Args:
> > + greet: Whether to print the ``Hello World`` message.
> > + """
> > + if greet:
> > + print(f"Hello World")
> > +
> > + * The docstring maximum line length is the same as the code maximum line length.
> > +
> > +
> > How To Write a Test Suite
> > -------------------------
> >
> > @@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
> > | These methods don't need to be implemented if there's no need for them in a test suite.
> > In that case, nothing will happen when they're is executed.
>
> Not your change, but it does highlight a previous mistake : "they're is"
>
Good catch - we'll be adding to this guide in the future so we can fix it then.
> >
> > +#. **Configuration, traffic and other logic**
> > +
> > + The ``TestSuite`` class contains a variety of methods for anything that
> > + a test suite setup or teardown or a test case may need to do.
>
> Three-way or. There's a need for an Oxford comma: setup, teardown, or a
> test case
>
Thanks, I'll change this.
> > +
> > + The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
> > + and use the interactive shell instances directly.
> > +
> > + These are the two main ways to call the framework logic in test suites. If there's any
> > + functionality or logic missing from the framework, it should be implemented so that
> > + the test suites can use one of these two ways.
> > +
> > #. **Test case verification**
> >
> > Test case verification should be done with the ``verify`` method, which records the result.
> > @@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
> > and used by the test suite via the ``sut_node`` field.
> >
> >
> > +.. _dts_dev_tools:
> > +
> > DTS Developer Tools
> > -------------------
> >
>
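The guidelines above prescribe documenting Enum and class attributes with ``#:`` comments so that autogenerated docs pick up the assigned values. A minimal standalone sketch of that convention (loosely mirroring the ``_TarCompressionFormat`` enum from ``dts/framework/utils.py``; the class shown here is illustrative, not the framework's actual code):

```python
from enum import Enum


class TarCompressionFormat(Enum):
    """Illustrative enum using the ``#:`` attribute-doc convention."""

    #: The gzip format, stored with the ``gz`` file extension.
    gzip = "gz"
    #:
    xz = "xz"


print(TarCompressionFormat.xz.value)  # xz
```

Sphinx's autodoc reads the ``#:`` comments as attribute docstrings, so each member appears in the generated docs with its assigned value; a bare ``#:`` is enough when the meaning is obvious from the name.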
* [PATCH v5 04/23] dts: exceptions docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (2 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 03/23] dts: add basic developer docs Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 05/23] dts: settings " Juraj Linkeš
` (19 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/__init__.py | 12 ++++-
dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
2 files changed, 83 insertions(+), 35 deletions(-)
diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
index d551ad4bf0..662e6ccad2 100644
--- a/dts/framework/__init__.py
+++ b/dts/framework/__init__.py
@@ -1,3 +1,13 @@
# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+
+"""Libraries and utilities for running DPDK Test Suite (DTS).
+
+The various modules in the DTS framework offer:
+
+* Connections to nodes, both interactive and non-interactive,
+* A straightforward way to add support for different operating systems of remote nodes,
+* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
+* Pre-test suite setup and post-test suite teardown.
+"""
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 7489c03570..ee1562c672 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -3,8 +3,10 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-User-defined exceptions used across the framework.
+"""DTS exceptions.
+
+The exceptions all have different severities expressed as an integer.
+The highest severity of all raised exceptions is used as the exit code of DTS.
"""
from enum import IntEnum, unique
@@ -13,59 +15,79 @@
@unique
class ErrorSeverity(IntEnum):
- """
- The severity of errors that occur during DTS execution.
+ """The severity of errors that occur during DTS execution.
+
All exceptions are caught and the most severe error is used as return code.
"""
+ #:
NO_ERR = 0
+ #:
GENERIC_ERR = 1
+ #:
CONFIG_ERR = 2
+ #:
REMOTE_CMD_EXEC_ERR = 3
+ #:
SSH_ERR = 4
+ #:
DPDK_BUILD_ERR = 10
+ #:
TESTCASE_VERIFY_ERR = 20
+ #:
BLOCKING_TESTSUITE_ERR = 25
class DTSError(Exception):
- """
- The base exception from which all DTS exceptions are derived.
- Stores error severity.
+ """The base exception from which all DTS exceptions are subclassed.
+
+ Do not use this exception, only use subclassed exceptions.
"""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
class SSHTimeoutError(DTSError):
- """
- Command execution timeout.
- """
+ """The SSH execution of a command timed out."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_command: str
def __init__(self, command: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ command: The executed command.
+ """
self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self._command}"
+ """Add some context to the string representation."""
+ return f"{self._command} execution timed out."
class SSHConnectionError(DTSError):
- """
- SSH connection error.
- """
+ """An unsuccessful SSH connection."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
_errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ host: The hostname to which we're trying to connect.
+ errors: Any errors that occurred during the connection attempt.
+ """
self._host = host
self._errors = [] if errors is None else errors
def __str__(self) -> str:
+ """Include the errors in the string representation."""
message = f"Error trying to connect with {self._host}."
if self._errors:
message += f" Errors encountered while retrying: {', '.join(self._errors)}"
@@ -74,43 +96,53 @@ def __str__(self) -> str:
class SSHSessionDeadError(DTSError):
- """
- SSH session is not alive.
- It can no longer be used.
- """
+ """The SSH session is no longer alive."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
def __init__(self, host: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ host: The hostname of the disconnected node.
+ """
self._host = host
def __str__(self) -> str:
- return f"SSH session with {self._host} has died"
+ """Add some context to the string representation."""
+ return f"SSH session with {self._host} has died."
class ConfigurationError(DTSError):
- """
- Raised when an invalid configuration is encountered.
- """
+ """An invalid configuration."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
class RemoteCommandExecutionError(DTSError):
- """
- Raised when a command executed on a Node returns a non-zero exit status.
- """
+ """An unsuccessful execution of a remote command."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ #: The executed command.
command: str
_command_return_code: int
def __init__(self, command: str, command_return_code: int):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ command: The executed command.
+ command_return_code: The return code of the executed command.
+ """
self.command = command
self._command_return_code = command_return_code
def __str__(self) -> str:
+ """Include both the command and return code in the string representation."""
return (
f"Command {self.command} returned a non-zero exit code: "
f"{self._command_return_code}"
@@ -118,35 +150,41 @@ def __str__(self) -> str:
class RemoteDirectoryExistsError(DTSError):
- """
- Raised when a remote directory to be created already exists.
- """
+ """A directory that exists on a remote node."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
class DPDKBuildError(DTSError):
- """
- Raised when DPDK build fails for any reason.
- """
+ """A DPDK build failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
class TestCaseVerifyError(DTSError):
- """
- Used in test cases to verify the expected behavior.
- """
+ """A test case failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
class BlockingTestSuiteError(DTSError):
+ """A failure in a blocking test suite."""
+
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
_suite_name: str
def __init__(self, suite_name: str) -> None:
+ """Define the meaning of the first argument.
+
+ Args:
+ suite_name: The blocking test suite.
+ """
self._suite_name = suite_name
def __str__(self) -> str:
+ """Add some context to the string representation."""
return f"Blocking suite {self._suite_name} failed."
--
2.34.1
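The new module docstring states that the highest severity of all raised exceptions becomes the DTS exit code. A quick sketch of that rule (the severity values are the ones defined in the patch; the `exit_code` helper is illustrative, not the framework's actual API):

```python
from enum import IntEnum, unique


@unique
class ErrorSeverity(IntEnum):
    """Severity values as defined in the patch above."""

    NO_ERR = 0
    GENERIC_ERR = 1
    CONFIG_ERR = 2
    REMOTE_CMD_EXEC_ERR = 3
    SSH_ERR = 4
    DPDK_BUILD_ERR = 10
    TESTCASE_VERIFY_ERR = 20
    BLOCKING_TESTSUITE_ERR = 25


def exit_code(raised: list[ErrorSeverity]) -> int:
    """Illustrative helper: the return code is the most severe error seen, or 0."""
    return int(max(raised, default=ErrorSeverity.NO_ERR))


print(exit_code([ErrorSeverity.SSH_ERR, ErrorSeverity.CONFIG_ERR]))  # 4
print(exit_code([]))  # 0
```

Because `ErrorSeverity` is an `IntEnum`, `max()` compares members by their integer values directly, which is why the numeric gaps (4, 10, 20, 25) between severities are harmless.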
* [PATCH v5 05/23] dts: settings docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (3 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 04/23] dts: exceptions docstring update Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 06/23] dts: logger and " Juraj Linkeš
` (18 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/settings.py | 101 +++++++++++++++++++++++++++++++++++++-
1 file changed, 100 insertions(+), 1 deletion(-)
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 7f5841d073..787db7c198 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -3,6 +3,70 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+"""Environment variables and command line arguments parsing.
+
+This is a simple module utilizing the built-in argparse module to parse command line arguments,
+augment them with values from environment variables and make them available across the framework.
+
+The command line value takes precedence, followed by the environment variable value,
+followed by the default value defined in this module.
+
+The command line arguments along with the supported environment variables are:
+
+.. option:: --config-file
+.. envvar:: DTS_CFG_FILE
+
+ The path to the YAML test run configuration file.
+
+.. option:: --output-dir, --output
+.. envvar:: DTS_OUTPUT_DIR
+
+ The directory where DTS logs and results are saved.
+
+.. option:: --compile-timeout
+.. envvar:: DTS_COMPILE_TIMEOUT
+
+ The timeout for compiling DPDK.
+
+.. option:: -t, --timeout
+.. envvar:: DTS_TIMEOUT
+
+ The timeout for all DTS operations except for compiling DPDK.
+
+.. option:: -v, --verbose
+.. envvar:: DTS_VERBOSE
+
+ Set to any value to enable logging everything to the console.
+
+.. option:: -s, --skip-setup
+.. envvar:: DTS_SKIP_SETUP
+
+ Set to any value to skip building DPDK.
+
+.. option:: --tarball, --snapshot, --git-ref
+.. envvar:: DTS_DPDK_TARBALL
+
+ The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
+
+.. option:: --test-cases
+.. envvar:: DTS_TESTCASES
+
+ A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
+
+.. option:: --re-run, --re_run
+.. envvar:: DTS_RERUN
+
+ Re-run each test case this many times in case of a failure.
+
+Attributes:
+ SETTINGS: The module level variable storing framework-wide DTS settings.
+
+Typical usage example::
+
+ from framework.settings import SETTINGS
+ foo = SETTINGS.foo
+"""
+
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
@@ -16,6 +80,23 @@
def _env_arg(env_var: str) -> Any:
+ """A helper method augmenting the argparse Action with environment variables.
+
+ If the supplied environment variable is defined, then the default value
+ of the argument is modified. This satisfies the priority order of
+ command line argument > environment variable > default value.
+
+ Arguments with no values (flags) should be defined using the const keyword argument
+ (True or False). When the argument is specified, it will be set to const; if not specified,
+ the default will be stored (possibly modified by the corresponding environment variable).
+
+ Other arguments work the same as default argparse arguments, that is using
+ the default 'store' action.
+
+ Returns:
+ The modified argparse.Action.
+ """
+
class _EnvironmentArgument(argparse.Action):
def __init__(
self,
@@ -68,14 +149,28 @@ def __call__(
@dataclass(slots=True)
class Settings:
+ """Default framework-wide user settings.
+
+ The defaults may be modified at the start of the run.
+ """
+
+ #:
config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ #:
output_dir: str = "output"
+ #:
timeout: float = 15
+ #:
verbose: bool = False
+ #:
skip_setup: bool = False
+ #:
dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ #:
compile_timeout: float = 1200
+ #:
test_cases: list[str] = field(default_factory=list)
+ #:
re_run: int = 0
@@ -169,7 +264,7 @@ def _get_parser() -> argparse.ArgumentParser:
action=_env_arg("DTS_RERUN"),
default=SETTINGS.re_run,
type=int,
- help="[DTS_RERUN] Re-run each test case the specified amount of times "
+ help="[DTS_RERUN] Re-run each test case the specified number of times "
"if a test failure occurs",
)
@@ -177,6 +272,10 @@ def _get_parser() -> argparse.ArgumentParser:
def get_settings() -> Settings:
+ """Create new settings with inputs from the user.
+
+ The inputs are taken from the command line and from environment variables.
+ """
parsed_args = _get_parser().parse_args()
return Settings(
config_file_path=parsed_args.config_file,
--
2.34.1
* [PATCH v5 06/23] dts: logger and settings docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (4 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 05/23] dts: settings " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 07/23] dts: dts runner and main " Juraj Linkeš
` (17 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/logger.py | 72 +++++++++++++++++++++----------
dts/framework/utils.py | 96 ++++++++++++++++++++++++++++++-----------
2 files changed, 121 insertions(+), 47 deletions(-)
diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index bb2991e994..d3eb75a4e4 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -3,9 +3,9 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-DTS logger module with several log level. DTS framework and TestSuite logs
-are saved in different log files.
+"""DTS logger module.
+
+DTS framework and TestSuite logs are saved in different log files.
"""
import logging
@@ -18,19 +18,21 @@
stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
-class LoggerDictType(TypedDict):
- logger: "DTSLOG"
- name: str
- node: str
-
+class DTSLOG(logging.LoggerAdapter):
+ """DTS logger adapter class for framework and testsuites.
-# List for saving all using loggers
-Loggers: list[LoggerDictType] = []
+ The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
+ variable control the verbosity of output. If enabled, all messages will be emitted to the
+ console.
+ The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+ variable modify the directory where the logs will be stored.
-class DTSLOG(logging.LoggerAdapter):
- """
- DTS log class for framework and testsuite.
+ Attributes:
+ node: The additional identifier. Currently unused.
+ sh: The handler which emits logs to console.
+ fh: The handler which emits logs to a file.
+ verbose_fh: Just as fh, but logs with a different, more verbose, format.
"""
_logger: logging.Logger
@@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
verbose_fh: logging.FileHandler
def __init__(self, logger: logging.Logger, node: str = "suite"):
+ """Extend the constructor with additional handlers.
+
+ One handler logs to the console, the other one to a file, with either a regular or verbose
+ format.
+
+ Args:
+ logger: The logger from which to create the logger adapter.
+ node: An additional identifier. Currently unused.
+ """
self._logger = logger
# 1 means log everything, this will be used by file handlers if their level
# is not set
@@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
def logger_exit(self) -> None:
- """
- Remove stream handler and logfile handler.
- """
+ """Remove the stream handler and the logfile handler."""
for handler in (self.sh, self.fh, self.verbose_fh):
handler.flush()
self._logger.removeHandler(handler)
+class _LoggerDictType(TypedDict):
+ logger: DTSLOG
+ name: str
+ node: str
+
+
+# List for saving all loggers in use
+_Loggers: list[_LoggerDictType] = []
+
+
def getLogger(name: str, node: str = "suite") -> DTSLOG:
+ """Get DTS logger adapter identified by name and node.
+
+ An existing logger will be returned if one with the exact name and node already exists.
+ A new one will be created and stored otherwise.
+
+ Args:
+ name: The name of the logger.
+ node: An additional identifier for the logger.
+
+ Returns:
+ A logger uniquely identified by both name and node.
"""
- Get logger handler and if there's no handler for specified Node will create one.
- """
- global Loggers
+ global _Loggers
# return saved logger
- logger: LoggerDictType
- for logger in Loggers:
+ logger: _LoggerDictType
+ for logger in _Loggers:
if logger["name"] == name and logger["node"] == node:
return logger["logger"]
# return new logger
dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
- Loggers.append({"logger": dts_logger, "name": name, "node": node})
+ _Loggers.append({"logger": dts_logger, "name": name, "node": node})
return dts_logger
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index f0c916471c..0613adf7ad 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -3,6 +3,16 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Various utility classes and functions.
+
+These are used in multiple modules across the framework. They're here because
+they provide some non-specific functionality, greatly simplify imports or just don't
+fit elsewhere.
+
+Attributes:
+ REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
+"""
+
import atexit
import json
import os
@@ -19,12 +29,20 @@
def expand_range(range_str: str) -> list[int]:
- """
- Process range string into a list of integers. There are two possible formats:
- n - a single integer
- n-m - a range of integers
+ """Process `range_str` into a list of integers.
+
+ There are two possible formats of `range_str`:
+
+ * ``n`` - a single integer,
+ * ``n-m`` - a range of integers.
- The returned range includes both n and m. Empty string returns an empty list.
+ The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
+
+ Args:
+ range_str: The range to expand.
+
+ Returns:
+ All the numbers from the range.
"""
expanded_range: list[int] = []
if range_str:
@@ -39,6 +57,14 @@ def expand_range(range_str: str) -> list[int]:
def get_packet_summaries(packets: list[Packet]) -> str:
+ """Format a string summary from `packets`.
+
+ Args:
+ packets: The packets to format.
+
+ Returns:
+ The summary of `packets`.
+ """
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -49,6 +75,8 @@ def get_packet_summaries(packets: list[Packet]) -> str:
class StrEnum(Enum):
+ """Enum with members stored as strings."""
+
@staticmethod
def _generate_next_value_(
name: str, start: int, count: int, last_values: object
@@ -56,22 +84,29 @@ def _generate_next_value_(
return name
def __str__(self) -> str:
+ """The string representation is the name of the member."""
return self.name
class MesonArgs(object):
- """
- Aggregate the arguments needed to build DPDK:
- default_library: Default library type, Meson allows "shared", "static" and "both".
- Defaults to None, in which case the argument won't be used.
- Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
- Do not use -D with them, for example:
- meson_args = MesonArgs(enable_kmods=True).
- """
+ """Aggregate the arguments needed to build DPDK."""
_default_library: str
def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
+ """Initialize the meson arguments.
+
+ Args:
+ default_library: The default library type, Meson supports ``shared``, ``static`` and
+ ``both``. Defaults to :data:`None`, in which case the argument won't be used.
+ dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Example:
+ ::
+
+ meson_args = MesonArgs(enable_kmods=True)
+ """
self._default_library = (
f"--default-library={default_library}" if default_library else ""
)
@@ -83,6 +118,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
)
def __str__(self) -> str:
+ """The actual args."""
return " ".join(f"{self._default_library} {self._dpdk_args}".split())
@@ -93,35 +129,33 @@ class _TarCompressionFormat(StrEnum):
and Enum values are the associated file extensions.
"""
+ #:
gzip = "gz"
+ #:
compress = "Z"
+ #:
bzip2 = "bz2"
+ #:
lzip = "lz"
+ #:
lzma = "lzma"
+ #:
lzop = "lzo"
+ #:
xz = "xz"
+ #:
zstd = "zst"
class DPDKGitTarball(object):
- """Create a compressed tarball of DPDK from the repository.
-
- The DPDK version is specified with git object git_ref.
- The tarball will be compressed with _TarCompressionFormat,
- which must be supported by the DTS execution environment.
- The resulting tarball will be put into output_dir.
+ """Compressed tarball of DPDK from the repository.
- The class supports the os.PathLike protocol,
+ The class supports the :class:`os.PathLike` protocol,
which is used to get the Path of the tarball::
from pathlib import Path
tarball = DPDKGitTarball("HEAD", "output")
tarball_path = Path(tarball)
-
- Arguments:
- git_ref: A git commit ID, tag ID or tree ID.
- output_dir: The directory where to put the resulting tarball.
- tar_compression_format: The compression format to use.
"""
_git_ref: str
@@ -136,6 +170,17 @@ def __init__(
output_dir: str,
tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
):
+ """Create the tarball during initialization.
+
+ The DPDK version is specified with `git_ref`. The tarball will be compressed with
+ `tar_compression_format`, which must be supported by the DTS execution environment.
+ The resulting tarball will be put into `output_dir`.
+
+ Args:
+ git_ref: A git commit ID, tag ID or tree ID.
+ output_dir: The directory where to put the resulting tarball.
+ tar_compression_format: The compression format to use.
+ """
self._git_ref = git_ref
self._tar_compression_format = tar_compression_format
@@ -204,4 +249,5 @@ def _delete_tarball(self) -> None:
os.remove(self._tarball_path)
def __fspath__(self) -> str:
+ """The os.PathLike protocol implementation."""
return str(self._tarball_path)
--
2.34.1
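The behavior documented in the reworked `expand_range` docstring can be checked with a quick standalone sketch (simplified from the actual implementation in ``dts/framework/utils.py``, which also raises on malformed input):

```python
def expand_range(range_str: str) -> list[int]:
    """Simplified sketch of the documented behavior."""
    expanded: list[int] = []
    if range_str:
        # "n" expands to [n]; "n-m" expands to [n, ..., m], inclusive of both ends.
        boundaries = range_str.split("-")
        expanded.extend(range(int(boundaries[0]), int(boundaries[-1]) + 1))
    return expanded


print(expand_range("2-4"))  # [2, 3, 4]
print(expand_range("7"))    # [7]
print(expand_range(""))     # []
```

Using `boundaries[-1]` makes the single-integer case fall out naturally: for ``"7"`` the first and last boundary are the same element, yielding the one-element range.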
* [PATCH v5 07/23] dts: dts runner and main docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (5 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 06/23] dts: logger and " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 08/23] dts: test suite " Juraj Linkeš
` (16 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
dts/main.py | 8 ++-
2 files changed, 112 insertions(+), 24 deletions(-)
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 4c7fb0c40a..331fed7dc4 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,33 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+r"""Test suite runner module.
+
+A DTS run is split into stages:
+
+ #. Execution stage,
+ #. Build target stage,
+ #. Test suite stage,
+ #. Test case stage.
+
+The module is responsible for running tests on testbeds defined in the test run configuration.
+Each setup or teardown of each stage is recorded in a :class:`~framework.test_result.DTSResult` or
+one of its subclasses. The test case results are also recorded.
+
+If an error occurs, the current stage is aborted, the error is recorded and the run continues in
+the next iteration of the same stage. The return code is the highest `severity` of all
+:class:`~framework.exception.DTSError`\s.
+
+Example:
+ An error occurs in a build target setup. The current build target is aborted and the run
+ continues with the next build target. If the errored build target was the last one in the given
+ execution, the next execution begins.
+
+Attributes:
+ dts_logger: The logger instance used in this module.
+ result: The top level result used in the module.
+"""
+
import sys
from .config import (
@@ -23,9 +50,38 @@
def run_all() -> None:
- """
- The main process of DTS. Runs all build targets in all executions from the main
- config file.
+ """Run all build targets in all executions from the test run configuration.
+
+ Before running test suites, executions and build targets are first set up.
+ The executions and build targets defined in the test run configuration are iterated over.
+ The executions define which tests to run and where to run them and build targets define
+ the DPDK build setup.
+
+ The tests suites are set up for each execution/build target tuple and each scheduled
+ test case within the test suite is set up, executed and torn down. After all test cases
+ have been executed, the test suite is torn down and the next build target will be tested.
+
+ All the nested steps look like this:
+
+ #. Execution setup
+
+ #. Build target setup
+
+ #. Test suite setup
+
+ #. Test case setup
+ #. Test case logic
+ #. Test case teardown
+
+ #. Test suite teardown
+
+ #. Build target teardown
+
+ #. Execution teardown
+
+ The test cases are filtered according to the specification in the test run configuration and
+ the :option:`--test-cases` command line argument or
+ the :envvar:`DTS_TESTCASES` environment variable.
"""
global dts_logger
global result
@@ -87,6 +143,8 @@ def run_all() -> None:
def _check_dts_python_version() -> None:
+ """Check the required Python version - v3.10."""
+
def RED(text: str) -> str:
return f"\u001B[31;1m{str(text)}\u001B[0m"
@@ -111,9 +169,16 @@ def _run_execution(
execution: ExecutionConfiguration,
result: DTSResult,
) -> None:
- """
- Run the given execution. This involves running the execution setup as well as
- running all build targets in the given execution.
+ """Run the given execution.
+
+ This involves running the execution setup as well as running all build targets
+ in the given execution. After that, execution teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: An execution's test run configuration.
+ result: The top level result object.
"""
dts_logger.info(
f"Running execution with SUT '{execution.system_under_test_node.name}'."
@@ -150,8 +215,18 @@ def _run_build_target(
execution: ExecutionConfiguration,
execution_result: ExecutionResult,
) -> None:
- """
- Run the given build target.
+ """Run the given build target.
+
+ This involves running the build target setup as well as running all test suites
+ in the given execution the build target is defined in.
+ After that, build target teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ build_target: A build target's test run configuration.
+ execution: The build target's execution's test run configuration.
+ execution_result: The execution level result object associated with the execution.
"""
dts_logger.info(f"Running build target '{build_target.name}'.")
build_target_result = execution_result.add_build_target(build_target)
@@ -183,10 +258,17 @@ def _run_all_suites(
execution: ExecutionConfiguration,
build_target_result: BuildTargetResult,
) -> None:
- """
- Use the given build_target to run execution's test suites
- with possibly only a subset of test cases.
- If no subset is specified, run all test cases.
+ """Run the execution's test suites (possibly only a subset) using the current build target.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
"""
end_build_target = False
if not execution.skip_smoke_tests:
@@ -215,16 +297,22 @@ def _run_single_suite(
build_target_result: BuildTargetResult,
test_suite_config: TestSuiteConfig,
) -> None:
- """Runs a single test suite.
+ """Run all test suites in a single test suite module.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
Args:
- sut_node: Node to run tests on.
- execution: Execution the test case belongs to.
- build_target_result: Build target configuration test case is run on
- test_suite_config: Test suite configuration
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
+ test_suite_config: Test suite test run configuration specifying the test suite module
+ and possibly a subset of test cases of test suites in that module.
Raises:
- BlockingTestSuiteError: If a test suite that was marked as blocking fails.
+ BlockingTestSuiteError: If a blocking test suite fails.
"""
try:
full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
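The dotted-path construction above feeds a dynamic import. A standalone sketch of that pattern (the error handling is an assumption, not the exact DTS code):

```python
import importlib

# Sketch: look up a test suite module by name at runtime. The
# "tests.TestSuite_" prefix mirrors the f-string above.
def import_suite_module(suite_name: str, prefix: str = "tests.TestSuite_"):
    full_suite_path = f"{prefix}{suite_name}"
    try:
        return importlib.import_module(full_suite_path)
    except ModuleNotFoundError as e:
        raise ValueError(f"Test suite module '{full_suite_path}' not found.") from e
```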
@@ -248,9 +336,7 @@ def _run_single_suite(
def _exit_dts() -> None:
- """
- Process all errors and exit with the proper exit code.
- """
+ """Process all errors and exit with the proper exit code."""
result.process()
if dts_logger:
diff --git a/dts/main.py b/dts/main.py
index 5d4714b0c3..f703615d11 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -4,9 +4,7 @@
# Copyright(c) 2022 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
-"""
-A test framework for testing DPDK.
-"""
+"""The DTS executable."""
import logging
@@ -17,6 +15,10 @@ def main() -> None:
"""Set DTS settings, then run DTS.
The DTS settings are taken from the command line arguments and the environment variables.
+ The settings object is stored in the module-level variable settings.SETTINGS which the entire
+ framework uses. After importing the module (or the variable), any changes to the variable are
+ not going to be reflected without a re-import. This means that the SETTINGS variable must
+ be modified before the settings module is imported anywhere else in the framework.
"""
settings.SETTINGS = settings.get_settings()
from framework import dts
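The import-order caveat documented above is standard Python behavior: a `from module import NAME` copies the binding, so a later rebinding of `module.NAME` is invisible to that copy. A minimal, self-contained illustration using a simulated module (not the real `settings`):

```python
import types

# Simulate a module with a module-level SETTINGS variable.
settings = types.ModuleType("settings")
settings.SETTINGS = "defaults"

# A from-import style copy of the binding, taken *before* reassignment...
SETTINGS_COPY = settings.SETTINGS

# ...does not see this rebinding, while attribute access does.
settings.SETTINGS = "parsed from CLI"
```

This is why `settings.SETTINGS` must be assigned before any other framework module imports it.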
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 08/23] dts: test suite docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (6 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 07/23] dts: dts runner and main " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 09/23] dts: test result " Juraj Linkeš
` (15 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_suite.py | 223 +++++++++++++++++++++++++++---------
1 file changed, 168 insertions(+), 55 deletions(-)
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index d53553bf34..8daac35818 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -2,8 +2,19 @@
# Copyright(c) 2010-2014 Intel Corporation
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Base class for creating DTS test cases.
+"""Features common to all test suites.
+
+The module defines the :class:`TestSuite` class which doesn't contain any test cases, and as such
+must be extended by subclasses which add test cases. The :class:`TestSuite` contains the basics
+needed by subclasses:
+
+ * Test suite and test case execution flow,
+ * Testbed (SUT, TG) configuration,
+ * Packet sending and verification,
+ * Test case verification.
+
+The module also defines a function, :func:`get_test_suites`,
+for gathering test suites from a Python module.
"""
import importlib
@@ -31,25 +42,44 @@
class TestSuite(object):
- """
- The base TestSuite class provides methods for handling basic flow of a test suite:
- * test case filtering and collection
- * test suite setup/cleanup
- * test setup/cleanup
- * test case execution
- * error handling and results storage
- Test cases are implemented by derived classes. Test cases are all methods
- starting with test_, further divided into performance test cases
- (starting with test_perf_) and functional test cases (all other test cases).
- By default, all test cases will be executed. A list of testcase str names
- may be specified in conf.yaml or on the command line
- to filter which test cases to run.
- The methods named [set_up|tear_down]_[suite|test_case] should be overridden
- in derived classes if the appropriate suite/test case fixtures are needed.
+ """The base class with methods for handling the basic flow of a test suite.
+
+ * Test case filtering and collection,
+ * Test suite setup/cleanup,
+ * Test setup/cleanup,
+ * Test case execution,
+ * Error handling and results storage.
+
+ Test cases are implemented by subclasses. Test cases are all methods starting with ``test_``,
+ further divided into performance test cases (starting with ``test_perf_``)
+ and functional test cases (all other test cases).
+
+ By default, all test cases will be executed. A list of testcase names may be specified
+ in the YAML test run configuration file and in the :option:`--test-cases` command line argument
+ or in the :envvar:`DTS_TESTCASES` environment variable to filter which test cases to run.
+ The union of both lists will be used. Any unknown test cases from the latter lists
+ will be silently ignored.
+
+ If the :option:`--re-run` command line argument or the :envvar:`DTS_RERUN` environment variable
+ is set, in case of a test case failure, the test case will be executed again until it passes
+ or it fails that many times in addition to the first failure.
+
+ The methods named ``[set_up|tear_down]_[suite|test_case]`` should be overridden in subclasses
+ if the appropriate test suite/test case fixtures are needed.
+
+ The test suite is aware of the testbed (the SUT and TG) it's running on. From this, it can
+ properly choose the IP addresses and other configuration that must be tailored to the testbed.
+
+ Attributes:
+ sut_node: The SUT node where the test suite is running.
+ tg_node: The TG node where the test suite is running.
+ is_blocking: Whether the test suite is blocking. A failure of a blocking test suite
+ will block the execution of all subsequent test suites in the current build target.
"""
sut_node: SutNode
- is_blocking = False
+ tg_node: TGNode
+ is_blocking: bool = False
_logger: DTSLOG
_test_cases_to_run: list[str]
_func: bool
@@ -72,6 +102,19 @@ def __init__(
func: bool,
build_target_result: BuildTargetResult,
):
+ """Initialize the test suite testbed information and basic configuration.
+
+ Process what test cases to run, create the associated :class:`TestSuiteResult`,
+ find links between ports and set up default IP addresses to be used when configuring them.
+
+ Args:
+ sut_node: The SUT node where the test suite will run.
+ tg_node: The TG node where the test suite will run.
+ test_cases: The list of test cases to execute.
+ If empty, all test cases will be executed.
+ func: Whether to run functional tests.
+ build_target_result: The build target result this test suite is run in.
+ """
self.sut_node = sut_node
self.tg_node = tg_node
self._logger = getLogger(self.__class__.__name__)
@@ -95,6 +138,7 @@ def __init__(
self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
def _process_links(self) -> None:
+ """Construct links between SUT and TG ports."""
for sut_port in self.sut_node.ports:
for tg_port in self.tg_node.ports:
if (sut_port.identifier, sut_port.peer) == (
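The pairing condition above can be seen in isolation with a simplified port type (the real DTS `Port` has more fields):

```python
from typing import NamedTuple

# Hypothetical, reduced Port shape for illustration.
class Port(NamedTuple):
    identifier: str
    peer: str

def find_links(sut_ports: list[Port], tg_ports: list[Port]) -> list[tuple[Port, Port]]:
    """Pair a SUT port with a TG port when each names the other as its peer."""
    links = []
    for sut_port in sut_ports:
        for tg_port in tg_ports:
            # A link exists when the identifiers and peers mirror each other.
            if (sut_port.identifier, sut_port.peer) == (tg_port.peer, tg_port.identifier):
                links.append((sut_port, tg_port))
    return links
```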
@@ -106,27 +150,42 @@ def _process_links(self) -> None:
)
def set_up_suite(self) -> None:
- """
- Set up test fixtures common to all test cases; this is done before
- any test case is run.
+ """Set up test fixtures common to all test cases.
+
+ This is done before any test case has been run.
"""
def tear_down_suite(self) -> None:
- """
- Tear down the previously created test fixtures common to all test cases.
+ """Tear down the previously created test fixtures common to all test cases.
+
+ This is done after all tests have been run.
"""
def set_up_test_case(self) -> None:
- """
- Set up test fixtures before each test case.
+ """Set up test fixtures before each test case.
+
+ This is done before *each* test case.
"""
def tear_down_test_case(self) -> None:
- """
- Tear down the previously created test fixtures after each test case.
+ """Tear down the previously created test fixtures after each test case.
+
+ This is done after *each* test case.
"""
def configure_testbed_ipv4(self, restore: bool = False) -> None:
+ """Configure IPv4 addresses on all testbed ports.
+
+ The configured ports are:
+
+ * SUT ingress port,
+ * SUT egress port,
+ * TG ingress port,
+ * TG egress port.
+
+ Args:
+ restore: If :data:`True`, will remove the configuration instead.
+ """
delete = True if restore else False
enable = False if restore else True
self._configure_ipv4_forwarding(enable)
@@ -153,11 +212,13 @@ def _configure_ipv4_forwarding(self, enable: bool) -> None:
def send_packet_and_capture(
self, packet: Packet, duration: float = 1
) -> list[Packet]:
- """
- Send a packet through the appropriate interface and
- receive on the appropriate interface.
- Modify the packet with l3/l2 addresses corresponding
- to the testbed and desired traffic.
+ """Send and receive `packet` using the associated TG.
+
+ Send `packet` through the appropriate interface and receive on the appropriate interface.
+ Modify the packet with l3/l2 addresses corresponding to the testbed and desired traffic.
+
+ Returns:
+ A list of received packets.
"""
packet = self._adjust_addresses(packet)
return self.tg_node.send_packet_and_capture(
@@ -165,13 +226,25 @@ def send_packet_and_capture(
)
def get_expected_packet(self, packet: Packet) -> Packet:
+ """Inject the proper L2/L3 addresses into `packet`.
+
+ Args:
+ packet: The packet to modify.
+
+ Returns:
+ `packet` with injected L2/L3 addresses.
+ """
return self._adjust_addresses(packet, expected=True)
def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
- """
+ """Add L2 and L3 addresses to `packet` in both directions.
+
Assumptions:
- Two links between SUT and TG, one link is TG -> SUT,
- the other SUT -> TG.
+ Two links between SUT and TG, one link is TG -> SUT, the other SUT -> TG.
+
+ Args:
+ packet: The packet to modify.
+ expected: If :data:`True`, the direction is SUT -> TG, otherwise the direction is TG -> SUT.
"""
if expected:
# The packet enters the TG from SUT
@@ -197,6 +270,19 @@ def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
return Ether(packet.build())
def verify(self, condition: bool, failure_description: str) -> None:
+ """Verify `condition` and handle failures.
+
+ When `condition` is :data:`False`, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ condition: The condition to check.
+ failure_description: A short description of the failure
+ that will be stored in the raised exception.
+
+ Raises:
+ TestCaseVerifyError: `condition` is :data:`False`.
+ """
if not condition:
self._fail_test_case_verify(failure_description)
@@ -216,6 +302,19 @@ def _fail_test_case_verify(self, failure_description: str) -> None:
def verify_packets(
self, expected_packet: Packet, received_packets: list[Packet]
) -> None:
+ """Verify that `expected_packet` has been received.
+
+ Go through `received_packets` and check that `expected_packet` is among them.
+ If not, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ expected_packet: The packet we're expecting to receive.
+ received_packets: The packets where we're looking for `expected_packet`.
+
+ Raises:
+ TestCaseVerifyError: `expected_packet` is not among `received_packets`.
+ """
for received_packet in received_packets:
if self._compare_packets(expected_packet, received_packet):
break
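The search-and-`break` loop above is the classic `for`/`else` membership check: the `else` branch fires only when no match was found. A self-contained sketch with a stand-in comparison (dicts instead of scapy packets):

```python
# Sketch of the verification loop: search received packets for a match and
# fail if none is found. `packets_match` stands in for the real comparison.
class VerifyError(Exception):
    pass

def packets_match(expected: dict, received: dict) -> bool:
    # Hypothetical comparison: every expected header field must be present.
    return all(received.get(k) == v for k, v in expected.items())

def verify_packets_sketch(expected: dict, received_packets: list[dict]) -> None:
    for received in received_packets:
        if packets_match(expected, received):
            break
    else:  # no break: the expected packet was never seen
        raise VerifyError(f"Expected packet not among {len(received_packets)} received.")
```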
@@ -303,10 +402,14 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
return True
def run(self) -> None:
- """
- Setup, execute and teardown the whole suite.
- Suite execution consists of running all test cases scheduled to be executed.
- A test cast run consists of setup, execution and teardown of said test case.
+ """Set up, execute and tear down the whole suite.
+
+ Test suite execution consists of running all test cases scheduled to be executed.
+ A test case run consists of setup, execution and teardown of said test case.
+
+ Record the setup and the teardown and handle failures.
+
+ The list of scheduled test cases is constructed when creating the :class:`TestSuite` object.
"""
test_suite_name = self.__class__.__name__
@@ -338,9 +441,7 @@ def run(self) -> None:
raise BlockingTestSuiteError(test_suite_name)
def _execute_test_suite(self) -> None:
- """
- Execute all test cases scheduled to be executed in this suite.
- """
+ """Execute all test cases scheduled to be executed in this suite."""
if self._func:
for test_case_method in self._get_functional_test_cases():
test_case_name = test_case_method.__name__
@@ -357,14 +458,18 @@ def _execute_test_suite(self) -> None:
self._run_test_case(test_case_method, test_case_result)
def _get_functional_test_cases(self) -> list[MethodType]:
- """
- Get all functional test cases.
+ """Get all functional test cases defined in this TestSuite.
+
+ Returns:
+ The list of functional test cases of this TestSuite.
"""
return self._get_test_cases(r"test_(?!perf_)")
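The pattern passed above, `r"test_(?!perf_)"`, uses a negative lookahead: it matches names starting with ``test_`` that are *not* immediately followed by ``perf_``, i.e. functional test cases. For illustration:

```python
import re

# The two prefixes the docstrings above describe: functional test cases are
# "test_" names excluding the performance "test_perf_" prefix.
FUNC_PATTERN = r"test_(?!perf_)"
PERF_PATTERN = r"test_perf_"

def classify(name: str) -> str:
    if re.match(PERF_PATTERN, name):
        return "performance"
    if re.match(FUNC_PATTERN, name):
        return "functional"
    return "not a test case"
```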
def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
- """
- Return a list of test cases matching test_case_regex.
+ """Return a list of test cases matching test_case_regex.
+
+ Returns:
+ The list of test cases matching test_case_regex of this TestSuite.
"""
self._logger.debug(f"Searching for test cases in {self.__class__.__name__}.")
filtered_test_cases = []
@@ -378,9 +483,7 @@ def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
return filtered_test_cases
def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool:
- """
- Check whether the test case should be executed.
- """
+ """Check whether the test case should be scheduled to be executed."""
match = bool(re.match(test_case_regex, test_case_name))
if self._test_cases_to_run:
return match and test_case_name in self._test_cases_to_run
@@ -390,9 +493,9 @@ def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool
def _run_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Setup, execute and teardown a test case in this suite.
- Exceptions are caught and recorded in logs and results.
+ """Set up, execute and tear down a test case in this suite.
+
+ Record the result of the setup and the teardown and handle failures.
"""
test_case_name = test_case_method.__name__
@@ -427,9 +530,7 @@ def _run_test_case(
def _execute_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Execute one test case and handle failures.
- """
+ """Execute one test case, record the result and handle failures."""
test_case_name = test_case_method.__name__
try:
self._logger.info(f"Starting test case execution: {test_case_name}")
@@ -452,6 +553,18 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
+ r"""Find all :class:`TestSuite`\s in a Python module.
+
+ Args:
+ testsuite_module_path: The path to the Python module.
+
+ Returns:
+ The list of :class:`TestSuite`\s found within the Python module.
+
+ Raises:
+ ConfigurationError: The test suite module was not found.
+ """
+
def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
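The predicate above keeps strict subclasses of :class:`TestSuite` and tolerates non-class members (where ``issubclass`` raises ``TypeError``). A condensed sketch of just that predicate, with illustrative names (the real code scans a module's members via ``inspect``):

```python
# Sketch of the discovery predicate: keep strict subclasses of TestSuite
# and skip everything else, including objects that are not classes at all.
class TestSuite:
    pass

def find_test_suites(objects: list) -> list[type]:
    def is_test_suite(obj) -> bool:
        try:
            return issubclass(obj, TestSuite) and obj is not TestSuite
        except TypeError:  # obj is not a class
            return False
    return [obj for obj in objects if is_test_suite(obj)]

class TestHello(TestSuite):
    pass
```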
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 09/23] dts: test result docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (7 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 08/23] dts: test suite " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 10/23] dts: config " Juraj Linkeš
` (14 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_result.py | 292 ++++++++++++++++++++++++++++-------
1 file changed, 234 insertions(+), 58 deletions(-)
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 603e18872c..f553948454 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -2,8 +2,25 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-Generic result container and reporters
+r"""Record and process DTS results.
+
+The results are recorded in a hierarchical manner:
+
+ * :class:`DTSResult` contains
+ * :class:`ExecutionResult` contains
+ * :class:`BuildTargetResult` contains
+ * :class:`TestSuiteResult` contains
+ * :class:`TestCaseResult`
+
+Each result may contain multiple lower level results, e.g. there are multiple
+:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
+The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
+which also defines some common behaviors in its methods.
+
+Each result class has its own idiosyncrasies which they implement in overridden methods.
+
+The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+variable modify the directory where the files with results will be stored.
"""
import os.path
@@ -26,26 +43,34 @@
class Result(Enum):
- """
- An Enum defining the possible states that
- a setup, a teardown or a test case may end up in.
- """
+ """The possible states that a setup, a teardown or a test case may end up in."""
+ #:
PASS = auto()
+ #:
FAIL = auto()
+ #:
ERROR = auto()
+ #:
SKIP = auto()
def __bool__(self) -> bool:
+ """Only PASS is True."""
return self is self.PASS
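The ``__bool__`` override above makes only ``PASS`` truthy, so results can be used directly in conditions. Without the override, every enum member would be truthy. A condensed, self-contained version:

```python
from enum import Enum, auto

class Result(Enum):
    PASS = auto()
    FAIL = auto()
    ERROR = auto()
    SKIP = auto()

    def __bool__(self) -> bool:
        # Only PASS is truthy; every other state reads as a failure in conditions.
        return self is Result.PASS
```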
class FixtureResult(object):
- """
- A record that stored the result of a setup or a teardown.
- The default is FAIL because immediately after creating the object
- the setup of the corresponding stage will be executed, which also guarantees
- the execution of teardown.
+ """A record that stores the result of a setup or a teardown.
+
+ FAIL is a sensible default since it prevents false positives
+ (which could happen if the default was PASS).
+
+ Preventing false positives or other false results is preferable since a failure
+ is most likely to be investigated (the other false results may not be investigated at all).
+
+ Attributes:
+ result: The associated result.
+ error: The error in case of a failure.
"""
result: Result
@@ -56,21 +81,32 @@ def __init__(
result: Result = Result.FAIL,
error: Exception | None = None,
):
+ """Initialize the constructor with the fixture result and store a possible error.
+
+ Args:
+ result: The result to store.
+ error: The error which happened when a failure occurred.
+ """
self.result = result
self.error = error
def __bool__(self) -> bool:
+ """A wrapper around the stored :class:`Result`."""
return bool(self.result)
class Statistics(dict):
- """
- A helper class used to store the number of test cases by its result
- along a few other basic information.
- Using a dict provides a convenient way to format the data.
+ """How many test cases ended in which result state, along with some other basic information.
+
+ Subclassing :class:`dict` provides a convenient way to format the data.
"""
def __init__(self, dpdk_version: str | None):
+ """Extend the constructor with relevant keys.
+
+ Args:
+ dpdk_version: The version of tested DPDK.
+ """
super(Statistics, self).__init__()
for result in Result:
self[result.name] = 0
@@ -78,8 +114,17 @@ def __init__(self, dpdk_version: str | None):
self["DPDK VERSION"] = dpdk_version
def __iadd__(self, other: Result) -> "Statistics":
- """
- Add a Result to the final count.
+ """Add a Result to the final count.
+
+ Example:
+ stats: Statistics = Statistics(None) # empty Statistics
+ stats += Result.PASS # add a Result to `stats`
+
+ Args:
+ other: The Result to add to this statistics object.
+
+ Returns:
+ The modified statistics object.
"""
self[other.name] += 1
self["PASS RATE"] = (
@@ -90,9 +135,7 @@ def __iadd__(self, other: Result) -> "Statistics":
return self
def __str__(self) -> str:
- """
- Provide a string representation of the data.
- """
+ """Each line contains the formatted key = value pair."""
stats_str = ""
for key, value in self.items():
stats_str += f"{key:<12} = {value}\n"
@@ -102,10 +145,16 @@ def __str__(self) -> str:
class BaseResult(object):
- """
- The Base class for all results. Stores the results of
- the setup and teardown portions of the corresponding stage
- and a list of results from each inner stage in _inner_results.
+ """Common data and behavior of DTS results.
+
+ Stores the results of the setup and teardown portions of the corresponding stage.
+ The hierarchical nature of DTS results is captured recursively in an internal list.
+ A stage is each level in this particular hierarchy (pre-execution or the top-most level,
+ execution, build target, test suite and test case).
+
+ Attributes:
+ setup_result: The result of the setup of the particular stage.
+ teardown_result: The results of the teardown of the particular stage.
"""
setup_result: FixtureResult
@@ -113,15 +162,28 @@ class BaseResult(object):
_inner_results: MutableSequence["BaseResult"]
def __init__(self):
+ """Initialize the constructor."""
self.setup_result = FixtureResult()
self.teardown_result = FixtureResult()
self._inner_results = []
def update_setup(self, result: Result, error: Exception | None = None) -> None:
+ """Store the setup result.
+
+ Args:
+ result: The result of the setup.
+ error: The error that occurred in case of a failure.
+ """
self.setup_result.result = result
self.setup_result.error = error
def update_teardown(self, result: Result, error: Exception | None = None) -> None:
+ """Store the teardown result.
+
+ Args:
+ result: The result of the teardown.
+ error: The error that occurred in case of a failure.
+ """
self.teardown_result.result = result
self.teardown_result.error = error
@@ -141,27 +203,55 @@ def _get_inner_errors(self) -> list[Exception]:
]
def get_errors(self) -> list[Exception]:
+ """Compile errors from the whole result hierarchy.
+
+ Returns:
+ The errors from setup, teardown and all errors found in the whole result hierarchy.
+ """
return self._get_setup_teardown_errors() + self._get_inner_errors()
def add_stats(self, statistics: Statistics) -> None:
+ """Collate stats from the whole result hierarchy.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be collated.
+ """
for inner_result in self._inner_results:
inner_result.add_stats(statistics)
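The recursive ``add_stats`` above, together with the leaf override in :class:`TestCaseResult` further below, forms a composite: inner nodes delegate to their children and only the leaves update the counters. A reduced sketch of that shape (illustrative names, plain dict for the stats):

```python
# Sketch of the recursive aggregation: inner nodes delegate to their children,
# and the leaf (test case) override is what actually records a result.
class BaseResultSketch:
    def __init__(self):
        self._inner_results: list["BaseResultSketch"] = []

    def add_stats(self, statistics: dict) -> None:
        for inner in self._inner_results:
            inner.add_stats(statistics)

class TestCaseResultSketch(BaseResultSketch):
    def __init__(self, result_name: str):
        super().__init__()
        self.result_name = result_name

    def add_stats(self, statistics: dict) -> None:
        # Leaf of the hierarchy: stop the recursion and record this result.
        statistics[self.result_name] = statistics.get(self.result_name, 0) + 1
```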
class TestCaseResult(BaseResult, FixtureResult):
- """
- The test case specific result.
- Stores the result of the actual test case.
- Also stores the test case name.
+ r"""The test case specific result.
+
+ Stores the result of the actual test case. This is done by adding an extra superclass
+ in :class:`FixtureResult`. The setup and teardown results are :class:`FixtureResult`\s and
+ the class is itself a record of the test case.
+
+ Attributes:
+ test_case_name: The test case name.
"""
test_case_name: str
def __init__(self, test_case_name: str):
+ """Extend the constructor with `test_case_name`.
+
+ Args:
+ test_case_name: The test case's name.
+ """
super(TestCaseResult, self).__init__()
self.test_case_name = test_case_name
def update(self, result: Result, error: Exception | None = None) -> None:
+ """Update the test case result.
+
+ This updates the result of the test case itself and doesn't affect
+ the results of the setup and teardown steps in any way.
+
+ Args:
+ result: The result of the test case.
+ error: The error that occurred in case of a failure.
+ """
self.result = result
self.error = error
@@ -171,38 +261,66 @@ def _get_inner_errors(self) -> list[Exception]:
return []
def add_stats(self, statistics: Statistics) -> None:
+ r"""Add the test case result to statistics.
+
+ The base method goes through the hierarchy recursively and this method is here to stop
+ the recursion, as the :class:`TestCaseResult`\s are the leaves of the hierarchy tree.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be added.
+ """
statistics += self.result
def __bool__(self) -> bool:
+ """The test case passed only if setup, teardown and the test case itself passed."""
return (
bool(self.setup_result) and bool(self.teardown_result) and bool(self.result)
)
class TestSuiteResult(BaseResult):
- """
- The test suite specific result.
- The _inner_results list stores results of test cases in a given test suite.
- Also stores the test suite name.
+ """The test suite specific result.
+
+ The internal list stores the results of all test cases in a given test suite.
+
+ Attributes:
+ suite_name: The test suite name.
"""
suite_name: str
def __init__(self, suite_name: str):
+ """Extend the constructor with `suite_name`.
+
+ Args:
+ suite_name: The test suite's name.
+ """
super(TestSuiteResult, self).__init__()
self.suite_name = suite_name
def add_test_case(self, test_case_name: str) -> TestCaseResult:
+ """Add and return the inner result (test case).
+
+ Returns:
+ The test case's result.
+ """
test_case_result = TestCaseResult(test_case_name)
self._inner_results.append(test_case_result)
return test_case_result
class BuildTargetResult(BaseResult):
- """
- The build target specific result.
- The _inner_results list stores results of test suites in a given build target.
- Also stores build target specifics, such as compiler used to build DPDK.
+ """The build target specific result.
+
+ The internal list stores the results of all test suites in a given build target.
+
+ Attributes:
+ arch: The DPDK build target architecture.
+ os: The DPDK build target operating system.
+ cpu: The DPDK build target CPU.
+ compiler: The DPDK build target compiler.
+ compiler_version: The DPDK build target compiler version.
+ dpdk_version: The built DPDK version.
"""
arch: Architecture
@@ -213,6 +331,11 @@ class BuildTargetResult(BaseResult):
dpdk_version: str | None
def __init__(self, build_target: BuildTargetConfiguration):
+ """Extend the constructor with the `build_target`'s build target config.
+
+ Args:
+ build_target: The build target's test run configuration.
+ """
super(BuildTargetResult, self).__init__()
self.arch = build_target.arch
self.os = build_target.os
@@ -222,20 +345,35 @@ def __init__(self, build_target: BuildTargetConfiguration):
self.dpdk_version = None
def add_build_target_info(self, versions: BuildTargetInfo) -> None:
+ """Add information about the build target gathered at runtime.
+
+ Args:
+ versions: The additional information.
+ """
self.compiler_version = versions.compiler_version
self.dpdk_version = versions.dpdk_version
def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
+ """Add and return the inner result (test suite).
+
+ Returns:
+ The test suite's result.
+ """
test_suite_result = TestSuiteResult(test_suite_name)
self._inner_results.append(test_suite_result)
return test_suite_result
class ExecutionResult(BaseResult):
- """
- The execution specific result.
- The _inner_results list stores results of build targets in a given execution.
- Also stores the SUT node configuration.
+ """The execution specific result.
+
+ The internal list stores the results of all build targets in a given execution.
+
+ Attributes:
+ sut_node: The SUT node used in the execution.
+ sut_os_name: The operating system of the SUT node.
+ sut_os_version: The operating system version of the SUT node.
+ sut_kernel_version: The operating system kernel version of the SUT node.
"""
sut_node: NodeConfiguration
@@ -244,36 +382,55 @@ class ExecutionResult(BaseResult):
sut_kernel_version: str
def __init__(self, sut_node: NodeConfiguration):
+ """Extend the constructor with the `sut_node`'s config.
+
+ Args:
+ sut_node: The SUT node's test run configuration used in the execution.
+ """
super(ExecutionResult, self).__init__()
self.sut_node = sut_node
def add_build_target(
self, build_target: BuildTargetConfiguration
) -> BuildTargetResult:
+ """Add and return the inner result (build target).
+
+ Args:
+ build_target: The build target's test run configuration.
+
+ Returns:
+ The build target's result.
+ """
build_target_result = BuildTargetResult(build_target)
self._inner_results.append(build_target_result)
return build_target_result
def add_sut_info(self, sut_info: NodeInfo) -> None:
+ """Add SUT information gathered at runtime.
+
+ Args:
+ sut_info: The additional SUT node information.
+ """
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
class DTSResult(BaseResult):
- """
- Stores environment information and test results from a DTS run, which are:
- * Execution level information, such as SUT and TG hardware.
- * Build target level information, such as compiler, target OS and cpu.
- * Test suite results.
- * All errors that are caught and recorded during DTS execution.
+ """Stores environment information and test results from a DTS run.
- The information is stored in nested objects.
+ * Execution level information, such as testbed and the test suite list,
+ * Build target level information, such as compiler, target OS and cpu,
+ * Test suite and test case results,
+ * All errors that are caught and recorded during DTS execution.
- The class is capable of computing the return code used to exit DTS with
- from the stored error.
+ The information is stored hierarchically. This is the first level of the hierarchy
+ and as such is where the data from the whole hierarchy is collated or processed.
- It also provides a brief statistical summary of passed/failed test cases.
+ The internal list stores the results of all executions.
+
+ Attributes:
+ dpdk_version: The DPDK version to record.
"""
dpdk_version: str | None
@@ -284,6 +441,11 @@ class DTSResult(BaseResult):
_stats_filename: str
def __init__(self, logger: DTSLOG):
+ """Extend the constructor with top-level specifics.
+
+ Args:
+ logger: The logger instance the whole result will use.
+ """
super(DTSResult, self).__init__()
self.dpdk_version = None
self._logger = logger
@@ -293,21 +455,33 @@ def __init__(self, logger: DTSLOG):
self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
+ """Add and return the inner result (execution).
+
+ Args:
+ sut_node: The SUT node's test run configuration.
+
+ Returns:
+ The execution's result.
+ """
execution_result = ExecutionResult(sut_node)
self._inner_results.append(execution_result)
return execution_result
def add_error(self, error: Exception) -> None:
+ """Record an error that occurred outside any execution.
+
+ Args:
+ error: The exception to record.
+ """
self._errors.append(error)
def process(self) -> None:
- """
- Process the data after a DTS run.
- The data is added to nested objects during runtime and this parent object
- is not updated at that time. This requires us to process the nested data
- after it's all been gathered.
+ """Process the data after a whole DTS run.
+
+ The data is added to inner objects during runtime and this object is not updated
+ at that time. This requires us to process the inner data after it's all been gathered.
- The processing gathers all errors and the result statistics of test cases.
+ The processing gathers all errors and the statistics of test case results.
"""
self._errors += self.get_errors()
if self._errors and self._logger:
@@ -321,8 +495,10 @@ def process(self) -> None:
stats_file.write(str(self._stats_result))
def get_return_code(self) -> int:
- """
- Go through all stored Exceptions and return the highest error code found.
+ """Go through all stored Exceptions and return the final DTS error code.
+
+ Returns:
+ The highest error code found.
"""
for error in self._errors:
error_return_code = ErrorSeverity.GENERIC_ERR
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
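The `get_return_code` logic above, which walks the stored exceptions and returns the highest severity found, can be sketched standalone. The names below are simplified stand-ins, not the real DTS implementation:

```python
from enum import IntEnum


class ErrorSeverity(IntEnum):
    """Simplified stand-in for the DTS error severity enum."""

    NO_ERR = 0
    GENERIC_ERR = 1
    CONFIG_ERR = 2


def get_return_code(errors: list[Exception]) -> int:
    """Return the highest severity found among `errors`.

    Exceptions without a severity attribute fall back to GENERIC_ERR,
    mirroring the behaviour described in DTSResult.get_return_code.
    """
    return_code = ErrorSeverity.NO_ERR
    for error in errors:
        severity = getattr(error, "severity", ErrorSeverity.GENERIC_ERR)
        if severity > return_code:
            return_code = severity
    return int(return_code)
```

A framework exception type would carry its own severity; anything unexpected degrades gracefully to a generic error.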
* [PATCH v5 10/23] dts: config docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (8 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 09/23] dts: test result " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 11/23] dts: remote session " Juraj Linkeš
` (13 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 371 ++++++++++++++++++++++++++-----
dts/framework/config/types.py | 132 +++++++++++
2 files changed, 446 insertions(+), 57 deletions(-)
create mode 100644 dts/framework/config/types.py
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 2044c82611..0aa149a53d 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -3,8 +3,34 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Yaml config parsing methods
+"""Testbed configuration and test suite specification.
+
+This package offers classes that hold real-time information about the testbed, hold test run
+configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
+the YAML test run configuration file
+and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+
+The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
+this package. The allowed keys and types inside this dictionary are defined in
+the :doc:`types <framework.config.types>` module.
+
+The test run configuration has two main sections:
+
+ * The :class:`ExecutionConfiguration` which defines what tests are going to be run
+ and how DPDK will be built. It also references the testbed where these tests and DPDK
+ are going to be run,
+ * The nodes of the testbed are defined in the other section,
+ a :class:`list` of :class:`NodeConfiguration` objects.
+
+The real-time information about the testbed is supposed to be gathered at runtime.
+
+The classes defined in this package make heavy use of :mod:`dataclasses`.
+All of them use slots and are frozen:
+
+ * Slots enable some optimizations by pre-allocating space for the defined
+ attributes in the underlying data structure,
+ * Frozen makes the object immutable. This enables further optimizations,
+ and makes it thread-safe should we ever want to move in that direction.
"""
import json
@@ -12,11 +38,20 @@
import pathlib
from dataclasses import dataclass
from enum import auto, unique
-from typing import Any, TypedDict, Union
+from typing import Union
import warlock # type: ignore[import]
import yaml
+from framework.config.types import (
+ BuildTargetConfigDict,
+ ConfigurationDict,
+ ExecutionConfigDict,
+ NodeConfigDict,
+ PortConfigDict,
+ TestSuiteConfigDict,
+ TrafficGeneratorConfigDict,
+)
from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -24,55 +59,97 @@
@unique
class Architecture(StrEnum):
+ r"""The supported architectures of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
i686 = auto()
+ #:
x86_64 = auto()
+ #:
x86_32 = auto()
+ #:
arm64 = auto()
+ #:
ppc64le = auto()
@unique
class OS(StrEnum):
+ r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
linux = auto()
+ #:
freebsd = auto()
+ #:
windows = auto()
@unique
class CPUType(StrEnum):
+ r"""The supported CPUs of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
native = auto()
+ #:
armv8a = auto()
+ #:
dpaa2 = auto()
+ #:
thunderx = auto()
+ #:
xgene1 = auto()
@unique
class Compiler(StrEnum):
+ r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
gcc = auto()
+ #:
clang = auto()
+ #:
icc = auto()
+ #:
msvc = auto()
@unique
class TrafficGeneratorType(StrEnum):
+ """The supported traffic generators."""
+
+ #:
SCAPY = auto()
-# Slots enables some optimizations, by pre-allocating space for the defined
-# attributes in the underlying data structure.
-#
-# Frozen makes the object immutable. This enables further optimizations,
-# and makes it thread safe should we every want to move in that direction.
@dataclass(slots=True, frozen=True)
class HugepageConfiguration:
+ r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ amount: The number of hugepages.
+ force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
+ """
+
amount: int
force_first_numa: bool
@dataclass(slots=True, frozen=True)
class PortConfig:
+ r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ node: The :class:`~framework.testbed_model.node.Node` where this port exists.
+ pci: The PCI address of the port.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK.
+ os_driver: The operating system driver name when the operating system controls the port.
+ peer_node: The :class:`~framework.testbed_model.node.Node` of the port
+ connected to this port.
+ peer_pci: The PCI address of the port connected to this port.
+ """
+
node: str
pci: str
os_driver_for_dpdk: str
@@ -81,18 +158,44 @@ class PortConfig:
peer_pci: str
@staticmethod
- def from_dict(node: str, d: dict) -> "PortConfig":
+ def from_dict(node: str, d: PortConfigDict) -> "PortConfig":
+ """A convenience method that creates the object from fewer inputs.
+
+ Args:
+ node: The node where this port exists.
+ d: The configuration dictionary.
+
+ Returns:
+ The port configuration instance.
+ """
return PortConfig(node=node, **d)
@dataclass(slots=True, frozen=True)
class TrafficGeneratorConfig:
+ """The configuration of traffic generators.
+
+ The class will be expanded when more configuration is needed.
+
+ Attributes:
+ traffic_generator_type: The type of the traffic generator.
+ """
+
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
- # This looks useless now, but is designed to allow expansion to traffic
- # generators that require more configuration later.
+ def from_dict(d: TrafficGeneratorConfigDict) -> "ScapyTrafficGeneratorConfig":
+ """A convenience method that produces traffic generator config of the proper type.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The traffic generator configuration instance.
+
+ Raises:
+ ConfigurationError: An unknown traffic generator type was encountered.
+ """
match TrafficGeneratorType(d["type"]):
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGeneratorConfig(
@@ -106,11 +209,31 @@ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
@dataclass(slots=True, frozen=True)
class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+ """Scapy traffic generator specific configuration."""
+
pass
@dataclass(slots=True, frozen=True)
class NodeConfiguration:
+ r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ name: The name of the :class:`~framework.testbed_model.node.Node`.
+ hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
+ Can be an IP or a domain name.
+ user: The name of the user used to connect to
+ the :class:`~framework.testbed_model.node.Node`.
+ password: The password of the user. The use of passwords is heavily discouraged.
+ Please use keys instead.
+ arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
+ os: The operating system of the :class:`~framework.testbed_model.node.Node`.
+ lcores: A comma delimited list of logical cores to use when running DPDK.
+ use_first_core: If :data:`True`, the first logical core won't be used.
+ hugepages: An optional hugepage configuration.
+ ports: The ports that can be used in testing.
+ """
+
name: str
hostname: str
user: str
@@ -123,57 +246,91 @@ class NodeConfiguration:
ports: list[PortConfig]
@staticmethod
- def from_dict(d: dict) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
- hugepage_config = d.get("hugepages")
- if hugepage_config:
- if "force_first_numa" not in hugepage_config:
- hugepage_config["force_first_numa"] = False
- hugepage_config = HugepageConfiguration(**hugepage_config)
-
- common_config = {
- "name": d["name"],
- "hostname": d["hostname"],
- "user": d["user"],
- "password": d.get("password"),
- "arch": Architecture(d["arch"]),
- "os": OS(d["os"]),
- "lcores": d.get("lcores", "1"),
- "use_first_core": d.get("use_first_core", False),
- "hugepages": hugepage_config,
- "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
- }
-
+ def from_dict(
+ d: NodeConfigDict,
+ ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
+ """A convenience method that processes the inputs before creating a specialized instance.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ Either an SUT or TG configuration instance.
+ """
+ hugepage_config = None
+ if "hugepages" in d:
+ hugepage_config_dict = d["hugepages"]
+ if "force_first_numa" not in hugepage_config_dict:
+ hugepage_config_dict["force_first_numa"] = False
+ hugepage_config = HugepageConfiguration(**hugepage_config_dict)
+
+ # The calls below duplicate code because Mypy doesn't properly support
+ # dictionary unpacking with TypedDicts
if "traffic_generator" in d:
return TGNodeConfiguration(
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
traffic_generator=TrafficGeneratorConfig.from_dict(
d["traffic_generator"]
),
- **common_config,
)
else:
return SutNodeConfiguration(
- memory_channels=d.get("memory_channels", 1), **common_config
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
+ memory_channels=d.get("memory_channels", 1),
)
@dataclass(slots=True, frozen=True)
class SutNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+
+ Attributes:
+ memory_channels: The number of memory channels to use when running DPDK.
+ """
+
memory_channels: int
@dataclass(slots=True, frozen=True)
class TGNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
+
+ Attributes:
+ traffic_generator: The configuration of the traffic generator present on the TG node.
+ """
+
traffic_generator: ScapyTrafficGeneratorConfig
@dataclass(slots=True, frozen=True)
class NodeInfo:
- """Class to hold important versions within the node.
-
- This class, unlike the NodeConfiguration class, cannot be generated at the start.
- This is because we need to initialize a connection with the node before we can
- collect the information needed in this class. Therefore, it cannot be a part of
- the configuration class above.
+ """Supplemental node information.
+
+ Attributes:
+ os_name: The name of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ os_version: The version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ kernel_version: The kernel version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
"""
os_name: str
@@ -183,6 +340,20 @@ class NodeInfo:
@dataclass(slots=True, frozen=True)
class BuildTargetConfiguration:
+ """DPDK build configuration.
+
+ The configuration used for building DPDK.
+
+ Attributes:
+ arch: The target architecture to build for.
+ os: The target OS to build for.
+ cpu: The target CPU to build for.
+ compiler: The compiler executable to use.
+ compiler_wrapper: This string will be put in front of the compiler when
+ executing the build. Useful for adding wrapper commands, such as ``ccache``.
+ name: The name of the compiler.
+ """
+
arch: Architecture
os: OS
cpu: CPUType
@@ -191,7 +362,18 @@ class BuildTargetConfiguration:
name: str
@staticmethod
- def from_dict(d: dict) -> "BuildTargetConfiguration":
+ def from_dict(d: BuildTargetConfigDict) -> "BuildTargetConfiguration":
+ r"""A convenience method that processes the inputs before creating an instance.
+
+ `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
+ `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The build target configuration instance.
+ """
return BuildTargetConfiguration(
arch=Architecture(d["arch"]),
os=OS(d["os"]),
@@ -204,23 +386,29 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
@dataclass(slots=True, frozen=True)
class BuildTargetInfo:
- """Class to hold important versions within the build target.
+ """Various versions and other information about a build target.
- This is very similar to the NodeInfo class, it just instead holds information
- for the build target.
+ Attributes:
+ dpdk_version: The DPDK version that was built.
+ compiler_version: The version of the compiler used to build DPDK.
"""
dpdk_version: str
compiler_version: str
-class TestSuiteConfigDict(TypedDict):
- suite: str
- cases: list[str]
-
-
@dataclass(slots=True, frozen=True)
class TestSuiteConfig:
+ """Test suite configuration.
+
+ Information about a single test suite to be executed.
+
+ Attributes:
+ test_suite: The name of the test suite module without the starting ``TestSuite_``.
+ test_cases: The names of test cases from this test suite to execute.
+ If empty, all test cases will be executed.
+ """
+
test_suite: str
test_cases: list[str]
@@ -228,6 +416,14 @@ class TestSuiteConfig:
def from_dict(
entry: str | TestSuiteConfigDict,
) -> "TestSuiteConfig":
+ """Create an instance from two different types.
+
+ Args:
+ entry: Either a suite name or a dictionary containing the config.
+
+ Returns:
+ The test suite configuration instance.
+ """
if isinstance(entry, str):
return TestSuiteConfig(test_suite=entry, test_cases=[])
elif isinstance(entry, dict):
@@ -238,19 +434,49 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class ExecutionConfiguration:
+ """The configuration of an execution.
+
+ The configuration contains testbed information, what tests to execute
+ and with what DPDK build.
+
+ Attributes:
+ build_targets: A list of DPDK builds to test.
+ perf: Whether to run performance tests.
+ func: Whether to run functional tests.
+ skip_smoke_tests: Whether to skip smoke tests.
+ test_suites: The names of test suites and/or test cases to execute.
+ system_under_test_node: The SUT node to use in this execution.
+ traffic_generator_node: The TG node to use in this execution.
+ vdevs: The names of virtual devices to test.
+ """
+
build_targets: list[BuildTargetConfiguration]
perf: bool
func: bool
+ skip_smoke_tests: bool
test_suites: list[TestSuiteConfig]
system_under_test_node: SutNodeConfiguration
traffic_generator_node: TGNodeConfiguration
vdevs: list[str]
- skip_smoke_tests: bool
@staticmethod
def from_dict(
- d: dict, node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]]
+ d: ExecutionConfigDict,
+ node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]],
) -> "ExecutionConfiguration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The build target and test suite configs are transformed into their respective objects.
+ SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
+ are just stored.
+
+ Args:
+ d: The configuration dictionary.
+ node_map: A dictionary mapping node names to their config objects.
+
+ Returns:
+ The execution configuration instance.
+ """
build_targets: list[BuildTargetConfiguration] = list(
map(BuildTargetConfiguration.from_dict, d["build_targets"])
)
@@ -291,10 +517,31 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class Configuration:
+ """DTS testbed and test configuration.
+
+ The node configuration is not stored in this object. Rather, all used node configurations
+ are stored inside the execution configuration where the nodes are actually used.
+
+ Attributes:
+ executions: Execution configurations.
+ """
+
executions: list[ExecutionConfiguration]
@staticmethod
- def from_dict(d: dict) -> "Configuration":
+ def from_dict(d: ConfigurationDict) -> "Configuration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The nodes are processed first and a mapping of node names to their objects is created.
+ This mapping is then used to process each execution, which draws its SUT and TG
+ configuration from it.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The whole configuration instance.
+ """
nodes: list[Union[SutNodeConfiguration | TGNodeConfiguration]] = list(
map(NodeConfiguration.from_dict, d["nodes"])
)
@@ -313,9 +560,17 @@ def from_dict(d: dict) -> "Configuration":
def load_config() -> Configuration:
- """
- Loads the configuration file and the configuration file schema,
- validates the configuration file, and creates a configuration object.
+ """Load DTS test run configuration from a file.
+
+ Load the YAML test run configuration file
+ and :download:`the configuration file schema <conf_yaml_schema.json>`,
+ validate the test run configuration file, and create a test run configuration object.
+
+ The YAML test run configuration file is specified in the :option:`--config-file` command line
+ argument or the :envvar:`DTS_CFG_FILE` environment variable.
+
+ Returns:
+ The parsed test run configuration.
"""
with open(SETTINGS.config_file_path, "r") as f:
config_data = yaml.safe_load(f)
@@ -326,6 +581,8 @@ def load_config() -> Configuration:
with open(schema_path, "r") as f:
schema = json.load(f)
- config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
- config_obj: Configuration = Configuration.from_dict(dict(config))
+ config = warlock.model_factory(schema, name="_Config")(config_data)
+ config_obj: Configuration = Configuration.from_dict(
+ dict(config) # type: ignore[arg-type]
+ )
return config_obj
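The load-and-validate flow of `load_config` can be illustrated with a stripped-down stand-in that checks only the required top-level keys; the real code validates the whole document against the JSON schema through warlock:

```python
import json


def validate_config(config_data: dict, schema: dict) -> dict:
    """Check only the schema's required top-level keys.

    A toy stand-in for the warlock-based validation; the real load_config
    validates the full JSON schema.
    """
    for key in schema.get("required", []):
        if key not in config_data:
            raise ValueError(f"missing required key: {key}")
    return config_data


# The schema would normally be read from conf_yaml_schema.json.
schema = json.loads('{"required": ["nodes", "executions"]}')
config = validate_config({"nodes": [], "executions": []}, schema)
assert sorted(config) == ["executions", "nodes"]
```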
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
new file mode 100644
index 0000000000..1927910d88
--- /dev/null
+++ b/dts/framework/config/types.py
@@ -0,0 +1,132 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+"""Configuration dictionary contents specification.
+
+These type definitions serve as documentation of the configuration dictionary contents.
+
+The definitions use the built-in :class:`~typing.TypedDict` construct.
+"""
+
+from typing import TypedDict
+
+
+class PortConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ pci: str
+ #:
+ os_driver_for_dpdk: str
+ #:
+ os_driver: str
+ #:
+ peer_node: str
+ #:
+ peer_pci: str
+
+
+class TrafficGeneratorConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ type: str
+
+
+class HugepageConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ amount: int
+ #:
+ force_first_numa: bool
+
+
+class NodeConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ hugepages: HugepageConfigurationDict
+ #:
+ name: str
+ #:
+ hostname: str
+ #:
+ user: str
+ #:
+ password: str
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ lcores: str
+ #:
+ use_first_core: bool
+ #:
+ ports: list[PortConfigDict]
+ #:
+ memory_channels: int
+ #:
+ traffic_generator: TrafficGeneratorConfigDict
+
+
+class BuildTargetConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ cpu: str
+ #:
+ compiler: str
+ #:
+ compiler_wrapper: str
+
+
+class TestSuiteConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ suite: str
+ #:
+ cases: list[str]
+
+
+class ExecutionSUTConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ node_name: str
+ #:
+ vdevs: list[str]
+
+
+class ExecutionConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ build_targets: list[BuildTargetConfigDict]
+ #:
+ perf: bool
+ #:
+ func: bool
+ #:
+ skip_smoke_tests: bool
+ #:
+ test_suites: list[TestSuiteConfigDict]
+ #:
+ system_under_test_node: ExecutionSUTConfigDict
+ #:
+ traffic_generator_node: str
+
+
+class ConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ nodes: list[NodeConfigDict]
+ #:
+ executions: list[ExecutionConfigDict]
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 11/23] dts: remote session docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (9 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 10/23] dts: config " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 12/23] dts: interactive " Juraj Linkeš
` (12 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/__init__.py | 39 +++++-
.../remote_session/remote_session.py | 128 +++++++++++++-----
dts/framework/remote_session/ssh_session.py | 16 +--
3 files changed, 135 insertions(+), 48 deletions(-)
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 5e7ddb2b05..51a01d6b5e 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -2,12 +2,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-The package provides modules for managing remote connections to a remote host (node),
-differentiated by OS.
-The package provides a factory function, create_session, that returns the appropriate
-remote connection based on the passed configuration. The differences are in the
-underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
+"""Remote interactive and non-interactive sessions.
+
+This package provides modules for managing remote connections to a remote host (node).
+
+The non-interactive sessions send commands and return their output and exit code.
+
+The interactive sessions open an interactive shell which is continuously open,
+allowing data to be sent and received within that particular shell.
"""
# pylama:ignore=W0611
@@ -26,10 +28,35 @@
def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> RemoteSession:
+ """Factory for non-interactive remote sessions.
+
+ The function returns an SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The SSH remote session.
+ """
return SSHSession(node_config, name, logger)
def create_interactive_session(
node_config: NodeConfiguration, logger: DTSLOG
) -> InteractiveRemoteSession:
+ """Factory for interactive remote sessions.
+
+ The function returns an interactive SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The interactive SSH remote session.
+ """
return InteractiveRemoteSession(node_config, logger)
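Both factories above share the same shape; a minimal sketch of the pattern follows, with a placeholder session class and a hypothetical `protocol` parameter that the real factories don't have:

```python
from dataclasses import dataclass


@dataclass
class SSHSession:
    """Placeholder for the real SSH session class."""

    hostname: str
    name: str


def create_remote_session(
    hostname: str, name: str, protocol: str = "ssh"
) -> SSHSession:
    """Return a session for the given transport; only SSH exists for now."""
    if protocol == "ssh":
        return SSHSession(hostname, name)
    raise ValueError(f"unsupported protocol: {protocol}")


session = create_remote_session("10.0.0.1", "main_session")
assert session.name == "main_session"
```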
diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
index 0647d93de4..629c2d7b9c 100644
--- a/dts/framework/remote_session/remote_session.py
+++ b/dts/framework/remote_session/remote_session.py
@@ -3,6 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Base remote session.
+
+This module contains the abstract base class for remote sessions and defines
+the structure of the result of a command execution.
+"""
+
+
import dataclasses
from abc import ABC, abstractmethod
from pathlib import PurePath
@@ -15,8 +22,14 @@
@dataclasses.dataclass(slots=True, frozen=True)
class CommandResult:
- """
- The result of remote execution of a command.
+ """The result of remote execution of a command.
+
+ Attributes:
+ name: The name of the session that executed the command.
+ command: The executed command.
+ stdout: The standard output the command produced.
+ stderr: The standard error output the command produced.
+ return_code: The return code the command exited with.
"""
name: str
@@ -26,6 +39,7 @@ class CommandResult:
return_code: int
def __str__(self) -> str:
+ """Format the command outputs."""
return (
f"stdout: '{self.stdout}'\n"
f"stderr: '{self.stderr}'\n"
@@ -34,13 +48,24 @@ def __str__(self) -> str:
class RemoteSession(ABC):
- """
- The base class for defining which methods must be implemented in order to connect
- to a remote host (node) and maintain a remote session. The derived classes are
- supposed to implement/use some underlying transport protocol (e.g. SSH) to
- implement the methods. On top of that, it provides some basic services common to
- all derived classes, such as keeping history and logging what's being executed
- on the remote node.
+ """Non-interactive remote session.
+
+ The abstract methods must be implemented in order to connect to a remote host (node)
+ and maintain a remote session.
+ The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
+ to implement the methods. On top of that, it provides some basic services common to all
+ subclasses, such as keeping history and logging what's being executed on the remote node.
+
+ Attributes:
+ name: The name of the session.
+ hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
+ or a domain name.
+ ip: The IP address of the node or a domain name, whichever was used in `hostname`.
+ port: The port of the node, if given in `hostname`.
+ username: The username used in the connection.
+ password: The password used in the connection. Most frequently empty,
+ as the use of passwords is discouraged.
+ history: The executed commands during this session.
"""
name: str
@@ -59,6 +84,16 @@ def __init__(
session_name: str,
logger: DTSLOG,
):
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ session_name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Raises:
+ SSHConnectionError: If the connection to the node was not successful.
+ """
self._node_config = node_config
self.name = session_name
@@ -79,8 +114,13 @@ def __init__(
@abstractmethod
def _connect(self) -> None:
- """
- Create connection to assigned node.
+ """Create a connection to the node.
+
+ The implementation must assign the established session to self.session.
+
+ The implementation must catch all exceptions and convert them to an SSHConnectionError.
+
+ The implementation may optionally implement retry attempts.
"""
def send_command(
@@ -90,11 +130,24 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- Send a command to the connected node using optional env vars
- and return CommandResult.
- If verify is True, check the return code of the executed command
- and raise a RemoteCommandExecutionError if the command failed.
+ """Send `command` to the connected node.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds to execute `command`.
+ verify: If :data:`True`, will check the exit code of `command`.
+ env: A dictionary with environment variables to be used with `command` execution.
+
+ Raises:
+ SSHSessionDeadError: If the session isn't alive when sending `command`.
+ SSHTimeoutError: If `command` execution timed out.
+ RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
+
+ Returns:
+ The output of the command along with the return code.
"""
self._logger.info(
f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else "")
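The `verify` semantics documented for `send_command` can be sketched standalone; the snippet below runs a local subprocess in place of a remote session, and the names are illustrative, not the DTS API:

```python
import subprocess
import sys


class RemoteCommandExecutionError(Exception):
    """Raised when a verified command exits with a non-zero return code."""


def send_command(
    command: list[str], timeout: float = 15, verify: bool = False
) -> tuple[str, str, int]:
    """Run `command` locally as a stand-in for remote execution."""
    result = subprocess.run(
        command, capture_output=True, text=True, timeout=timeout
    )
    if verify and result.returncode != 0:
        raise RemoteCommandExecutionError(
            f"{command} failed with return code {result.returncode}"
        )
    return result.stdout, result.stderr, result.returncode


stdout, _, code = send_command([sys.executable, "-c", "print('hello')"])
assert stdout.strip() == "hello" and code == 0
```

Without `verify`, a failing command simply reports its return code; with `verify`, the failure is turned into an exception the caller must handle.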
@@ -115,29 +168,36 @@ def send_command(
def _send_command(
self, command: str, timeout: float, env: dict | None
) -> CommandResult:
- """
- Use the underlying protocol to execute the command using optional env vars
- and return CommandResult.
+ """Send a command to the connected node.
+
+ The implementation must execute the command remotely with `env` environment variables
+ and return the result.
+
+ The implementation must catch all exceptions and raise an SSHSessionDeadError if
+ the session is not alive and an SSHTimeoutError if the command execution times out.
"""
def close(self, force: bool = False) -> None:
- """
- Close the remote session and free all used resources.
+ """Close the remote session and free all used resources.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
"""
self._logger.logger_exit()
self._close(force)
@abstractmethod
def _close(self, force: bool = False) -> None:
- """
- Execute protocol specific steps needed to close the session properly.
+ """Protocol specific steps needed to close the session properly.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
+ This doesn't have to be implemented in the overloaded method.
"""
@abstractmethod
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the remote session is still responding."""
@abstractmethod
def copy_from(
@@ -147,12 +207,12 @@ def copy_from(
) -> None:
"""Copy a file from the remote Node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote Node associated with this remote session
+ to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
- destination_file: a file or directory path on the local filesystem.
+ source_file: The file on the remote Node.
+ destination_file: A file or directory path on the local filesystem.
"""
@abstractmethod
@@ -163,10 +223,10 @@ def copy_to(
) -> None:
"""Copy a file from local filesystem to the remote Node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file` on the remote Node
+ associated with this remote session.
Args:
- source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ source_file: The file on the local filesystem.
+ destination_file: A file or directory path on the remote Node.
"""
diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
index cee11d14d6..7186490a9a 100644
--- a/dts/framework/remote_session/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -1,6 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""SSH remote session."""
+
import socket
import traceback
from pathlib import PurePath
@@ -26,13 +28,8 @@
class SSHSession(RemoteSession):
"""A persistent SSH connection to a remote Node.
- The connection is implemented with the Fabric Python library.
-
- Args:
- node_config: The configuration of the Node to connect to.
- session_name: The name of the session.
- logger: The logger used for logging.
- This should be passed from the parent OSSession.
+ The connection is implemented with
+ `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
Attributes:
session: The underlying Fabric SSH connection.
@@ -80,6 +77,7 @@ def _connect(self) -> None:
raise SSHConnectionError(self.hostname, errors)
def is_alive(self) -> bool:
+ """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
return self.session.is_connected
def _send_command(
@@ -89,7 +87,7 @@ def _send_command(
Args:
command: The command to execute.
- timeout: Wait at most this many seconds for the execution to complete.
+ timeout: Wait at most this long in seconds to execute the command.
env: Extra environment variables that will be used in command execution.
Raises:
@@ -118,6 +116,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
self.session.get(str(source_file), str(destination_file))
def copy_to(
@@ -125,6 +124,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
self.session.put(str(source_file), str(destination_file))
def _close(self, force: bool = False) -> None:
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
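The contract spelled out in the docstrings above — `_connect` must assign the established session to `self.session` and convert every failure to an `SSHConnectionError` — can be sketched as a minimal standalone subclass. This is an illustrative sketch only: `EchoSession` and its fake connection object are hypothetical stand-ins, while the exception and method names mirror the patch.

```python
from abc import ABC, abstractmethod


class SSHConnectionError(Exception):
    """Raised when the connection to the node was not successful."""


class RemoteSession(ABC):
    """Minimal sketch of the abstract session contract from the patch."""

    def __init__(self, hostname: str) -> None:
        self.hostname = hostname
        self.session = None
        self._connect()

    @abstractmethod
    def _connect(self) -> None:
        """Must assign self.session; must convert all errors to SSHConnectionError."""


class EchoSession(RemoteSession):
    """Hypothetical local stand-in for an SSH-backed implementation."""

    def _connect(self) -> None:
        try:
            # A real implementation would open an SSH connection here.
            self.session = object()
        except Exception as e:  # catch everything, per the contract
            raise SSHConnectionError(f"{self.hostname}: {e}") from e


session = EchoSession("sut1.example.com")
print(session.session is not None)
```

A subclass that forgets to assign `self.session`, or lets a library-specific exception escape `_connect`, breaks callers that only know the abstract interface — which is why the patch states both requirements explicitly.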
* [PATCH v5 12/23] dts: interactive remote session docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (10 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 11/23] dts: remote session " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 13/23] dts: port and virtual device " Juraj Linkeš
` (11 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../interactive_remote_session.py | 36 +++----
.../remote_session/interactive_shell.py | 99 +++++++++++--------
dts/framework/remote_session/python_shell.py | 26 ++++-
dts/framework/remote_session/testpmd_shell.py | 61 +++++++++---
4 files changed, 150 insertions(+), 72 deletions(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 9085a668e8..c1bf30ac61 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -22,27 +22,23 @@
class InteractiveRemoteSession:
"""SSH connection dedicated to interactive applications.
- This connection is created using paramiko and is a persistent connection to the
- host. This class defines methods for connecting to the node and configures this
- connection to send "keep alive" packets every 30 seconds. Because paramiko attempts
- to use SSH keys to establish a connection first, providing a password is optional.
- This session is utilized by InteractiveShells and cannot be interacted with
- directly.
-
- Arguments:
- node_config: Configuration class for the node you are connecting to.
- _logger: Desired logger for this session to use.
+ The connection is created using `paramiko <https://docs.paramiko.org/en/latest/>`_
+ and is a persistent connection to the host. This class defines the methods for connecting
+ to the node and configures the connection to send "keep alive" packets every 30 seconds.
+ Because paramiko attempts to use SSH keys to establish a connection first, providing
+ a password is optional. This session is utilized by InteractiveShells
+ and cannot be interacted with directly.
Attributes:
- hostname: Hostname that will be used to initialize a connection to the node.
- ip: A subsection of hostname that removes the port for the connection if there
+ hostname: The hostname that will be used to initialize a connection to the node.
+ ip: A subsection of `hostname` that removes the port for the connection if there
is one. If there is no port, this will be the same as hostname.
- port: Port to use for the ssh connection. This will be extracted from the
- hostname if there is a port included, otherwise it will default to 22.
+ port: Port to use for the ssh connection. This will be extracted from `hostname`
+ if there is a port included, otherwise it will default to ``22``.
username: User to connect to the node with.
password: Password of the user connecting to the host. This will default to an
empty string if a password is not provided.
- session: Underlying paramiko connection.
+ session: The underlying paramiko connection.
Raises:
SSHConnectionError: There is an error creating the SSH connection.
@@ -58,9 +54,15 @@ class InteractiveRemoteSession:
_node_config: NodeConfiguration
_transport: Transport | None
- def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
+ def __init__(self, node_config: NodeConfiguration, logger: DTSLOG) -> None:
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+ """
self._node_config = node_config
- self._logger = _logger
+ self._logger = logger
self.hostname = node_config.hostname
self.username = node_config.user
self.password = node_config.password if node_config.password else ""
diff --git a/dts/framework/remote_session/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
index c24376b2a8..a98a822e91 100644
--- a/dts/framework/remote_session/interactive_shell.py
+++ b/dts/framework/remote_session/interactive_shell.py
@@ -3,18 +3,20 @@
"""Common functionality for interactive shell handling.
-This base class, InteractiveShell, is meant to be extended by other classes that
-contain functionality specific to that shell type. These derived classes will often
-modify things like the prompt to expect or the arguments to pass into the application,
-but still utilize the same method for sending a command and collecting output. How
-this output is handled however is often application specific. If an application needs
-elevated privileges to start it is expected that the method for gaining those
-privileges is provided when initializing the class.
+The base class, :class:`InteractiveShell`, is meant to be extended by subclasses that contain
+functionality specific to that shell type. These subclasses will often modify things like
+the prompt to expect or the arguments to pass into the application, but still utilize
+the same method for sending a command and collecting output. How this output is handled however
+is often application-specific. If an application needs elevated privileges to start, it is expected
+that the method for gaining those privileges is provided when initializing the class.
+
+The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+environment variable configure the timeout of getting the output from command execution.
"""
from abc import ABC
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from paramiko import Channel, SSHClient, channel # type: ignore[import]
@@ -30,28 +32,6 @@ class InteractiveShell(ABC):
and collecting input until reaching a certain prompt. All interactive applications
will use the same SSH connection, but each will create their own channel on that
session.
-
- Arguments:
- interactive_session: The SSH session dedicated to interactive shells.
- logger: Logger used for displaying information in the console.
- get_privileged_command: Method for modifying a command to allow it to use
- elevated privileges. If this is None, the application will not be started
- with elevated privileges.
- app_args: Command line arguments to be passed to the application on startup.
- timeout: Timeout used for the SSH channel that is dedicated to this interactive
- shell. This timeout is for collecting output, so if reading from the buffer
- and no output is gathered within the timeout, an exception is thrown.
-
- Attributes
- _default_prompt: Prompt to expect at the end of output when sending a command.
- This is often overridden by derived classes.
- _command_extra_chars: Extra characters to add to the end of every command
- before sending them. This is often overridden by derived classes and is
- most commonly an additional newline character.
- path: Path to the executable to start the interactive application.
- dpdk_app: Whether this application is a DPDK app. If it is, the build
- directory for DPDK on the node will be prepended to the path to the
- executable.
"""
_interactive_session: SSHClient
@@ -61,10 +41,22 @@ class InteractiveShell(ABC):
_logger: DTSLOG
_timeout: float
_app_args: str
- _default_prompt: str = ""
- _command_extra_chars: str = ""
- path: PurePath
- dpdk_app: bool = False
+
+ #: Prompt to expect at the end of output when sending a command.
+ #: This is often overridden by subclasses.
+ _default_prompt: ClassVar[str] = ""
+
+ #: Extra characters to add to the end of every command
+ #: before sending them. This is often overridden by subclasses and is
+ #: most commonly an additional newline character.
+ _command_extra_chars: ClassVar[str] = ""
+
+ #: Path to the executable to start the interactive application.
+ path: ClassVar[PurePath]
+
+ #: Whether this application is a DPDK app. If it is, the build directory
+ #: for DPDK on the node will be prepended to the path to the executable.
+ dpdk_app: ClassVar[bool] = False
def __init__(
self,
@@ -74,6 +66,19 @@ def __init__(
app_args: str = "",
timeout: float = SETTINGS.timeout,
) -> None:
+ """Create an SSH channel during initialization.
+
+ Args:
+ interactive_session: The SSH session dedicated to interactive shells.
+ logger: The logger instance this session will use.
+ get_privileged_command: A method for modifying a command to allow it to use
+ elevated privileges. If :data:`None`, the application will not be started
+ with elevated privileges.
+ app_args: The command line arguments to be passed to the application on startup.
+ timeout: The timeout used for the SSH channel that is dedicated to this interactive
+ shell. This timeout is for collecting output, so if reading from the buffer
+ and no output is gathered within the timeout, an exception is raised.
+ """
self._interactive_session = interactive_session
self._ssh_channel = self._interactive_session.invoke_shell()
self._stdin = self._ssh_channel.makefile_stdin("w")
@@ -92,6 +97,10 @@ def _start_application(
This method is often overridden by subclasses as their process for
starting may look different.
+
+ Args:
+ get_privileged_command: A function (but could be any callable) that produces
+ the version of the command with elevated privileges.
"""
start_command = f"{self.path} {self._app_args}"
if get_privileged_command is not None:
@@ -99,16 +108,24 @@ def _start_application(
self.send_command(start_command)
def send_command(self, command: str, prompt: str | None = None) -> str:
- """Send a command and get all output before the expected ending string.
+ """Send `command` and get all output before the expected ending string.
Lines that expect input are not included in the stdout buffer, so they cannot
- be used for expect. For example, if you were prompted to log into something
- with a username and password, you cannot expect "username:" because it won't
- yet be in the stdout buffer. A workaround for this could be consuming an
- extra newline character to force the current prompt into the stdout buffer.
+ be used for expect.
+
+ Example:
+ If you were prompted to log into something with a username and password,
+ you cannot expect ``username:`` because it won't yet be in the stdout buffer.
+ A workaround for this could be consuming an extra newline character to force
+ the current `prompt` into the stdout buffer.
+
+ Args:
+ command: The command to send.
+ prompt: After sending the command, `send_command` will be expecting this string.
+ If :data:`None`, will use the class's default prompt.
Returns:
- All output in the buffer before expected string
+ All output in the buffer before the expected string.
"""
self._logger.info(f"Sending: '{command}'")
if prompt is None:
@@ -126,8 +143,10 @@ def send_command(self, command: str, prompt: str | None = None) -> str:
return out
def close(self) -> None:
+ """Properly free all resources."""
self._stdin.close()
self._ssh_channel.close()
def __del__(self) -> None:
+ """Make sure the session is properly closed before deleting the object."""
self.close()
diff --git a/dts/framework/remote_session/python_shell.py b/dts/framework/remote_session/python_shell.py
index cc3ad48a68..c8e5957ef7 100644
--- a/dts/framework/remote_session/python_shell.py
+++ b/dts/framework/remote_session/python_shell.py
@@ -1,12 +1,32 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Python interactive shell.
+
+Typical usage example in a TestSuite::
+
+ from framework.remote_session import PythonShell
+ python_shell = self.tg_node.create_interactive_shell(
+ PythonShell, timeout=5, privileged=True
+ )
+ python_shell.send_command("print('Hello World')")
+ python_shell.close()
+"""
+
from pathlib import PurePath
+from typing import ClassVar
from .interactive_shell import InteractiveShell
class PythonShell(InteractiveShell):
- _default_prompt: str = ">>>"
- _command_extra_chars: str = "\n"
- path: PurePath = PurePath("python3")
+ """Python interactive shell."""
+
+ #: Python's prompt.
+ _default_prompt: ClassVar[str] = ">>>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
+
+ #: The Python executable.
+ path: ClassVar[PurePath] = PurePath("python3")
diff --git a/dts/framework/remote_session/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
index 1455b5a199..2632515d74 100644
--- a/dts/framework/remote_session/testpmd_shell.py
+++ b/dts/framework/remote_session/testpmd_shell.py
@@ -1,45 +1,82 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Testpmd interactive shell.
+
+Typical usage example in a TestSuite::
+
+ testpmd_shell = self.sut_node.create_interactive_shell(
+ TestPmdShell, privileged=True
+ )
+ devices = testpmd_shell.get_devices()
+ for device in devices:
+ print(device)
+ testpmd_shell.close()
+"""
+
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from .interactive_shell import InteractiveShell
class TestPmdDevice(object):
+ """The data of a device that testpmd can recognize.
+
+ Attributes:
+ pci_address: The PCI address of the device.
+ """
+
pci_address: str
def __init__(self, pci_address_line: str):
+ """Initialize the device from the testpmd output line string.
+
+ Args:
+ pci_address_line: A line of testpmd output that contains a device.
+ """
self.pci_address = pci_address_line.strip().split(": ")[1].strip()
def __str__(self) -> str:
+ """The PCI address captures what the device is."""
return self.pci_address
class TestPmdShell(InteractiveShell):
- path: PurePath = PurePath("app", "dpdk-testpmd")
- dpdk_app: bool = True
- _default_prompt: str = "testpmd>"
- _command_extra_chars: str = (
- "\n" # We want to append an extra newline to every command
- )
+ """Testpmd interactive shell.
+
+ Users of the testpmd shell should never use
+ the :meth:`~framework.remote_session.interactive_shell.InteractiveShell.send_command` method
+ directly, but rather call specialized methods. If there isn't one that satisfies a need,
+ it should be added.
+ """
+
+ #: The path to the testpmd executable.
+ path: ClassVar[PurePath] = PurePath("app", "dpdk-testpmd")
+
+ #: Flag this as a DPDK app so that it's clear this is not a system app and
+ #: needs to be looked up in a specific path.
+ dpdk_app: ClassVar[bool] = True
+
+ #: The testpmd's prompt.
+ _default_prompt: ClassVar[str] = "testpmd>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
def _start_application(
self, get_privileged_command: Callable[[str], str] | None
) -> None:
- """See "_start_application" in InteractiveShell."""
self._app_args += " -- -i"
super()._start_application(get_privileged_command)
def get_devices(self) -> list[TestPmdDevice]:
- """Get a list of device names that are known to testpmd
+ """Get a list of device names that are known to testpmd.
- Uses the device info listed in testpmd and then parses the output to
- return only the names of the devices.
+ Uses the device info listed in testpmd and then parses the output.
Returns:
- A list of strings representing device names (e.g. 0000:14:00.1)
+ A list of devices.
"""
dev_info: str = self.send_command("show device info all")
dev_list: list[TestPmdDevice] = []
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
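The `ClassVar` pattern these hunks introduce — each shell subclass overriding `_default_prompt`, `_command_extra_chars` and `path` while reusing the base class's send logic — can be illustrated with a simplified, standalone stand-in. This is not the framework code: the `format_command` helper below is hypothetical, standing in for the real write to the SSH channel.

```python
from pathlib import PurePath
from typing import ClassVar


class InteractiveShell:
    """Simplified stand-in showing the class-level knobs subclasses override."""

    #: Prompt to expect at the end of output; overridden by subclasses.
    _default_prompt: ClassVar[str] = ""
    #: Extra characters appended to every command before sending it.
    _command_extra_chars: ClassVar[str] = ""
    #: Path to the executable to start the interactive application.
    path: ClassVar[PurePath]

    def format_command(self, command: str) -> str:
        # What would be written to the channel's stdin in the real class.
        return f"{command}{self._command_extra_chars}"


class PythonShell(InteractiveShell):
    """Mirrors the patch: override only the class-level configuration."""

    _default_prompt = ">>>"
    _command_extra_chars = "\n"  # forces the prompt to appear after a command
    path = PurePath("python3")


shell = PythonShell()
print(repr(shell.format_command("print('Hello World')")))
```

Marking these attributes `ClassVar` (rather than plain annotations on a non-dataclass) documents that they are per-shell-type configuration, not per-instance state, which is also why Sphinx renders them as class attributes in the generated docs.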
* [PATCH v5 13/23] dts: port and virtual device docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (11 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 12/23] dts: interactive " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 14/23] dts: cpu " Juraj Linkeš
` (10 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/__init__.py | 16 ++++--
dts/framework/testbed_model/port.py | 53 +++++++++++++++----
dts/framework/testbed_model/virtual_device.py | 17 +++++-
3 files changed, 71 insertions(+), 15 deletions(-)
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 8ced05653b..a02be1f2d9 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -2,9 +2,19 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-This package contains the classes used to model the physical traffic generator,
-system under test and any other components that need to be interacted with.
+"""Testbed modelling.
+
+This package defines the testbed elements DTS works with:
+
+ * A system under test node: :class:`SutNode`,
+ * A traffic generator node: :class:`TGNode`,
+ * The ports of network interface cards (NICs) present on nodes: :class:`Port`,
+ * The logical cores of CPUs present on nodes: :class:`LogicalCore`,
+ * The virtual devices that can be created on nodes: :class:`VirtualDevice`,
+ * The operating systems running on nodes: :class:`LinuxSession` and :class:`PosixSession`.
+
+DTS needs to be able to connect to nodes and understand some of the hardware present on these nodes
+to properly build and test DPDK.
"""
# pylama:ignore=W0611
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 680c29bfe3..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -2,6 +2,13 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""NIC port model.
+
+Basic port information, such as location (the ports are identified by their PCI address on a node),
+drivers and address.
+"""
+
+
from dataclasses import dataclass
from framework.config import PortConfig
@@ -9,24 +16,35 @@
@dataclass(slots=True, frozen=True)
class PortIdentifier:
+ """The port identifier.
+
+ Attributes:
+ node: The node where the port resides.
+ pci: The PCI address of the port on `node`.
+ """
+
node: str
pci: str
@dataclass(slots=True)
class Port:
- """
- identifier: The PCI address of the port on a node.
-
- os_driver: The driver used by this port when the OS is controlling it.
- Example: i40e
- os_driver_for_dpdk: The driver the device must be bound to for DPDK to use it,
- Example: vfio-pci.
+ """Physical port on a node.
- Note: os_driver and os_driver_for_dpdk may be the same thing.
- Example: mlx5_core
+ The ports are identified by the node they're on and their PCI addresses. The port on the other
+ side of the connection is also captured here.
+ Each port is serviced by a driver, which may be different for the operating system (`os_driver`)
+ and for DPDK (`os_driver_for_dpdk`). For some devices, they are the same, e.g.: ``mlx5_core``.
- peer: The identifier of a port this port is connected with.
+ Attributes:
+ identifier: The PCI address of the port on a node.
+ os_driver: The operating system driver name when the operating system controls the port,
+ e.g.: ``i40e``.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK, e.g.: ``vfio-pci``.
+ peer: The identifier of a port this port is connected with.
+ The `peer` is on a different node.
+ mac_address: The MAC address of the port.
+ logical_name: The logical name of the port. Must be discovered.
"""
identifier: PortIdentifier
@@ -37,6 +55,12 @@ class Port:
logical_name: str = ""
def __init__(self, node_name: str, config: PortConfig):
+ """Initialize the port from `node_name` and `config`.
+
+ Args:
+ node_name: The name of the port's node.
+ config: The test run configuration of the port.
+ """
self.identifier = PortIdentifier(
node=node_name,
pci=config.pci,
@@ -47,14 +71,23 @@ def __init__(self, node_name: str, config: PortConfig):
@property
def node(self) -> str:
+ """The node where the port resides."""
return self.identifier.node
@property
def pci(self) -> str:
+ """The PCI address of the port."""
return self.identifier.pci
@dataclass(slots=True, frozen=True)
class PortLink:
+ """The physical, cabled connection between the ports.
+
+ Attributes:
+ sut_port: The port on the SUT node connected to `tg_port`.
+ tg_port: The port on the TG node connected to `sut_port`.
+ """
+
sut_port: Port
tg_port: Port
diff --git a/dts/framework/testbed_model/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
index eb664d9f17..e9b5e9c3be 100644
--- a/dts/framework/testbed_model/virtual_device.py
+++ b/dts/framework/testbed_model/virtual_device.py
@@ -1,16 +1,29 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Virtual devices model.
+
+Alongside support for physical hardware, DPDK can create various virtual devices.
+"""
+
class VirtualDevice(object):
- """
- Base class for virtual devices used by DPDK.
+ """Base class for virtual devices used by DPDK.
+
+ Attributes:
+ name: The name of the virtual device.
"""
name: str
def __init__(self, name: str):
+ """Initialize the virtual device.
+
+ Args:
+ name: The name of the virtual device.
+ """
self.name = name
def __str__(self) -> str:
+ """This corresponds to the name used for DPDK devices."""
return self.name
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 14/23] dts: cpu docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (12 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 13/23] dts: port and virtual device " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 15/23] dts: os session " Juraj Linkeš
` (9 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/cpu.py | 196 +++++++++++++++++++++--------
1 file changed, 144 insertions(+), 52 deletions(-)
diff --git a/dts/framework/testbed_model/cpu.py b/dts/framework/testbed_model/cpu.py
index 8fe785dfe4..4edeb4a7c2 100644
--- a/dts/framework/testbed_model/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -1,6 +1,22 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""CPU core representation and filtering.
+
+This module provides a unified representation of logical CPU cores along
+with filtering capabilities.
+
+When symmetric multiprocessing (SMP or multithreading) is enabled on a server,
+the physical CPU cores are split into logical CPU cores with different IDs.
+
+:class:`LogicalCoreCountFilter` filters by the number of logical cores. It's possible to specify
+the socket from which to filter the number of logical cores. It's also possible to not use all
+logical CPU cores from each physical core (e.g. only the first logical core of each physical core).
+
+:class:`LogicalCoreListFilter` filters by logical core IDs. This mostly checks that
+the logical cores are actually present on the server.
+"""
+
import dataclasses
from abc import ABC, abstractmethod
from collections.abc import Iterable, ValuesView
@@ -11,9 +27,17 @@
@dataclass(slots=True, frozen=True)
class LogicalCore(object):
- """
- Representation of a CPU core. A physical core is represented in OS
- by multiple logical cores (lcores) if CPU multithreading is enabled.
+ """Representation of a logical CPU core.
+
+ A physical core is represented in OS by multiple logical cores (lcores)
+ if CPU multithreading is enabled. When multithreading is disabled, their IDs are the same.
+
+ Attributes:
+ lcore: The logical core ID of a CPU core. It's the same as `core` with
+ disabled multithreading.
+ core: The physical core ID of a CPU core.
+ socket: The physical socket ID where the CPU resides.
+ node: The NUMA node ID where the CPU resides.
"""
lcore: int
@@ -22,27 +46,36 @@ class LogicalCore(object):
node: int
def __int__(self) -> int:
+ """The CPU is best represented by the logical core, as that's what we configure in EAL."""
return self.lcore
class LogicalCoreList(object):
- """
- Convert these options into a list of logical core ids.
- lcore_list=[LogicalCore1, LogicalCore2] - a list of LogicalCores
- lcore_list=[0,1,2,3] - a list of int indices
- lcore_list=['0','1','2-3'] - a list of str indices; ranges are supported
- lcore_list='0,1,2-3' - a comma delimited str of indices; ranges are supported
-
- The class creates a unified format used across the framework and allows
- the user to use either a str representation (using str(instance) or directly
- in f-strings) or a list representation (by accessing instance.lcore_list).
- Empty lcore_list is allowed.
+ r"""A unified way to store :class:`LogicalCore`\s.
+
+ Create a unified format used across the framework and allow the user to use
+ either a :class:`str` representation (using ``str(instance)`` or directly in f-strings)
+ or a :class:`list` representation (by accessing the `lcore_list` property,
+ which stores logical core IDs).
"""
_lcore_list: list[int]
_lcore_str: str
def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
+ """Process `lcore_list`, then sort.
+
+ There are four supported logical core list formats::
+
+ lcore_list=[LogicalCore1, LogicalCore2] # a list of LogicalCores
+ lcore_list=[0,1,2,3] # a list of int indices
+ lcore_list=['0','1','2-3'] # a list of str indices; ranges are supported
+ lcore_list='0,1,2-3' # a comma delimited str of indices; ranges are supported
+
+ Args:
+ lcore_list: Various ways to represent multiple logical cores.
+ Empty `lcore_list` is allowed.
+ """
self._lcore_list = []
if isinstance(lcore_list, str):
lcore_list = lcore_list.split(",")
@@ -60,6 +93,7 @@ def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
@property
def lcore_list(self) -> list[int]:
+ """The logical core IDs."""
return self._lcore_list
def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
@@ -89,28 +123,30 @@ def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
return formatted_core_list
def __str__(self) -> str:
+ """The consecutive ranges of logical core IDs."""
return self._lcore_str
@dataclasses.dataclass(slots=True, frozen=True)
class LogicalCoreCount(object):
- """
- Define the number of logical cores to use.
- If sockets is not None, socket_count is ignored.
- """
+ """Define the number of logical cores per physical cores per sockets."""
+ #: Use this many logical cores per each physical core.
lcores_per_core: int = 1
+ #: Use this many physical cores per each socket.
cores_per_socket: int = 2
+ #: Use this many sockets.
socket_count: int = 1
+ #: Use exactly these sockets. This takes precedence over `socket_count`,
+ #: so when `sockets` is not :data:`None`, `socket_count` is ignored.
sockets: list[int] | None = None
class LogicalCoreFilter(ABC):
- """
- Filter according to the input filter specifier. Each filter needs to be
- implemented in a derived class.
- This class only implements operations common to all filters, such as sorting
- the list to be filtered beforehand.
+ """Common filtering class.
+
+ Each filter needs to be implemented in a subclass. This base class sorts the list of cores
+ and defines the filtering method, which must be implemented by subclasses.
"""
_filter_specifier: LogicalCoreCount | LogicalCoreList
@@ -122,6 +158,17 @@ def __init__(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
):
+ """Filter according to the input filter specifier.
+
+ The input `lcore_list` is copied and sorted by physical core before filtering.
+ The list is copied so that the original is left intact.
+
+ Args:
+ lcore_list: The logical CPU cores to filter.
+ filter_specifier: Filter cores from `lcore_list` according to this filter.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+ """
self._filter_specifier = filter_specifier
# sorting by core is needed in case hyperthreading is enabled
@@ -132,31 +179,45 @@ def __init__(
@abstractmethod
def filter(self) -> list[LogicalCore]:
- """
- Use self._filter_specifier to filter self._lcores_to_filter
- and return the list of filtered LogicalCores.
- self._lcores_to_filter is a sorted copy of the original list,
- so it may be modified.
+ r"""Filter the cores.
+
+ Use `self._filter_specifier` to filter `self._lcores_to_filter` and return
+ the filtered :class:`LogicalCore`\s.
+ `self._lcores_to_filter` is a sorted copy of the original list, so it may be modified.
+
+ Returns:
+ The filtered cores.
"""
class LogicalCoreCountFilter(LogicalCoreFilter):
- """
+ """Filter cores by specified counts.
+
Filter the input list of LogicalCores according to specified rules:
- Use cores from the specified number of sockets or from the specified socket ids.
- If sockets is specified, it takes precedence over socket_count.
- From each of those sockets, use only cores_per_socket of cores.
- And for each core, use lcores_per_core of logical cores. Hypertheading
- must be enabled for this to take effect.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+
+ * The input `filter_specifier` is :class:`LogicalCoreCount`,
+ * Use cores from the specified number of sockets or from the specified socket ids,
+ * If `sockets` is specified, it takes precedence over `socket_count`,
+ * From each of those sockets, use only `cores_per_socket` of cores,
+ * And for each core, use `lcores_per_core` of logical cores. Hyperthreading
+ must be enabled for this to take effect.
"""
_filter_specifier: LogicalCoreCount
def filter(self) -> list[LogicalCore]:
+ """Filter the cores according to :class:`LogicalCoreCount`.
+
+ Start by filtering the allowed sockets. The cores matching the allowed sockets are kept.
+ The cores of each socket are stored in separate lists.
+
+ Then filter the allowed physical cores from those lists of cores per socket. When filtering
+ physical cores, store the desired number of logical cores per physical core which then
+ together constitute the final filtered list.
+
+ Returns:
+ The filtered cores.
+ """
sockets_to_filter = self._filter_sockets(self._lcores_to_filter)
filtered_lcores = []
for socket_to_filter in sockets_to_filter:
@@ -166,24 +227,37 @@ def filter(self) -> list[LogicalCore]:
def _filter_sockets(
self, lcores_to_filter: Iterable[LogicalCore]
) -> ValuesView[list[LogicalCore]]:
- """
- Remove all lcores that don't match the specified socket(s).
- If self._filter_specifier.sockets is not None, keep lcores from those sockets,
- otherwise keep lcores from the first
- self._filter_specifier.socket_count sockets.
+ """Filter a list of cores per each allowed socket.
+
+ The sockets may be specified in two ways, either a number or a specific list of sockets.
+ In case of a specific list, we just need to return the cores from those sockets.
+ If filtering by a number of sockets, we need to go through all cores, note which sockets
+ appear and filter only from the first n sockets that appear.
+
+ Args:
+ lcores_to_filter: The cores to filter. These must be sorted by the physical core.
+
+ Returns:
+ A list of lists of logical CPU cores. Each list contains cores from one socket.
"""
allowed_sockets: set[int] = set()
socket_count = self._filter_specifier.socket_count
if self._filter_specifier.sockets:
+ # when sockets in filter is specified, the sockets are already set
socket_count = len(self._filter_specifier.sockets)
allowed_sockets = set(self._filter_specifier.sockets)
+ # filter socket_count sockets from all sockets by checking the socket of each CPU
filtered_lcores: dict[int, list[LogicalCore]] = {}
for lcore in lcores_to_filter:
if not self._filter_specifier.sockets:
+ # this is when sockets is not set, so we do the actual filtering
+ # when it is set, allowed_sockets is already defined and can't be changed
if len(allowed_sockets) < socket_count:
+ # allowed_sockets is a set, so adding an existing socket won't re-add it
allowed_sockets.add(lcore.socket)
if lcore.socket in allowed_sockets:
+ # separate lcores into sockets; this makes it easier in further processing
if lcore.socket in filtered_lcores:
filtered_lcores[lcore.socket].append(lcore)
else:
@@ -200,12 +274,13 @@ def _filter_sockets(
def _filter_cores_from_socket(
self, lcores_to_filter: Iterable[LogicalCore]
) -> list[LogicalCore]:
- """
- Keep only the first self._filter_specifier.cores_per_socket cores.
- In multithreaded environments, keep only
- the first self._filter_specifier.lcores_per_core lcores of those cores.
- """
+ """Filter a list of cores from the given socket.
+
+ Go through the cores and note how many logical cores per physical core have been filtered.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
# no need to use ordered dict, from Python3.7 the dict
# insertion order is preserved (LIFO).
lcore_count_per_core_map: dict[int, int] = {}
@@ -248,15 +323,21 @@ def _filter_cores_from_socket(
class LogicalCoreListFilter(LogicalCoreFilter):
- """
- Filter the input list of Logical Cores according to the input list of
- lcore indices.
- An empty LogicalCoreList won't filter anything.
+ """Filter the logical CPU cores by logical CPU core IDs.
+
+ This is a simple filter that looks at logical CPU IDs and only keeps those that match.
+
+ The input filter is :class:`LogicalCoreList`. An empty :class:`LogicalCoreList` won't filter anything.
"""
_filter_specifier: LogicalCoreList
def filter(self) -> list[LogicalCore]:
+ """Filter based on logical CPU core ID.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
if not len(self._filter_specifier.lcore_list):
return self._lcores_to_filter
@@ -279,6 +360,17 @@ def lcore_filter(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool,
) -> LogicalCoreFilter:
+ """Factory for using the right filter with `filter_specifier`.
+
+ Args:
+ core_list: The logical CPU cores to filter.
+ filter_specifier: The filter to use.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+
+ Returns:
+ The filter matching `filter_specifier`.
+ """
if isinstance(filter_specifier, LogicalCoreList):
return LogicalCoreListFilter(core_list, filter_specifier, ascending)
elif isinstance(filter_specifier, LogicalCoreCount):
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
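The four lcore list formats documented in the patch above can be sketched with a small standalone parser. This is a hypothetical illustration, not the DTS code itself: `parse_lcore_list` is an assumed name, and it only shows how a comma-delimited string or a list of int/str indices with ``2-3`` style ranges could be normalized into sorted logical core IDs, as the `LogicalCoreList` docstring describes.

```python
def parse_lcore_list(lcore_list):
    """Normalize the documented lcore list formats into sorted core IDs."""
    if isinstance(lcore_list, str):
        # '0,1,2-3' -> ['0', '1', '2-3']
        lcore_list = lcore_list.split(",")
    ids = set()
    for item in lcore_list:
        item = str(item).strip()
        if "-" in item:
            # expand a range such as '2-3' into 2, 3
            low, high = map(int, item.split("-"))
            ids.update(range(low, high + 1))
        else:
            ids.add(int(item))
    return sorted(ids)

print(parse_lcore_list("0,1,2-3"))          # [0, 1, 2, 3]
print(parse_lcore_list(["0", "1", "2-3"]))  # [0, 1, 2, 3]
print(parse_lcore_list([3, 1, 0]))          # [0, 1, 3]
```

A set is used so overlapping ranges and duplicate indices collapse naturally before the final sort.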
* [PATCH v5 15/23] dts: os session docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (13 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 14/23] dts: cpu " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 16/23] dts: posix and linux sessions " Juraj Linkeš
` (8 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/os_session.py | 275 ++++++++++++++++------
1 file changed, 208 insertions(+), 67 deletions(-)
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 76e595a518..bad75d52e7 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -2,6 +2,29 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""OS-aware remote session.
+
+DPDK supports multiple operating systems. This module defines the common API that OS-unaware
+layers use and translates that API into OS-aware calls/utility usage.
+
+Note:
+ Running commands with administrative privileges requires OS awareness. This is the only layer
+ that's aware of OS differences, so this is where non-privileged commands get converted
+ to privileged commands.
+
+Example:
+ A user wishes to remove a directory on
+ a remote :class:`~framework.testbed_model.sut_node.SutNode`.
+ The :class:`~framework.testbed_model.sut_node.SutNode` object isn't aware what OS the node
+ is running - it delegates the OS translation logic
+ to :attr:`~framework.testbed_model.node.Node.main_session`. The SUT node calls
+ :meth:`~OSSession.remove_remote_dir` with a generic, OS-unaware path and
+ the :attr:`~framework.testbed_model.node.Node.main_session` translates that
+ to ``rm -rf`` if the node's OS is Linux and other commands for other OSs.
+ It also translates the path to match the underlying OS.
+"""
+
from abc import ABC, abstractmethod
from collections.abc import Iterable
from ipaddress import IPv4Interface, IPv6Interface
@@ -28,10 +51,16 @@
class OSSession(ABC):
- """
- The OS classes create a DTS node remote session and implement OS specific
+ """OS-unaware to OS-aware translation API definition.
+
+ The OSSession classes create a remote session to a DTS node and implement OS specific
behavior. There a few control methods implemented by the base class, the rest need
- to be implemented by derived classes.
+ to be implemented by subclasses.
+
+ Attributes:
+ name: The name of the session.
+ remote_session: The remote session maintaining the connection to the node.
+ interactive_session: The interactive remote session maintaining the connection to the node.
"""
_config: NodeConfiguration
@@ -46,6 +75,15 @@ def __init__(
name: str,
logger: DTSLOG,
):
+ """Initialize the OS-aware session.
+
+ Connect to the node right away and also create an interactive remote session.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
self._config = node_config
self.name = name
self._logger = logger
@@ -53,15 +91,15 @@ def __init__(
self.interactive_session = create_interactive_session(node_config, logger)
def close(self, force: bool = False) -> None:
- """
- Close the remote session.
+ """Close the underlying remote session.
+
+ Args:
+ force: Force the closure of the connection.
"""
self.remote_session.close(force)
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the underlying remote session is still responding."""
return self.remote_session.is_alive()
def send_command(
@@ -72,10 +110,23 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- An all-purpose API in case the command to be executed is already
- OS-agnostic, such as when the path to the executed command has been
- constructed beforehand.
+ """An all-purpose API for OS-agnostic commands.
+
+ This can be used to execute a portable command that runs the same way
+ on all operating systems, such as Python.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds to execute the command.
+ privileged: Whether to run the command with administrative privileges.
+ verify: If :data:`True`, check the exit code of the command.
+ env: A dictionary with environment variables to be used with the command execution.
+
+ Raises:
+ RemoteCommandExecutionError: If `verify` is :data:`True` and the command failed.
"""
if privileged:
command = self._get_privileged_command(command)
@@ -89,8 +140,20 @@ def create_interactive_shell(
privileged: bool,
app_args: str,
) -> InteractiveShellType:
- """
- See "create_interactive_shell" in SutNode
+ """Factory for interactive session handlers.
+
+ Instantiate `shell_cls` according to the remote OS specifics.
+
+ Args:
+ shell_cls: The class of the shell.
+ timeout: Timeout for reading output from the SSH channel. If no data is
+ received from the buffer within the timeout, an exception is raised.
+ privileged: Whether to run the shell with administrative privileges.
+ app_args: The arguments to be passed to the application.
+
+ Returns:
+ An instance of the desired interactive application shell.
"""
return shell_cls(
self.interactive_session.session,
@@ -114,27 +177,42 @@ def _get_privileged_command(command: str) -> str:
@abstractmethod
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
- """
- Try to find DPDK remote dir in remote_dir.
+ """Try to find DPDK directory in `remote_dir`.
+
+ The directory is the one which is created after the extraction of the tarball. The files
+ are usually extracted into a directory starting with ``dpdk-``.
+
+ Returns:
+ The absolute path of the DPDK remote directory, or an empty path if not found.
"""
@abstractmethod
def get_remote_tmp_dir(self) -> PurePath:
- """
- Get the path of the temporary directory of the remote OS.
+ """Get the path of the temporary directory of the remote OS.
+
+ Returns:
+ The absolute path of the temporary directory.
"""
@abstractmethod
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for the target architecture. Get
- information from the node if needed.
+ """Create extra environment variables needed for the target architecture.
+
+ Different architectures may require different configuration, such as setting 32-bit CFLAGS.
+
+ Returns:
+ A dictionary mapping environment variable names to their values.
"""
@abstractmethod
def join_remote_path(self, *args: str | PurePath) -> PurePath:
- """
- Join path parts using the path separator that fits the remote OS.
+ """Join path parts using the path separator that fits the remote OS.
+
+ Args:
+ args: Any number of paths to join.
+
+ Returns:
+ The resulting joined path.
"""
@abstractmethod
@@ -143,13 +221,13 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from the remote Node to the local filesystem.
+ """Copy a file from the remote node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote node associated with this remote
+ session to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
+ source_file: the file on the remote node.
destination_file: a file or directory path on the local filesystem.
"""
@@ -159,14 +237,14 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from local filesystem to the remote Node.
+ """Copy a file from local filesystem to the remote node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file`
+ on the remote node associated with this remote session.
Args:
source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ destination_file: a file or directory path on the remote node.
"""
@abstractmethod
@@ -176,8 +254,12 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
- """
- Remove remote directory, by default remove recursively and forcefully.
+ """Remove remote directory, by default remove recursively and forcefully.
+
+ Args:
+ remote_dir_path: The path of the directory to remove.
+ recursive: If :data:`True`, also remove all contents inside the directory.
+ force: If :data:`True`, ignore all warnings and try to remove at all costs.
"""
@abstractmethod
@@ -186,9 +268,12 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
- """
- Extract remote tarball in place. If expected_dir is a non-empty string, check
- whether the dir exists after extracting the archive.
+ """Extract remote tarball in its remote directory.
+
+ Args:
+ remote_tarball_path: The path of the tarball on the remote node.
+ expected_dir: If non-empty, check whether `expected_dir` exists after extracting
+ the archive.
"""
@abstractmethod
@@ -201,69 +286,119 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
- """
- Build DPDK in the input dir with specified environment variables and meson
- arguments.
+ """Build DPDK on the remote node.
+
+ An extracted DPDK tarball must be present on the node. The build consists of two steps::
+
+ meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
+ ninja -C remote_dpdk_build_dir
+
+ The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
+ environment variable configure the timeout of DPDK build.
+
+ Args:
+ env_vars: Use these environment variables when building DPDK.
+ meson_args: Use these meson arguments when building DPDK.
+ remote_dpdk_dir: The directory on the remote node where DPDK will be built.
+ remote_dpdk_build_dir: The target build directory on the remote node.
+ rebuild: If :data:`True`, do a subsequent build with ``meson configure`` instead
+ of ``meson setup``.
+ timeout: Wait at most this long in seconds for the build to execute.
"""
@abstractmethod
def get_dpdk_version(self, version_path: str | PurePath) -> str:
- """
- Inspect DPDK version on the remote node from version_path.
+ """Inspect the DPDK version on the remote node.
+
+ Args:
+ version_path: The path to the VERSION file containing the DPDK version.
+
+ Returns:
+ The DPDK version.
"""
@abstractmethod
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
- """
- Compose a list of LogicalCores present on the remote node.
- If use_first_core is False, the first physical core won't be used.
+ r"""Get the list of :class:`~framework.testbed_model.cpu.LogicalCore`\s on the remote node.
+
+ Args:
+ use_first_core: If :data:`False`, the first physical core won't be used.
+
+ Returns:
+ The logical cores present on the node.
"""
@abstractmethod
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
- """
- Kill and cleanup all DPDK apps identified by dpdk_prefix_list. If
- dpdk_prefix_list is empty, attempt to find running DPDK apps to kill and clean.
+ """Kill and cleanup all DPDK apps.
+
+ Args:
+ dpdk_prefix_list: Kill all apps identified by `dpdk_prefix_list`.
+ If `dpdk_prefix_list` is empty, attempt to find running DPDK apps to kill and clean.
"""
@abstractmethod
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
- """
- Get the DPDK file prefix that will be used when running DPDK apps.
+ """Make OS-specific modification to the DPDK file prefix.
+
+ Args:
+ dpdk_prefix: The OS-unaware file prefix.
+
+ Returns:
+ The OS-specific file prefix.
"""
@abstractmethod
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
- """
- Get the node's Hugepage Size, configure the specified amount of hugepages
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Configure hugepages on the node.
+
+ Get the node's Hugepage Size, configure the specified count of hugepages
if needed and mount the hugepages if needed.
- If force_first_numa is True, configure hugepages just on the first socket.
+
+ Args:
+ hugepage_count: Configure this many hugepages.
+ force_first_numa: If :data:`True`, configure hugepages just on the first socket.
"""
@abstractmethod
def get_compiler_version(self, compiler_name: str) -> str:
- """
- Get installed version of compiler used for DPDK
+ """Get installed version of compiler used for DPDK.
+
+ Args:
+ compiler_name: The name of the compiler executable.
+
+ Returns:
+ The compiler's version.
"""
@abstractmethod
def get_node_info(self) -> NodeInfo:
- """
- Collect information about the node
+ """Collect additional information about the node.
+
+ Returns:
+ Node information.
"""
@abstractmethod
def update_ports(self, ports: list[Port]) -> None:
- """
- Get additional information about ports:
- Logical name (e.g. enp7s0) if applicable
- Mac address
+ """Get additional information about ports from the operating system and update them.
+
+ The additional information is:
+
+ * Logical name (e.g. ``enp7s0``) if applicable,
+ * Mac address.
+
+ Args:
+ ports: The ports to update.
"""
@abstractmethod
def configure_port_state(self, port: Port, enable: bool) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port` in the operating system.
+
+ Args:
+ port: The port to configure.
+ enable: If :data:`True`, enable the port, otherwise shut it down.
"""
@abstractmethod
@@ -273,12 +408,18 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
- """
- Configure (add or delete) an IP address of the input port.
+ """Configure an IP address on `port` in the operating system.
+
+ Args:
+ address: The address to configure.
+ port: The port to configure.
+ delete: If :data:`True`, remove the IP address, otherwise configure it.
"""
@abstractmethod
def configure_ipv4_forwarding(self, enable: bool) -> None:
- """
- Enable IPv4 forwarding in the underlying OS.
+ """Enable IPv4 forwarding in the operating system.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
"""
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
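The OS-aware translation this patch documents can be illustrated with a minimal sketch. The ``sudo -- sh -c`` wrapper mirrors `LinuxSession._get_privileged_command` from the series; `remove_remote_dir_cmd` is a hypothetical helper (not part of the framework) composing the ``rm`` invocation that `remove_remote_dir` and `combine_short_options` describe:

```python
def get_privileged_command(command: str) -> str:
    """Wrap a generic command so it runs with administrative privileges on Linux."""
    # running through a shell keeps redirections and globs working under sudo
    return f"sudo -- sh -c '{command}'"

def remove_remote_dir_cmd(path: str, recursive: bool = True, force: bool = True) -> str:
    """Compose the OS-specific rm command for removing a remote directory."""
    # combine short options such as -r and -f into one -rf argument
    opts = "".join(flag for flag, enabled in (("r", recursive), ("f", force)) if enabled)
    return f"rm -{opts} {path}" if opts else f"rm {path}"

print(get_privileged_command(remove_remote_dir_cmd("/tmp/dpdk-build")))
# sudo -- sh -c 'rm -rf /tmp/dpdk-build'
```

The point of the layering is that callers pass only the generic intent (remove this directory, with privileges); the session class owns both the utility choice and the privilege escalation syntax for its OS.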
* [PATCH v5 16/23] dts: posix and linux sessions docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (14 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 15/23] dts: os session " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 17/23] dts: node " Juraj Linkeš
` (7 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/linux_session.py | 63 ++++++++++-----
dts/framework/testbed_model/posix_session.py | 81 +++++++++++++++++---
2 files changed, 113 insertions(+), 31 deletions(-)
diff --git a/dts/framework/testbed_model/linux_session.py b/dts/framework/testbed_model/linux_session.py
index f472bb8f0f..279954ff63 100644
--- a/dts/framework/testbed_model/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -2,6 +2,13 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""Linux OS translator.
+
+Translate OS-unaware calls into Linux calls/utilities. Most Linux distributions are mostly
+compliant with POSIX standards, so this module only implements the parts that aren't.
+"""
+
import json
from ipaddress import IPv4Interface, IPv6Interface
from typing import TypedDict, Union
@@ -17,43 +24,51 @@
class LshwConfigurationOutput(TypedDict):
+ """The relevant parts of ``lshw``'s ``configuration`` section."""
+
+ #:
link: str
class LshwOutput(TypedDict):
- """
- A model of the relevant information from json lshw output, e.g.:
- {
- ...
- "businfo" : "pci@0000:08:00.0",
- "logicalname" : "enp8s0",
- "version" : "00",
- "serial" : "52:54:00:59:e1:ac",
- ...
- "configuration" : {
- ...
- "link" : "yes",
- ...
- },
- ...
+ """A model of the relevant information from ``lshw``'s json output.
+
+ e.g.::
+
+ {
+ ...
+ "businfo" : "pci@0000:08:00.0",
+ "logicalname" : "enp8s0",
+ "version" : "00",
+ "serial" : "52:54:00:59:e1:ac",
+ ...
+ "configuration" : {
+ ...
+ "link" : "yes",
+ ...
+ },
+ ...
"""
+ #:
businfo: str
+ #:
logicalname: NotRequired[str]
+ #:
serial: NotRequired[str]
+ #:
configuration: LshwConfigurationOutput
class LinuxSession(PosixSession):
- """
- The implementation of non-Posix compliant parts of Linux remote sessions.
- """
+ """The implementation of non-Posix compliant parts of Linux."""
@staticmethod
def _get_privileged_command(command: str) -> str:
return f"sudo -- sh -c '{command}'"
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_cpus`."""
cpu_info = self.send_command("lscpu -p=CPU,CORE,SOCKET,NODE|grep -v \\#").stdout
lcores = []
for cpu_line in cpu_info.splitlines():
@@ -65,18 +80,20 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
return lcores
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return dpdk_prefix
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.setup_hugepages`."""
self._logger.info("Getting Hugepage information.")
hugepage_size = self._get_hugepage_size()
hugepages_total = self._get_hugepages_total()
self._numa_nodes = self._get_numa_nodes()
- if force_first_numa or hugepages_total != hugepage_amount:
+ if force_first_numa or hugepages_total != hugepage_count:
# when forcing numa, we need to clear existing hugepages regardless
# of size, so they can be moved to the first numa node
- self._configure_huge_pages(hugepage_amount, hugepage_size, force_first_numa)
+ self._configure_huge_pages(hugepage_count, hugepage_size, force_first_numa)
else:
self._logger.info("Hugepages already configured.")
self._mount_huge_pages()
@@ -140,6 +157,7 @@ def _configure_huge_pages(
)
def update_ports(self, ports: list[Port]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.update_ports`."""
self._logger.debug("Gathering port info.")
for port in ports:
assert (
@@ -178,6 +196,7 @@ def _update_port_attr(
)
def configure_port_state(self, port: Port, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_state`."""
state = "up" if enable else "down"
self.send_command(
f"ip link set dev {port.logical_name} {state}", privileged=True
@@ -189,6 +208,7 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_ip_address`."""
command = "del" if delete else "add"
self.send_command(
f"ip address {command} {address} dev {port.logical_name}",
@@ -197,5 +217,6 @@ def configure_port_ip_address(
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_ipv4_forwarding`."""
state = 1 if enable else 0
self.send_command(f"sysctl -w net.ipv4.ip_forward={state}", privileged=True)
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 1d1d5b1b26..a4824aa274 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -2,6 +2,15 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""POSIX compliant OS translator.
+
+Translates OS-unaware calls into POSIX compliant calls/utilities. POSIX is a set of standards
+for portability between Unix operating systems which not all Linux distributions
+(or the tools most frequently bundled with said distributions) adhere to. Most Linux
+distributions are mostly compliant, though.
+This intermediate module implements the common parts of mostly POSIX compliant distributions.
+"""
+
import re
from collections.abc import Iterable
from pathlib import PurePath, PurePosixPath
@@ -15,13 +24,21 @@
class PosixSession(OSSession):
- """
- An intermediary class implementing the Posix compliant parts of
- Linux and other OS remote sessions.
- """
+ """An intermediary class implementing the POSIX standard."""
@staticmethod
def combine_short_options(**opts: bool) -> str:
+ """Combine shell options into one argument.
+
+ These are options such as ``-x``, ``-v``, ``-f`` which are combined into ``-xvf``.
+
+ Args:
+ opts: The keys are option names (usually one letter) and the bool values indicate
+ whether to include the option in the resulting argument.
+
+ Returns:
+ The options combined into one argument.
+ """
ret_opts = ""
for opt, include in opts.items():
if include:
@@ -33,17 +50,19 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.guess_dpdk_remote_dir`."""
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
def get_remote_tmp_dir(self) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_tmp_dir`."""
return PurePosixPath("/tmp")
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for i686 arch build. Get information
- from the node if needed.
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_build_env_vars`.
+
+ Supported architecture: ``i686``.
"""
env_vars = {}
if arch == Architecture.i686:
@@ -63,6 +82,7 @@ def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
return env_vars
def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
return PurePosixPath(*args)
def copy_from(
@@ -70,6 +90,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_from`."""
self.remote_session.copy_from(source_file, destination_file)
def copy_to(
@@ -77,6 +98,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_to`."""
self.remote_session.copy_to(source_file, destination_file)
def remove_remote_dir(
@@ -85,6 +107,7 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
opts = PosixSession.combine_short_options(r=recursive, f=force)
self.send_command(f"rm{opts} {remote_dir_path}")
@@ -93,6 +116,7 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
self.send_command(
f"tar xfm {remote_tarball_path} "
f"-C {PurePosixPath(remote_tarball_path).parent}",
@@ -110,6 +134,7 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.build_dpdk`."""
try:
if rebuild:
# reconfigure, then build
@@ -140,12 +165,14 @@ def build_dpdk(
raise DPDKBuildError(f"DPDK build failed when doing '{e.command}'.")
def get_dpdk_version(self, build_dir: str | PurePath) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_version`."""
out = self.send_command(
f"cat {self.join_remote_path(build_dir, 'VERSION')}", verify=True
)
return out.stdout
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.kill_cleanup_dpdk_apps`."""
self._logger.info("Cleaning up DPDK apps.")
dpdk_runtime_dirs = self._get_dpdk_runtime_dirs(dpdk_prefix_list)
if dpdk_runtime_dirs:
@@ -159,6 +186,14 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
def _get_dpdk_runtime_dirs(
self, dpdk_prefix_list: Iterable[str]
) -> list[PurePosixPath]:
+ """Find runtime directories DPDK apps are currently using.
+
+ Args:
+ dpdk_prefix_list: The prefixes DPDK apps were started with.
+
+ Returns:
+ The paths of DPDK apps' runtime dirs.
+ """
prefix = PurePosixPath("/var", "run", "dpdk")
if not dpdk_prefix_list:
remote_prefixes = self._list_remote_dirs(prefix)
@@ -170,9 +205,13 @@ def _get_dpdk_runtime_dirs(
return [PurePosixPath(prefix, dpdk_prefix) for dpdk_prefix in dpdk_prefix_list]
def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
- """
- Return a list of directories of the remote_dir.
- If remote_path doesn't exist, return None.
+ """Contents of remote_path.
+
+ Args:
+ remote_path: List the contents of this path.
+
+ Returns:
+ The contents of remote_path. If remote_path doesn't exist, return None.
"""
out = self.send_command(
f"ls -l {remote_path} | awk '/^d/ {{print $NF}}'"
@@ -183,6 +222,17 @@ def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
return out.splitlines()
def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[int]:
+ """Find PIDs of running DPDK apps.
+
+ Look at each "config" file found in dpdk_runtime_dirs and find the PIDs of processes
+ that opened those files.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+
+ Returns:
+ The PIDs of running DPDK apps.
+ """
pids = []
pid_regex = r"p(\d+)"
for dpdk_runtime_dir in dpdk_runtime_dirs:
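The `p(\d+)` pattern above extracts PIDs from lsof-style field output; the sample input below is a made-up illustration of such a listing (each process record assumed to start with `p<PID>`), not captured DTS output:

```python
import re

# Stand-in lsof -F-style listing: each process record starts with 'p<PID>',
# followed by other field lines ('f...') that the pattern ignores.
pid_regex = r"p(\d+)"
sample_output = "p1234\nfcwd\np5678\nfcwd"
pids = [int(m.group(1)) for m in re.finditer(pid_regex, sample_output)]
print(pids)  # [1234, 5678]
```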
@@ -203,6 +253,14 @@ def _remote_files_exists(self, remote_path: PurePath) -> bool:
def _check_dpdk_hugepages(
self, dpdk_runtime_dirs: Iterable[str | PurePath]
) -> None:
+ """Check there aren't any leftover hugepages.
+
+ If any hugepages are found, emit a warning. The hugepages are investigated in the
+ "hugepage_info" file of dpdk_runtime_dirs.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+ """
for dpdk_runtime_dir in dpdk_runtime_dirs:
hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
if self._remote_files_exists(hugepage_info):
@@ -220,9 +278,11 @@ def _remove_dpdk_runtime_dirs(
self.remove_remote_dir(dpdk_runtime_dir)
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return ""
def get_compiler_version(self, compiler_name: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_compiler_version`."""
match compiler_name:
case "gcc":
return self.send_command(
@@ -240,6 +300,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
raise ValueError(f"Unknown compiler {compiler_name}")
def get_node_info(self) -> NodeInfo:
+ """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
os_release_info = self.send_command(
"awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
SETTINGS.timeout,
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 17/23] dts: node docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (15 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 16/23] dts: posix and linux sessions " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 18/23] dts: sut and tg nodes " Juraj Linkeš
` (6 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/node.py | 191 +++++++++++++++++++---------
1 file changed, 131 insertions(+), 60 deletions(-)
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 7571e7b98d..abf86793a7 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+A node is any host/server DTS connects to.
+
+The base class, :class:`Node`, provides functionality common to all nodes and is supposed
+to be extended by subclasses with functionality specific to each node type.
+The decorator :func:`Node.skip_setup` can be used without subclassing.
"""
from abc import ABC
@@ -35,10 +40,22 @@
class Node(ABC):
- """
- Basic class for node management. This class implements methods that
- manage a node, such as information gathering (of CPU/PCI/NIC) and
- environment setup.
+ """The base class for node management.
+
+ It shouldn't be instantiated, but rather subclassed.
+ It implements common methods to manage any node:
+
+ * Connection to the node,
+ * Hugepages setup.
+
+ Attributes:
+ main_session: The primary OS-aware remote session used to communicate with the node.
+ config: The node configuration.
+ name: The name of the node.
+ lcores: The list of logical cores that DTS can use on the node.
+ It's derived from logical cores present on the node and the test run configuration.
+ ports: The ports of this node specified in the test run configuration.
+ virtual_devices: The virtual devices used on the node.
"""
main_session: OSSession
@@ -52,6 +69,17 @@ class Node(ABC):
virtual_devices: list[VirtualDevice]
def __init__(self, node_config: NodeConfiguration):
+ """Connect to the node and gather info during initialization.
+
+ Extra gathered information:
+
+ * The list of available logical CPUs. This is then filtered by
+ the ``lcores`` configuration in the YAML test run configuration file,
+ * Information about ports from the YAML test run configuration file.
+
+ Args:
+ node_config: The node's test run configuration.
+ """
self.config = node_config
self.name = node_config.name
self._logger = getLogger(self.name)
@@ -60,7 +88,7 @@ def __init__(self, node_config: NodeConfiguration):
self._logger.info(f"Connected to node: {self.name}")
self._get_remote_cpus()
- # filter the node lcores according to user config
+ # filter the node lcores according to the test run configuration
self.lcores = LogicalCoreListFilter(
self.lcores, LogicalCoreList(self.config.lcores)
).filter()
@@ -77,9 +105,14 @@ def _init_ports(self) -> None:
self.configure_port_state(port)
def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- Perform the execution setup that will be done for each execution
- this node is part of.
+ """Execution setup steps.
+
+ Configure hugepages and call :meth:`_set_up_execution` where
+ the rest of the configuration steps (if any) are implemented.
+
+ Args:
+ execution_config: The execution test run configuration according to which
+ the setup steps will be taken.
"""
self._setup_hugepages()
self._set_up_execution(execution_config)
@@ -88,58 +121,74 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
self.virtual_devices.append(VirtualDevice(vdev))
def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution setup steps.
"""
def tear_down_execution(self) -> None:
- """
- Perform the execution teardown that will be done after each execution
- this node is part of concludes.
+ """Execution teardown steps.
+
+ There are currently no execution teardown steps common to all DTS node types.
"""
self.virtual_devices = []
self._tear_down_execution()
def _tear_down_execution(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution teardown steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution teardown steps.
"""
def set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Perform the build target setup that will be done for each build target
- tested on this node.
+ """Build target setup steps.
+
+ There are currently no build target setup steps common to all DTS node types.
+
+ Args:
+ build_target_config: The build target test run configuration according to which
+ the setup steps will be taken.
"""
self._set_up_build_target(build_target_config)
def _set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional build target setup steps.
"""
def tear_down_build_target(self) -> None:
- """
- Perform the build target teardown that will be done after each build target
- tested on this node.
+ """Build target teardown steps.
+
+ There are currently no build target teardown steps common to all DTS node types.
"""
self._tear_down_build_target()
def _tear_down_build_target(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target teardown steps for subclasses.
+
+ Subclasses should override this if they need to add additional build target teardown steps.
"""
def create_session(self, name: str) -> OSSession:
- """
- Create and return a new OSSession tailored to the remote OS.
+ """Create and return a new OS-aware remote session.
+
+ The returned session won't be used by the node creating it. The session must be used by
+ the caller. The session will be maintained for the entire lifecycle of the node object,
+ at the end of which the session will be cleaned up automatically.
+
+ Note:
+ Any number of these supplementary sessions may be created.
+
+ Args:
+ name: The name of the session.
+
+ Returns:
+ A new OS-aware remote session.
"""
session_name = f"{self.name} {name}"
connection = create_session(
@@ -157,19 +206,19 @@ def create_interactive_shell(
privileged: bool = False,
app_args: str = "",
) -> InteractiveShellType:
- """Create a handler for an interactive session.
+ """Factory for interactive session handlers.
- Instantiate shell_cls according to the remote OS specifics.
+ Instantiate `shell_cls` according to the remote OS specifics.
Args:
shell_cls: The class of the shell.
- timeout: Timeout for reading output from the SSH channel. If you are
- reading from the buffer and don't receive any data within the timeout
- it will throw an error.
+ timeout: Timeout for reading output from the SSH channel. If you are reading from
+ the buffer and don't receive any data within the timeout it will throw an error.
privileged: Whether to run the shell with administrative privileges.
app_args: The arguments to be passed to the application.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not shell_cls.dpdk_app:
shell_cls.path = self.main_session.join_remote_path(shell_cls.path)
@@ -186,14 +235,22 @@ def filter_lcores(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
) -> list[LogicalCore]:
- """
- Filter the LogicalCores found on the Node according to
- a LogicalCoreCount or a LogicalCoreList.
+ """Filter the node's logical cores that DTS can use.
+
+ Logical cores that DTS can use are the ones that are present on the node, but filtered
+ according to the test run configuration. The `filter_specifier` will filter cores from
+ those logical cores.
+
+ Args:
+ filter_specifier: Two different filters can be used, one that specifies the number
+ of logical cores per core, cores per socket and the number of sockets,
+ and another one that specifies a logical core list.
+ ascending: If :data:`True`, use cores with the lowest numerical id first and continue
+ in ascending order. If :data:`False`, start with the highest id and continue
+ in descending order. This ordering affects which sockets to consider first as well.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+ Returns:
+ The filtered logical cores.
"""
self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
return lcore_filter(
@@ -203,17 +260,14 @@ def filter_lcores(
).filter()
def _get_remote_cpus(self) -> None:
- """
- Scan CPUs in the remote OS and store a list of LogicalCores.
- """
+ """Scan CPUs in the remote OS and store a list of LogicalCores."""
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
def _setup_hugepages(self) -> None:
- """
- Setup hugepages on the Node. Different architectures can supply different
- amounts of memory for hugepages and numa-based hugepage allocation may need
- to be considered.
+ """Set up hugepages on the node.
+
+ Configure the hugepages only if they're specified in the node's test run configuration.
"""
if self.config.hugepages:
self.main_session.setup_hugepages(
@@ -221,8 +275,11 @@ def _setup_hugepages(self) -> None:
)
def configure_port_state(self, port: Port, enable: bool = True) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port`.
+
+ Args:
+ port: The port to enable/disable.
+ enable: :data:`True` to enable, :data:`False` to disable.
"""
self.main_session.configure_port_state(port, enable)
@@ -232,15 +289,17 @@ def configure_port_ip_address(
port: Port,
delete: bool = False,
) -> None:
- """
- Configure the IP address of a port on this node.
+ """Add an IP address to `port` on this node.
+
+ Args:
+ address: The IP address with mask in CIDR format. Can be either IPv4 or IPv6.
+ port: The port to which to add the address.
+ delete: If :data:`True`, will delete the address from the port instead of adding it.
"""
self.main_session.configure_port_ip_address(address, port, delete)
def close(self) -> None:
- """
- Close all connections and free other resources.
- """
+ """Close all connections and free other resources."""
if self.main_session:
self.main_session.close()
for session in self._other_sessions:
@@ -249,6 +308,11 @@ def close(self) -> None:
@staticmethod
def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+ """Skip the decorated function.
+
+ The :option:`--skip-setup` command line argument and the :envvar:`DTS_SKIP_SETUP`
+ environment variable enable the decorator.
+ """
if SETTINGS.skip_setup:
return lambda *args: None
else:
@@ -258,6 +322,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
def create_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> OSSession:
+ """Factory for OS-aware sessions.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
match node_config.os:
case OS.linux:
return LinuxSession(node_config, name, logger)
--
2.34.1
* [PATCH v5 18/23] dts: sut and tg nodes docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (16 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 17/23] dts: node " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 19/23] dts: base traffic generators " Juraj Linkeš
` (5 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/sut_node.py | 219 ++++++++++++++++--------
dts/framework/testbed_model/tg_node.py | 42 +++--
2 files changed, 170 insertions(+), 91 deletions(-)
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 4e33cf02ea..b57d48fd31 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -3,6 +3,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""System under test (DPDK + hardware) node.
+
+A system under test (SUT) is the combination of DPDK
+and the hardware we're testing with DPDK (NICs, crypto and other devices).
+An SUT node is where this SUT runs.
+"""
+
+
import os
import tarfile
import time
@@ -26,6 +34,11 @@
class EalParameters(object):
+ """The environment abstraction layer parameters.
+
+ The string representation can be created by converting the instance to a string.
+ """
+
def __init__(
self,
lcore_list: LogicalCoreList,
@@ -35,21 +48,23 @@ def __init__(
vdevs: list[VirtualDevice],
other_eal_param: str,
):
- """
- Generate eal parameters character string;
- :param lcore_list: the list of logical cores to use.
- :param memory_channels: the number of memory channels to use.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
+ """Initialize the parameters according to inputs.
+
+ Process the parameters into the format used on the command line.
+
+ Args:
+ lcore_list: The list of logical cores to use.
+ memory_channels: The number of memory channels to use.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``
"""
self._lcore_list = f"-l {lcore_list}"
self._memory_channels = f"-n {memory_channels}"
@@ -61,6 +76,7 @@ def __init__(
self._other_eal_param = other_eal_param
def __str__(self) -> str:
+ """Create the EAL string."""
return (
f"{self._lcore_list} "
f"{self._memory_channels} "
@@ -72,11 +88,21 @@ def __str__(self) -> str:
class SutNode(Node):
- """
- A class for managing connections to the System under Test, providing
- methods that retrieve the necessary information about the node (such as
- CPU, memory and NIC details) and configuration capabilities.
- Another key capability is building DPDK according to given build target.
+ """The system under test node.
+
+ The SUT node extends :class:`Node` with DPDK specific features:
+
+ * DPDK build,
+ * Gathering of DPDK build info,
+ * The running of DPDK apps, interactively or one-time execution,
+ * DPDK apps cleanup.
+
+ The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
+ environment variable configure the path to the DPDK tarball
+ or the git commit ID, tag ID or tree ID to test.
+
+ Attributes:
+ config: The SUT node configuration.
"""
config: SutNodeConfiguration
@@ -93,6 +119,11 @@ class SutNode(Node):
_compiler_version: str | None
def __init__(self, node_config: SutNodeConfiguration):
+ """Extend the constructor with SUT node specifics.
+
+ Args:
+ node_config: The SUT node's test run configuration.
+ """
super(SutNode, self).__init__(node_config)
self._dpdk_prefix_list = []
self._build_target_config = None
@@ -111,6 +142,12 @@ def __init__(self, node_config: SutNodeConfiguration):
@property
def _remote_dpdk_dir(self) -> PurePath:
+ """The remote DPDK dir.
+
+ This internal property should be set after extracting the DPDK tarball. If it's not set,
+ that implies the DPDK setup step has been skipped, in which case we can guess where
+ a previous build was located.
+ """
if self.__remote_dpdk_dir is None:
self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
return self.__remote_dpdk_dir
@@ -121,6 +158,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
@property
def remote_dpdk_build_dir(self) -> PurePath:
+ """The remote DPDK build directory.
+
+ This is the directory where DPDK was built.
+ We assume it was built in a subdirectory of the extracted tarball.
+ """
if self._build_target_config:
return self.main_session.join_remote_path(
self._remote_dpdk_dir, self._build_target_config.name
@@ -130,6 +172,7 @@ def remote_dpdk_build_dir(self) -> PurePath:
@property
def dpdk_version(self) -> str:
+ """Last built DPDK version."""
if self._dpdk_version is None:
self._dpdk_version = self.main_session.get_dpdk_version(
self._remote_dpdk_dir
@@ -138,12 +181,14 @@ def dpdk_version(self) -> str:
@property
def node_info(self) -> NodeInfo:
+ """Additional node information."""
if self._node_info is None:
self._node_info = self.main_session.get_node_info()
return self._node_info
@property
def compiler_version(self) -> str:
+ """The node's compiler version."""
if self._compiler_version is None:
if self._build_target_config is not None:
self._compiler_version = self.main_session.get_compiler_version(
@@ -158,6 +203,11 @@ def compiler_version(self) -> str:
return self._compiler_version
def get_build_target_info(self) -> BuildTargetInfo:
+ """Get additional build target information.
+
+ Returns:
+ The build target information.
+ """
return BuildTargetInfo(
dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
)
@@ -168,8 +218,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
def _set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Setup DPDK on the SUT node.
+ """Set up DPDK on the SUT node.
+
+ Additional build target setup steps on top of those in :class:`Node`.
"""
# we want to ensure that dpdk_version and compiler_version is reset for new
# build targets
@@ -182,9 +233,7 @@ def _set_up_build_target(
def _configure_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Populate common environment variables and set build target config.
- """
+ """Populate common environment variables and set build target config."""
self._env_vars = {}
self._build_target_config = build_target_config
self._env_vars.update(
@@ -199,9 +248,7 @@ def _configure_build_target(
@Node.skip_setup
def _copy_dpdk_tarball(self) -> None:
- """
- Copy to and extract DPDK tarball on the SUT node.
- """
+ """Copy to and extract DPDK tarball on the SUT node."""
self._logger.info("Copying DPDK tarball to SUT.")
self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
@@ -232,8 +279,9 @@ def _copy_dpdk_tarball(self) -> None:
@Node.skip_setup
def _build_dpdk(self) -> None:
- """
- Build DPDK. Uses the already configured target. Assumes that the tarball has
+ """Build DPDK.
+
+ Uses the already configured target. Assumes that the tarball has
already been copied to and extracted on the SUT node.
"""
self.main_session.build_dpdk(
@@ -244,15 +292,19 @@ def _build_dpdk(self) -> None:
)
def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePath:
- """
- Build one or all DPDK apps. Requires DPDK to be already built on the SUT node.
- When app_name is 'all', build all example apps.
- When app_name is any other string, tries to build that example app.
- Return the directory path of the built app. If building all apps, return
- the path to the examples directory (where all apps reside).
- The meson_dpdk_args are keyword arguments
- found in meson_option.txt in root DPDK directory. Do not use -D with them,
- for example: enable_kmods=True.
+ """Build one or all DPDK apps.
+
+ Requires DPDK to be already built on the SUT node.
+
+ Args:
+ app_name: The name of the DPDK app to build.
+ When `app_name` is ``all``, build all example apps.
+ meson_dpdk_args: The arguments found in ``meson_options.txt`` in the root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Returns:
+ The directory path of the built app. If building all apps, return
+ the path to the examples directory (where all apps reside).
"""
self.main_session.build_dpdk(
self._env_vars,
@@ -273,9 +325,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
)
def kill_cleanup_dpdk_apps(self) -> None:
- """
- Kill all dpdk applications on the SUT. Cleanup hugepages.
- """
+ """Kill all DPDK applications on the SUT, then clean up hugepages."""
if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
# we can use the session if it exists and responds
self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
@@ -294,33 +344,34 @@ def create_eal_parameters(
vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
- """
- Generate eal parameters character string;
- :param lcore_filter_specifier: a number of lcores/cores/sockets to use
- or a list of lcore ids to use.
- The default will select one lcore for each of two cores
- on one socket, in ascending order of core ids.
- :param ascending_cores: True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the
- highest id and continue in descending order. This ordering
- affects which sockets to consider first as well.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param append_prefix_timestamp: if True, will append a timestamp to
- DPDK file prefix.
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
- :return: eal param string, eg:
- '-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420';
- """
+ """Compose the EAL parameters.
+
+ Process the list of cores and the DPDK prefix and pass that along with
+ the rest of the arguments.
+
+ Args:
+ lcore_filter_specifier: A number of lcores/cores/sockets to use
+ or a list of lcore ids to use.
+ The default will select one lcore for each of two cores
+ on one socket, in ascending order of core ids.
+ ascending_cores: Sort cores in ascending order (lowest to highest IDs).
+ If :data:`False`, sort in descending order.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ append_prefix_timestamp: If :data:`True`, will append a timestamp to DPDK file prefix.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``.
+
+ Returns:
+ An EAL param string, such as
+ ``-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420``.
+ """
lcore_list = LogicalCoreList(
self.filter_lcores(lcore_filter_specifier, ascending_cores)
)
@@ -346,14 +397,29 @@ def create_eal_parameters(
def run_dpdk_app(
self, app_path: PurePath, eal_args: "EalParameters", timeout: float = 30
) -> CommandResult:
- """
- Run DPDK application on the remote node.
+ """Run DPDK application on the remote node.
+
+ The application is not run interactively - the command that starts the application
+ is executed and then the call waits for it to finish execution.
+
+ Args:
+ app_path: The remote path to the DPDK application.
+ eal_args: EAL parameters to run the DPDK application with.
+ timeout: Wait at most this long in seconds to execute the command.
+
+ Returns:
+ The result of the DPDK app execution.
"""
return self.main_session.send_command(
f"{app_path} {eal_args}", timeout, privileged=True, verify=True
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Enable/disable IPv4 forwarding on the node.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
+ """
self.main_session.configure_ipv4_forwarding(enable)
def create_interactive_shell(
@@ -363,9 +429,13 @@ def create_interactive_shell(
privileged: bool = False,
eal_parameters: EalParameters | str | None = None,
) -> InteractiveShellType:
- """Factory method for creating a handler for an interactive session.
+ """Extend the factory for interactive session handlers.
+
+ The extensions are SUT node specific:
- Instantiate shell_cls according to the remote OS specifics.
+ * The default for `eal_parameters`,
+ * The interactive shell path `shell_cls.path` is prepended with path to the remote
+ DPDK build directory for DPDK apps.
Args:
shell_cls: The class of the shell.
@@ -375,9 +445,10 @@ def create_interactive_shell(
privileged: Whether to run the shell with administrative privileges.
eal_parameters: List of EAL parameters to use to launch the app. If this
isn't provided or an empty string is passed, it will default to calling
- create_eal_parameters().
+ :meth:`create_eal_parameters`.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not eal_parameters:
eal_parameters = self.create_eal_parameters()
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 166eb8430e..69eb33ccb1 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -5,13 +5,8 @@
"""Traffic generator node.
-This is the node where the traffic generator resides.
-The distinction between a node and a traffic generator is as follows:
-A node is a host that DTS connects to. It could be a baremetal server,
-a VM or a container.
-A traffic generator is software running on the node.
-A traffic generator node is a node running a traffic generator.
-A node can be a traffic generator node as well as system under test node.
+A traffic generator (TG) generates traffic that's sent towards the SUT node.
+A TG node is where the TG runs.
"""
from scapy.packet import Packet # type: ignore[import]
@@ -24,13 +19,16 @@
class TGNode(Node):
- """Manage connections to a node with a traffic generator.
+ """The traffic generator node.
- Apart from basic node management capabilities, the Traffic Generator node has
- specialized methods for handling the traffic generator running on it.
+ The TG node extends :class:`Node` with TG specific features:
- Arguments:
- node_config: The user configuration of the traffic generator node.
+ * Traffic generator initialization,
+ * The sending of traffic and receiving packets,
+ * The sending of traffic without receiving packets.
+
+ Not all traffic generators are capable of capturing traffic, which is why there
+ must be a way to send traffic without that.
Attributes:
traffic_generator: The traffic generator running on the node.
@@ -39,6 +37,13 @@ class TGNode(Node):
traffic_generator: CapturingTrafficGenerator
def __init__(self, node_config: TGNodeConfiguration):
+ """Extend the constructor with TG node specifics.
+
+ Initialize the traffic generator on the TG node.
+
+ Args:
+ node_config: The TG node's test run configuration.
+ """
super(TGNode, self).__init__(node_config)
self.traffic_generator = create_traffic_generator(
self, node_config.traffic_generator
@@ -52,17 +57,17 @@ def send_packet_and_capture(
receive_port: Port,
duration: float = 1,
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet`, return received traffic.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given duration. Also record the captured traffic
in a pcap file.
Args:
packet: The packet to send.
send_port: The egress port on the TG node.
receive_port: The ingress port in the TG node.
- duration: Capture traffic for this amount of time after sending the packet.
+ duration: Capture traffic for this amount of time after sending `packet`.
Returns:
A list of received packets. May be empty if no packets are captured.
@@ -72,6 +77,9 @@ def send_packet_and_capture(
)
def close(self) -> None:
- """Free all resources used by the node"""
+ """Free all resources used by the node.
+
+ This extends the superclass method with TG cleanup.
+ """
self.traffic_generator.close()
super(TGNode, self).close()
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
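The send-and-capture workflow documented in this patch (send `packet` on the egress port, return everything captured on the ingress port for `duration`) can be sketched with a hypothetical stand-in class. The names below (`FakeCapturingTG`, the canned capture) are illustrative only, not the DTS API:

```python
class FakeCapturingTG:
    """Hypothetical stand-in mimicking the send-and-capture workflow."""

    def __init__(self, canned_capture: list) -> None:
        # Packets the fake "wire" will return, standing in for sniffed traffic.
        self._canned_capture = canned_capture
        self.sent: list = []

    def send_packet_and_capture(
        self, packet, send_port: str, receive_port: str, duration: float = 1.0
    ) -> list:
        # 1. Send the packet on the egress port.
        self.sent.append((packet, send_port))
        # 2. Pretend to capture on the ingress port for `duration` seconds.
        # 3. Return the captured packets (the list may be empty).
        return list(self._canned_capture)


tg = FakeCapturingTG(canned_capture=["reply-packet"])
received = tg.send_packet_and_capture("probe", send_port="tg0", receive_port="tg1")
```

The real `TGNode.send_packet_and_capture` additionally records the capture in a pcap file; the sketch only shows the call shape.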
* [PATCH v5 19/23] dts: base traffic generators docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (17 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 18/23] dts: sut and tg nodes " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 20/23] dts: scapy tg " Juraj Linkeš
` (4 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../traffic_generator/__init__.py | 22 ++++++++-
.../capturing_traffic_generator.py | 46 +++++++++++--------
.../traffic_generator/traffic_generator.py | 33 +++++++------
3 files changed, 68 insertions(+), 33 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index 11bfa1ee0f..51cca77da4 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -1,6 +1,19 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""DTS traffic generators.
+
+A traffic generator is capable of generating traffic and then monitoring the returning traffic.
+A traffic generator may just count the number of received packets
+or it may additionally capture individual packets.
+
+A traffic generator may be software running on generic hardware or it could be specialized hardware.
+
+The traffic generators that only count the number of received packets are suitable only for
+performance testing. In functional testing, we need to be able to dissect each arriving packet,
+so a capturing traffic generator is required.
+"""
+
from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
from framework.exception import ConfigurationError
from framework.testbed_model.node import Node
@@ -12,8 +25,15 @@
def create_traffic_generator(
tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
+ """The factory function for creating traffic generator objects from the test run configuration.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ traffic_generator_config: The traffic generator config.
+ Returns:
+ A traffic generator capable of capturing received packets.
+ """
match traffic_generator_config.traffic_generator_type:
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGenerator(tg_node, traffic_generator_config)
diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index e521211ef0..b0a43ad003 100644
--- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -23,19 +23,22 @@
def _get_default_capture_name() -> str:
- """
- This is the function used for the default implementation of capture names.
- """
return str(uuid.uuid4())
class CapturingTrafficGenerator(TrafficGenerator):
"""Capture packets after sending traffic.
- A mixin interface which enables a packet generator to declare that it can capture
+ The intermediary interface which enables a packet generator to declare that it can capture
packets and return them to the user.
+ Similarly to
+ :class:`~framework.testbed_model.traffic_generator.traffic_generator.TrafficGenerator`,
+ this class exposes the public methods specific to capturing traffic generators and defines
+ a private method that subclasses must implement with the traffic generation and capturing logic.
+
The methods of capturing traffic generators obey the following workflow:
+
1. send packets
2. capture packets
3. write the capture to a .pcap file
@@ -44,6 +47,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
@property
def is_capturing(self) -> bool:
+ """This traffic generator can capture traffic."""
return True
def send_packet_and_capture(
@@ -54,11 +58,12 @@ def send_packet_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet` and capture received traffic.
+
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ The captured traffic is recorded in the `capture_name`.pcap file.
Args:
packet: The packet to send.
@@ -68,7 +73,7 @@ def send_packet_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
return self.send_packets_and_capture(
[packet], send_port, receive_port, duration, capture_name
@@ -82,11 +87,14 @@ def send_packets_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send packets, return received traffic.
+ """Send `packets` and capture received traffic.
- Send packets on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ Send `packets` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
+
+ The captured traffic is recorded in the `capture_name`.pcap file. The target directory
+ can be configured with the :option:`--output-dir` command line argument or
+ the :envvar:`DTS_OUTPUT_DIR` environment variable.
Args:
packets: The packets to send.
@@ -96,7 +104,7 @@ def send_packets_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
self._logger.debug(get_packet_summaries(packets))
self._logger.debug(
@@ -124,10 +132,12 @@ def _send_packets_and_capture(
receive_port: Port,
duration: float,
) -> list[Packet]:
- """
- The extended classes must implement this method which
- sends packets on send_port and receives packets on the receive_port
- for the specified duration. It must be able to handle no received packets.
+ """The implementation of :method:`send_packets_and_capture`.
+
+ The subclasses must implement this method which sends `packets` on `send_port`
+ and receives packets on `receive_port` for the specified `duration`.
+
+ It must be able to handle no received packets.
"""
def _write_capture_from_packets(
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index ea7c3963da..ed396c6a2f 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -22,7 +22,8 @@
class TrafficGenerator(ABC):
"""The base traffic generator.
- Defines the few basic methods that each traffic generator must implement.
+ Exposes the common public methods of all traffic generators and defines private methods
+ that subclasses must implement with the traffic generation logic.
"""
_config: TrafficGeneratorConfig
@@ -30,6 +31,12 @@ class TrafficGenerator(ABC):
_logger: DTSLOG
def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ """Initialize the traffic generator.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ config: The traffic generator's test run configuration.
+ """
self._config = config
self._tg_node = tg_node
self._logger = getLogger(
@@ -37,9 +44,9 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
)
def send_packet(self, packet: Packet, port: Port) -> None:
- """Send a packet and block until it is fully sent.
+ """Send `packet` and block until it is fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packet` on `port`, then wait until `packet` is fully sent.
Args:
packet: The packet to send.
@@ -48,9 +55,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
self.send_packets([packet], port)
def send_packets(self, packets: list[Packet], port: Port) -> None:
- """Send packets and block until they are fully sent.
+ """Send `packets` and block until they are fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packets` on `port`, then wait until `packets` are fully sent.
Args:
packets: The packets to send.
@@ -62,19 +69,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
@abstractmethod
def _send_packets(self, packets: list[Packet], port: Port) -> None:
- """
- The extended classes must implement this method which
- sends packets on send_port. The method should block until all packets
- are fully sent.
+ """The implementation of :method:`send_packets`.
+
+ The subclasses must implement this method which sends `packets` on `port`.
+ The method should block until all `packets` are fully sent.
+
+ What fully sent means is defined by the traffic generator.
"""
@property
def is_capturing(self) -> bool:
- """Whether this traffic generator can capture traffic.
-
- Returns:
- True if the traffic generator can capture traffic, False otherwise.
- """
+ """This traffic generator can't capture traffic."""
return False
@abstractmethod
--
2.34.1
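The public/private split described in this patch (a public `send_packets` wrapping an abstract `_send_packets`) is the classic template-method pattern. A minimal sketch under that assumption; `DummyTrafficGenerator` is a hypothetical subclass, not a real DTS traffic generator:

```python
from abc import ABC, abstractmethod


class TrafficGeneratorSketch(ABC):
    """Template-method split: public API here, sending logic in subclasses."""

    @property
    def is_capturing(self) -> bool:
        """The base generator can't capture traffic."""
        return False

    def send_packets(self, packets: list, port: str) -> None:
        # The public method does the common bookkeeping (logging in DTS),
        # then delegates the actual sending to the private hook.
        print(f"sending {len(packets)} packet(s) on {port}")
        self._send_packets(packets, port)

    @abstractmethod
    def _send_packets(self, packets: list, port: str) -> None:
        """Subclasses implement the actual sending here."""


class DummyTrafficGenerator(TrafficGeneratorSketch):
    """Hypothetical subclass that just records what it was asked to send."""

    def __init__(self) -> None:
        self.sent: list = []

    def _send_packets(self, packets: list, port: str) -> None:
        self.sent.extend(packets)


tg = DummyTrafficGenerator()
tg.send_packets(["pkt1", "pkt2"], "port0")
```

Callers only ever use the public method, so the bookkeeping can never be skipped by a subclass.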
* [PATCH v5 20/23] dts: scapy tg docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (18 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 19/23] dts: base traffic generators " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:15 ` [PATCH v5 21/23] dts: test suites " Juraj Linkeš
` (3 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
1 file changed, 54 insertions(+), 37 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
index 51864b6e6b..d0fe03055a 100644
--- a/dts/framework/testbed_model/traffic_generator/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -2,14 +2,15 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""Scapy traffic generator.
+"""The Scapy traffic generator.
-Traffic generator used for functional testing, implemented using the Scapy library.
+A traffic generator used for functional testing, implemented with
+`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
-The XML-RPC server runs in an interactive remote SSH session running Python console,
-where we start the server. The communication with the server is facilitated with
-a local server proxy.
+The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
+in an interactive remote Python SSH session. The communication with the server is facilitated
+with a local server proxy from the :mod:`xmlrpc.client` module.
"""
import inspect
@@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
recv_iface: str,
duration: float,
) -> list[bytes]:
- """RPC function to send and capture packets.
+ """The RPC function to send and capture packets.
- The function is meant to be executed on the remote TG node.
+ The function is meant to be executed on the remote TG node via the server proxy.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
recv_iface: The logical name of the ingress interface.
duration: Capture for this amount of time, in seconds.
Returns:
A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
+ to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
sniffer = scapy.all.AsyncSniffer(
@@ -98,19 +99,15 @@ def scapy_send_packets_and_capture(
def scapy_send_packets(
xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str
) -> None:
- """RPC function to send packets.
+ """The RPC function to send packets.
- The function is meant to be executed on the remote TG node.
- It doesn't return anything, only sends packets.
+ The function is meant to be executed on the remote TG node via the server proxy.
+ It only sends `xmlrpc_packets`, without capturing them.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
-
- Returns:
- A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
@@ -130,11 +127,19 @@ def scapy_send_packets(
class QuittableXMLRPCServer(SimpleXMLRPCServer):
- """Basic XML-RPC server that may be extended
- by functions serializable by the marshal module.
+ r"""Basic XML-RPC server.
+
+ r"""The server may be augmented with functions serializable by the :mod:`marshal` module.
"""
def __init__(self, *args, **kwargs):
+ """Extend the XML-RPC server initialization.
+
+ Args:
+ args: The positional arguments that will be passed to the superclass's constructor.
+ kwargs: The keyword arguments that will be passed to the superclass's constructor.
+ The `allow_none` argument will be set to ``True``.
+ """
kwargs["allow_none"] = True
super().__init__(*args, **kwargs)
self.register_introspection_functions()
@@ -142,13 +147,12 @@ def __init__(self, *args, **kwargs):
self.register_function(self.add_rpc_function)
def quit(self) -> None:
+ """Quit the server."""
self._BaseServer__shutdown_request = True
return None
def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
- """Add a function to the server.
-
- This is meant to be executed remotely.
+ """Add a function to the server from the local server proxy.
Args:
name: The name of the function.
@@ -159,6 +163,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
self.register_function(function)
def serve_forever(self, poll_interval: float = 0.5) -> None:
+ """Extend the superclass method with an additional print.
+
+ Once executed in the local server proxy, the print gives us a clear string to expect
+ when starting the server, confirming the function was executed on the XML-RPC server.
+ """
print("XMLRPC OK")
super().serve_forever(poll_interval)
@@ -166,19 +175,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
class ScapyTrafficGenerator(CapturingTrafficGenerator):
"""Provides access to scapy functions via an RPC interface.
- The traffic generator first starts an XML-RPC on the remote TG node.
- Then it populates the server with functions which use the Scapy library
- to send/receive traffic.
-
- Any packets sent to the remote server are first converted to bytes.
- They are received as xmlrpc.client.Binary objects on the server side.
- When the server sends the packets back, they are also received as
- xmlrpc.client.Binary object on the client side, are converted back to Scapy
- packets and only then returned from the methods.
+ The class extends the base class with remote execution of Scapy functions.
- Arguments:
- tg_node: The node where the traffic generator resides.
- config: The user configuration of the traffic generator.
+ Any packets sent to the remote server are first converted to bytes. They are received as
+ :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
+ back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
+ converted back to :class:`scapy.packet.Packet` objects and only then returned from the methods.
Attributes:
session: The exclusive interactive remote session created by the Scapy
@@ -192,6 +194,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
_config: ScapyTrafficGeneratorConfig
def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ """Extend the constructor with Scapy TG specifics.
+
+ The traffic generator first starts an XML-RPC server on the remote `tg_node`.
+ Then it populates the server with functions which use the Scapy library
+ to send/receive traffic:
+
+ * :func:`scapy_send_packets_and_capture`
+ * :func:`scapy_send_packets`
+
+ To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
+ command line argument or the :envvar:`DTS_VERBOSE` environment variable.
+
+ Args:
+ tg_node: The node where the traffic generator resides.
+ config: The traffic generator's test run configuration.
+ """
super().__init__(tg_node, config)
assert (
@@ -237,10 +255,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
[line for line in src.splitlines() if not line.isspace() and line != ""]
)
- spacing = "\n" * 4
-
# execute it in the python terminal
- self.session.send_command(spacing + src + spacing)
+ self.session.send_command(src + "\n")
self.session.send_command(
f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));"
f"server.serve_forever()",
@@ -274,6 +290,7 @@ def _send_packets_and_capture(
return scapy_packets
def close(self) -> None:
+ """Close the traffic generator."""
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
--
2.34.1
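The `add_rpc_function` mechanism in this patch relies on the :mod:`marshal` module: a function's code object is serialized on the client, shipped as binary data, and rebuilt into a callable on the server, where it can then be registered with the XML-RPC server. A minimal local round-trip of that serialization step, without the XML-RPC transport (`greet` is an illustrative function, not part of DTS):

```python
import marshal
import types


def greet(name: str) -> str:
    """An illustrative function to ship to the server side."""
    return f"hello from {name}"


# Client side: serialize the function's code object to bytes
# (DTS wraps these bytes in xmlrpc.client.Binary before sending).
payload = marshal.dumps(greet.__code__)

# Server side: rebuild a callable from the received bytes; the result
# could then be passed to SimpleXMLRPCServer.register_function().
code = marshal.loads(payload)
rebuilt = types.FunctionType(code, globals(), "greet")
result = rebuilt("core 0")
```

Note that :mod:`marshal` bytes are only portable between identical Python versions, which is one reason the series pins the DTS Python environment.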
* [PATCH v5 21/23] dts: test suites docstring update
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (19 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 20/23] dts: scapy tg " Juraj Linkeš
@ 2023-11-06 17:15 ` Juraj Linkeš
2023-11-06 17:16 ` [PATCH v5 22/23] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/tests/TestSuite_hello_world.py | 16 +++++----
dts/tests/TestSuite_os_udp.py | 16 +++++----
dts/tests/TestSuite_smoke_tests.py | 53 +++++++++++++++++++++++++++---
3 files changed, 68 insertions(+), 17 deletions(-)
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 7e3d95c0cf..662a8f8726 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
-"""
+"""The DPDK hello world app test suite.
+
Run the helloworld example app and verify it prints a message for each used core.
No other EAL parameters apart from cores are used.
"""
@@ -15,22 +16,25 @@
class TestHelloWorld(TestSuite):
+ """DPDK hello world app test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Build the app we're about to test - helloworld.
"""
self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
def test_hello_world_single_core(self) -> None:
- """
+ """Single core test case.
+
Steps:
Run the helloworld app on the first usable logical core.
Verify:
The app prints a message from the used core:
"hello from core <core_id>"
"""
-
# get the first usable core
lcore_amount = LogicalCoreCount(1, 1, 1)
lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
@@ -44,14 +48,14 @@ def test_hello_world_single_core(self) -> None:
)
def test_hello_world_all_cores(self) -> None:
- """
+ """All cores test case.
+
Steps:
Run the helloworld app on all usable logical cores.
Verify:
The app prints a message from all used cores:
"hello from core <core_id>"
"""
-
# get the maximum logical core number
eal_para = self.sut_node.create_eal_parameters(
lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
index 9b5f39711d..f99c4d76e3 100644
--- a/dts/tests/TestSuite_os_udp.py
+++ b/dts/tests/TestSuite_os_udp.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
+"""Basic IPv4 OS routing test suite.
+
Configure SUT node to route traffic from if1 to if2.
Send a packet to the SUT node, verify it comes back on the second port on the TG node.
"""
@@ -13,22 +14,24 @@
class TestOSUdp(TestSuite):
+ """IPv4 UDP OS routing test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Configure SUT ports and SUT to route traffic from if1 to if2.
"""
-
self.configure_testbed_ipv4()
def test_os_udp(self) -> None:
- """
+ """Basic UDP IPv4 traffic test case.
+
Steps:
Send a UDP packet.
Verify:
The packet with proper addresses arrives at the other TG port.
"""
-
packet = Ether() / IP() / UDP()
received_packets = self.send_packet_and_capture(packet)
@@ -38,7 +41,8 @@ def test_os_udp(self) -> None:
self.verify_packets(expected_packet, received_packets)
def tear_down_suite(self) -> None:
- """
+ """Tear down the test suite.
+
Teardown:
Remove the SUT port configuration configured in setup.
"""
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index 4a269df75b..36ff10a862 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -1,6 +1,17 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Smoke test suite.
+
+Smoke tests are a class of tests which are used for validating a minimal set of important features.
+These are the most important features without which (or when they're faulty) the software wouldn't
+work properly. Thus, if any failure occurs while testing these features,
+there isn't that much of a reason to continue testing, as the software is fundamentally broken.
+
+These tests don't have to include only DPDK tests, as the reason for failures could be
+in the infrastructure (a faulty link between NICs or a misconfiguration).
+"""
+
import re
from framework.config import PortConfig
@@ -11,13 +22,25 @@
class SmokeTests(TestSuite):
+ """DPDK and infrastructure smoke test suite.
+
+ The test cases validate the most basic DPDK functionality needed for all other test suites.
+ The infrastructure also needs to be tested, as that is also used by all other test suites.
+
+ Attributes:
+ is_blocking: This test suite will block the execution of all other test suites
+ in the build target after it.
+ nics_in_node: The NICs present on the SUT node.
+ """
+
is_blocking = True
# dicts in this list are expected to have two keys:
# "pci_address" and "current_driver"
nics_in_node: list[PortConfig] = []
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Set the build directory path and generate a list of NICs in the SUT node.
"""
@@ -25,7 +48,13 @@ def set_up_suite(self) -> None:
self.nics_in_node = self.sut_node.config.ports
def test_unit_tests(self) -> None:
- """
+ """DPDK meson fast-tests unit tests.
+
+ The DPDK unit tests are basic tests that indicate regressions and other critical failures.
+ These need to be addressed before other testing.
+
+ The fast-tests unit tests are a subset with only the most basic tests.
+
Test:
Run the fast-test unit-test suite through meson.
"""
@@ -37,7 +66,14 @@ def test_unit_tests(self) -> None:
)
def test_driver_tests(self) -> None:
- """
+ """DPDK meson driver-tests unit tests.
+
+ The DPDK unit tests are basic tests that indicate regressions and other critical failures.
+ These need to be addressed before other testing.
+
+ The driver-tests unit tests are a subset that test only drivers. These may be run
+ with virtual devices as well.
+
Test:
Run the driver-test unit-test suite through meson.
"""
@@ -63,7 +99,10 @@ def test_driver_tests(self) -> None:
)
def test_devices_listed_in_testpmd(self) -> None:
- """
+ """Testpmd device discovery.
+
+ If the configured devices can't be found in testpmd, they can't be tested.
+
Test:
Uses testpmd driver to verify that devices have been found by testpmd.
"""
@@ -79,7 +118,11 @@ def test_devices_listed_in_testpmd(self) -> None:
)
def test_device_bound_to_driver(self) -> None:
- """
+ """Device driver in OS.
+
+ The devices must be bound to the proper driver, otherwise they can't be used by DPDK
+ or the traffic generators.
+
Test:
Ensure that all drivers listed in the config are bound to the correct
driver.
--
2.34.1
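For reference, the Google docstring format these patches adopt looks like this on a standalone function: a one-line summary, a blank line, then the longer description and the `Args:`/`Returns:` sections. `filter_lcores` is an invented example, not a DTS helper:

```python
def filter_lcores(lcores: list, count: int) -> list:
    """Return the first `count` logical core IDs.

    The one-line summary above is followed by a blank line, then this
    longer description, then the named sections checked by pydocstyle.

    Args:
        lcores: The logical core IDs to filter.
        count: How many IDs to keep.

    Returns:
        The first `count` IDs from `lcores`.
    """
    return lcores[:count]
```

Sphinx's napoleon extension renders these sections into the field lists seen in the generated API docs.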
* [PATCH v5 22/23] dts: add doc generation dependencies
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (20 preceding siblings ...)
2023-11-06 17:15 ` [PATCH v5 21/23] dts: test suites " Juraj Linkeš
@ 2023-11-06 17:16 ` Juraj Linkeš
2023-11-06 17:16 ` [PATCH v5 23/23] dts: add doc generation Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:16 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Sphinx imports every Python module when generating documentation from
docstrings, meaning all dts dependencies, including Python version,
must be satisfied.
By adding Sphinx to dts dependencies we make sure that the proper
Python version and dependencies are used when Sphinx is executed.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 499 ++++++++++++++++++++++++++++++++++++++++++++-
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 1 deletion(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index a734fa71f0..dea98f6913 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -404,6 +651,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -515,6 +773,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -632,6 +904,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -753,6 +1046,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -775,6 +1084,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "6.2.1"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
+ {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.2"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+ {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
+ {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -819,6 +1299,23 @@ files = [
{file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -837,4 +1334,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
+content-hash = "5faad2e53833e9b8a353ad3554c58de991801a9ebe8f9712fc9c839b35e7a789"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 3943c87c87..98df431b3b 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<7"
+sphinx-rtd-theme = "^1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 23/23] dts: add doc generation
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (21 preceding siblings ...)
2023-11-06 17:16 ` [PATCH v5 22/23] dts: add doc generation dependencies Juraj Linkeš
@ 2023-11-06 17:16 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:16 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
The tool used to generate developer docs is Sphinx, which is already
used in DPDK. The same configuration is used to preserve style, but it's
been augmented with doc-generating configuration and a change to how the
sidebar displays the content hierarchy.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* Also the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
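For reference, a docstring in the Google style linked above might look like the following. This is a hypothetical illustration only; the function and its names are not taken from the patch:

```python
def filter_lcores(lcores: list[int], max_lcore: int) -> list[int]:
    """Filter out logical cores above a cutoff.

    Args:
        lcores: The logical core IDs to filter.
        max_lcore: The highest logical core ID to keep.

    Returns:
        The logical core IDs no greater than max_lcore.
    """
    return [lcore for lcore in lcores if lcore <= max_lcore]
```

The sphinx.ext.napoleon extension parses the Args and Returns sections into the same structured output Sphinx produces for reStructuredText field lists.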
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 29 ++++++++++++-------
doc/api/meson.build | 1 +
doc/guides/conf.py | 34 +++++++++++++++++++----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 32 ++++++++++++++++++++-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/index.rst | 17 ++++++++++++
dts/doc/meson.build | 49 +++++++++++++++++++++++++++++++++
dts/meson.build | 16 +++++++++++
meson.build | 1 +
10 files changed, 165 insertions(+), 16 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..c2f3acfb1d 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,46 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run, PIPE, STDOUT
from packaging.version import Version
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default='.')
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+os.environ['DTS_ROOT'] = args.dts_root
# for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
if Version(ver) >= Version('1.7'):
sphinx_cmd += ['-j', 'auto']
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..92fe10d9e7 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..169b1d24bc 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,31 @@
file=stderr)
pass
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = False
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+# Sidebar config
+html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+}
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +59,8 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+path.append(environ.get('DTS_ROOT'))
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index b1e99107c3..2b96bb11f1 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -283,7 +283,10 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
warnings when some of the basics are not met.
-The code must be properly documented with docstrings. The style must conform to
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the `API docs build steps <building_api_docs>`_.
+
+Speaking of which, the code must be properly documented with docstrings. The style must conform to
the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style
`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
@@ -408,3 +411,30 @@ There are three tools used in DTS to help with code checking, style and formatti
These three tools are all used in ``devtools/dts-check-format.sh``,
the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..f5dcd553f2
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,17 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's documentation!
+===========================================
+
+.. toctree::
+ :titlesonly:
+ :caption: Contents:
+
+ framework
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..d2e8c19789
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,49 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build')
+sphinx_apidoc = find_program('sphinx-apidoc')
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_api_framework_dir = join_paths(dts_dir, 'framework')
+dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+dts_api_src = custom_target('dts_api_src',
+ output: 'modules.rst',
+ command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
+ sphinx_apidoc, '--append-syspath', '--force',
+ '--module-first', '--separate', '-V', meson.project_version(),
+ '--output-dir', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
+ dts_api_framework_dir],
+ build_by_default: false)
+doc_targets += dts_api_src
+doc_target_names += 'DTS_API_sphinx_sources'
+
+cp = find_program('cp')
+cp_index = custom_target('cp_index',
+ input: ['index.rst', 'conf_yaml_schema.json'],
+ output: 'index.rst',
+ depends: dts_api_src,
+ command: [cp, '--dereference', '@INPUT@', dts_api_build_dir],
+ build_by_default: false)
+doc_targets += cp_index
+doc_target_names += 'DTS_API_sphinx_index'
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ depends: cp_index,
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ dts_api_build_dir, dts_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: false,
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 2e6e546d20..c391bf8c71 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v6 01/23] dts: code adjustments for doc generation
2023-11-06 17:15 ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
` (22 preceding siblings ...)
2023-11-06 17:16 ` [PATCH v5 23/23] dts: add doc generation Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 02/23] dts: add docstring checker Juraj Linkeš
` (21 more replies)
23 siblings, 22 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
The standard Python tool for generating API documentation, Sphinx,
imports modules one-by-one when generating the documentation. This
requires code changes:
* properly guarding argument parsing in the if __name__ == '__main__'
block,
* the logger used by the DTS runner underwent the same treatment so that
it doesn't create log files outside of a DTS run,
* however, DTS uses the arguments to construct an object holding global
variables. The defaults for the global variables needed to be moved
out of argument parsing,
* importing the remote_session module from framework resulted in
circular imports because one module tried to import another. This is
fixed by reorganizing the code,
* some code reorganization was done because the resulting structure
makes more sense, improving documentation clarity.
There are some other documentation-related changes:
* added missing type annotations so they appear in the generated docs,
* reordered arguments in some methods,
* removed superfluous arguments and attributes,
* changed public functions/methods/attributes to private and vice versa.
All of the above appear in the generated documentation and improve it.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 10 ++-
dts/framework/dts.py | 33 +++++--
dts/framework/exception.py | 54 +++++-------
dts/framework/remote_session/__init__.py | 41 ++++-----
.../interactive_remote_session.py | 0
.../{remote => }/interactive_shell.py | 0
.../{remote => }/python_shell.py | 0
.../remote_session/remote/__init__.py | 27 ------
.../{remote => }/remote_session.py | 0
.../{remote => }/ssh_session.py | 12 +--
.../{remote => }/testpmd_shell.py | 0
dts/framework/settings.py | 87 +++++++++++--------
dts/framework/test_result.py | 4 +-
dts/framework/test_suite.py | 7 +-
dts/framework/testbed_model/__init__.py | 12 +--
dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
dts/framework/testbed_model/hw/__init__.py | 27 ------
.../linux_session.py | 6 +-
dts/framework/testbed_model/node.py | 26 ++++--
.../os_session.py | 22 ++---
dts/framework/testbed_model/{hw => }/port.py | 0
.../posix_session.py | 4 +-
dts/framework/testbed_model/sut_node.py | 8 +-
dts/framework/testbed_model/tg_node.py | 30 +------
.../traffic_generator/__init__.py | 24 +++++
.../capturing_traffic_generator.py | 6 +-
.../{ => traffic_generator}/scapy.py | 23 ++---
.../traffic_generator.py | 16 +++-
.../testbed_model/{hw => }/virtual_device.py | 0
dts/framework/utils.py | 46 +++-------
dts/main.py | 9 +-
31 files changed, 259 insertions(+), 288 deletions(-)
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
delete mode 100644 dts/framework/remote_session/remote/__init__.py
rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
rename dts/framework/testbed_model/{hw => }/port.py (100%)
rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index cb7e00ba34..2044c82611 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -17,6 +17,7 @@
import warlock # type: ignore[import]
import yaml
+from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict):
+ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
# This looks useless now, but is designed to allow expansion to traffic
# generators that require more configuration later.
match TrafficGeneratorType(d["type"]):
@@ -97,6 +98,10 @@ def from_dict(d: dict):
return ScapyTrafficGeneratorConfig(
traffic_generator_type=TrafficGeneratorType.SCAPY
)
+ case _:
+ raise ConfigurationError(
+ f'Unknown traffic generator type "{d["type"]}".'
+ )
@dataclass(slots=True, frozen=True)
@@ -324,6 +329,3 @@ def load_config() -> Configuration:
config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
config_obj: Configuration = Configuration.from_dict(dict(config))
return config_obj
-
-
-CONFIGURATION = load_config()
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index f773f0c38d..4c7fb0c40a 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -6,19 +6,19 @@
import sys
from .config import (
- CONFIGURATION,
BuildTargetConfiguration,
ExecutionConfiguration,
TestSuiteConfig,
+ load_config,
)
from .exception import BlockingTestSuiteError
from .logger import DTSLOG, getLogger
from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
from .test_suite import get_test_suites
from .testbed_model import SutNode, TGNode
-from .utils import check_dts_python_version
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG = None # type: ignore[assignment]
result: DTSResult = DTSResult(dts_logger)
@@ -30,14 +30,18 @@ def run_all() -> None:
global dts_logger
global result
+ # create a regular DTS logger and create a new result with it
+ dts_logger = getLogger("DTSRunner")
+ result = DTSResult(dts_logger)
+
# check the python version of the server that run dts
- check_dts_python_version()
+ _check_dts_python_version()
sut_nodes: dict[str, SutNode] = {}
tg_nodes: dict[str, TGNode] = {}
try:
# for all Execution sections
- for execution in CONFIGURATION.executions:
+ for execution in load_config().executions:
sut_node = sut_nodes.get(execution.system_under_test_node.name)
tg_node = tg_nodes.get(execution.traffic_generator_node.name)
@@ -82,6 +86,25 @@ def run_all() -> None:
_exit_dts()
+def _check_dts_python_version() -> None:
+ def RED(text: str) -> str:
+ return f"\u001B[31;1m{str(text)}\u001B[0m"
+
+ if sys.version_info.major < 3 or (
+ sys.version_info.major == 3 and sys.version_info.minor < 10
+ ):
+ print(
+ RED(
+ (
+ "WARNING: DTS execution node's python version is lower than "
+ "python 3.10, which is deprecated and will not work in future releases."
+ )
+ ),
+ file=sys.stderr,
+ )
+ print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+
+
def _run_execution(
sut_node: SutNode,
tg_node: TGNode,
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 001a5a5496..7489c03570 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
Command execution timeout.
"""
- command: str
- output: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _command: str
- def __init__(self, command: str, output: str):
- self.command = command
- self.output = output
+ def __init__(self, command: str):
+ self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self.command}"
-
- def get_output(self) -> str:
- return self.output
+ return f"TIMEOUT on {self._command}"
class SSHConnectionError(DTSError):
@@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
SSH connection error.
"""
- host: str
- errors: list[str]
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
+ _errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
- self.host = host
- self.errors = [] if errors is None else errors
+ self._host = host
+ self._errors = [] if errors is None else errors
def __str__(self) -> str:
- message = f"Error trying to connect with {self.host}."
- if self.errors:
- message += f" Errors encountered while retrying: {', '.join(self.errors)}"
+ message = f"Error trying to connect with {self._host}."
+ if self._errors:
+ message += f" Errors encountered while retrying: {', '.join(self._errors)}"
return message
@@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
It can no longer be used.
"""
- host: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
def __init__(self, host: str):
- self.host = host
+ self._host = host
def __str__(self) -> str:
- return f"SSH session with {self.host} has died"
+ return f"SSH session with {self._host} has died"
class ConfigurationError(DTSError):
@@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
Raised when a command executed on a Node returns a non-zero exit status.
"""
- command: str
- command_return_code: int
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ command: str
+ _command_return_code: int
def __init__(self, command: str, command_return_code: int):
self.command = command
- self.command_return_code = command_return_code
+ self._command_return_code = command_return_code
def __str__(self) -> str:
return (
f"Command {self.command} returned a non-zero exit code: "
- f"{self.command_return_code}"
+ f"{self._command_return_code}"
)
@@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
Used in test cases to verify the expected behavior.
"""
- value: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
- def __init__(self, value: str):
- self.value = value
-
- def __str__(self) -> str:
- return repr(self.value)
-
class BlockingTestSuiteError(DTSError):
- suite_name: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
+ _suite_name: str
def __init__(self, suite_name: str) -> None:
- self.suite_name = suite_name
+ self._suite_name = suite_name
def __str__(self) -> str:
- return f"Blocking suite {self.suite_name} failed."
+ return f"Blocking suite {self._suite_name} failed."
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 00b6d1f03a..5e7ddb2b05 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -12,29 +12,24 @@
# pylama:ignore=W0611
-from framework.config import OS, NodeConfiguration
-from framework.exception import ConfigurationError
+from framework.config import NodeConfiguration
from framework.logger import DTSLOG
-from .linux_session import LinuxSession
-from .os_session import InteractiveShellType, OSSession
-from .remote import (
- CommandResult,
- InteractiveRemoteSession,
- InteractiveShell,
- PythonShell,
- RemoteSession,
- SSHSession,
- TestPmdDevice,
- TestPmdShell,
-)
-
-
-def create_session(
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
+from .python_shell import PythonShell
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
+
+
+def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> OSSession:
- match node_config.os:
- case OS.linux:
- return LinuxSession(node_config, name, logger)
- case _:
- raise ConfigurationError(f"Unsupported OS {node_config.os}")
+) -> RemoteSession:
+ return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+ node_config: NodeConfiguration, logger: DTSLOG
+) -> InteractiveRemoteSession:
+ return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_remote_session.py
rename to dts/framework/remote_session/interactive_remote_session.py
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_shell.py
rename to dts/framework/remote_session/interactive_shell.py
diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/python_shell.py
rename to dts/framework/remote_session/python_shell.py
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 06403691a5..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-# Copyright(c) 2023 University of New Hampshire
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .interactive_remote_session import InteractiveRemoteSession
-from .interactive_shell import InteractiveShell
-from .python_shell import PythonShell
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-from .testpmd_shell import TestPmdDevice, TestPmdShell
-
-
-def create_remote_session(
- node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
- return SSHSession(node_config, name, logger)
-
-
-def create_interactive_session(
- node_config: NodeConfiguration, logger: DTSLOG
-) -> InteractiveRemoteSession:
- return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 91%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
index 8d127f1601..cee11d14d6 100644
--- a/dts/framework/remote_session/remote/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -18,9 +18,7 @@
SSHException,
)
-from framework.config import NodeConfiguration
from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
-from framework.logger import DTSLOG
from .remote_session import CommandResult, RemoteSession
@@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
session: Connection
- def __init__(
- self,
- node_config: NodeConfiguration,
- session_name: str,
- logger: DTSLOG,
- ):
- super(SSHSession, self).__init__(node_config, session_name, logger)
-
def _connect(self) -> None:
errors = []
retry_attempts = 10
@@ -117,7 +107,7 @@ def _send_command(
except CommandTimedOut as e:
self._logger.exception(e)
- raise SSHTimeoutError(command, e.result.stderr) from e
+ raise SSHTimeoutError(command) from e
return CommandResult(
self.name, command, output.stdout, output.stderr, output.return_code
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/testpmd_shell.py
rename to dts/framework/remote_session/testpmd_shell.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index cfa39d011b..7f5841d073 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, TypeVar
@@ -22,8 +22,8 @@ def __init__(
option_strings: Sequence[str],
dest: str,
nargs: str | int | None = None,
- const: str | None = None,
- default: str = None,
+ const: bool | None = None,
+ default: Any = None,
type: Callable[[str], _T | argparse.FileType | None] = None,
choices: Iterable[_T] | None = None,
required: bool = False,
@@ -32,6 +32,12 @@ def __init__(
) -> None:
env_var_value = os.environ.get(env_var)
default = env_var_value or default
+ if const is not None:
+ nargs = 0
+ default = const if env_var_value else default
+ type = None
+ choices = None
+ metavar = None
super(_EnvironmentArgument, self).__init__(
option_strings,
dest,
@@ -52,22 +58,28 @@ def __call__(
values: Any,
option_string: str = None,
) -> None:
- setattr(namespace, self.dest, values)
+ if self.const is not None:
+ setattr(namespace, self.dest, self.const)
+ else:
+ setattr(namespace, self.dest, values)
return _EnvironmentArgument
-@dataclass(slots=True, frozen=True)
-class _Settings:
- config_file_path: str
- output_dir: str
- timeout: float
- verbose: bool
- skip_setup: bool
- dpdk_tarball_path: Path
- compile_timeout: float
- test_cases: list
- re_run: int
+@dataclass(slots=True)
+class Settings:
+ config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ output_dir: str = "output"
+ timeout: float = 15
+ verbose: bool = False
+ skip_setup: bool = False
+ dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ compile_timeout: float = 1200
+ test_cases: list[str] = field(default_factory=list)
+ re_run: int = 0
+
+
+SETTINGS: Settings = Settings()
def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--config-file",
action=_env_arg("DTS_CFG_FILE"),
- default="conf.yaml",
+ default=SETTINGS.config_file_path,
+ type=Path,
help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
"and targets.",
)
@@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--output-dir",
"--output",
action=_env_arg("DTS_OUTPUT_DIR"),
- default="output",
+ default=SETTINGS.output_dir,
help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
)
@@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
"-t",
"--timeout",
action=_env_arg("DTS_TIMEOUT"),
- default=15,
+ default=SETTINGS.timeout,
type=float,
help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
"compiling DPDK.",
@@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
"-v",
"--verbose",
action=_env_arg("DTS_VERBOSE"),
- default="N",
- help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
+ default=SETTINGS.verbose,
+ const=True,
+ help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
"to the console.",
)
@@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
"-s",
"--skip-setup",
action=_env_arg("DTS_SKIP_SETUP"),
- default="N",
- help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
+ const=True,
+ help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
)
parser.add_argument(
@@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--snapshot",
"--git-ref",
action=_env_arg("DTS_DPDK_TARBALL"),
- default="dpdk.tar.xz",
+ default=SETTINGS.dpdk_tarball_path,
type=Path,
help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
"tag ID or tree ID to test. To test local changes, first commit them, "
@@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--compile-timeout",
action=_env_arg("DTS_COMPILE_TIMEOUT"),
- default=1200,
+ default=SETTINGS.compile_timeout,
type=float,
help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
)
@@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--re-run",
"--re_run",
action=_env_arg("DTS_RERUN"),
- default=0,
+ default=SETTINGS.re_run,
type=int,
help="[DTS_RERUN] Re-run each test case the specified amount of times "
"if a test failure occurs",
@@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
return parser
-def _get_settings() -> _Settings:
+def get_settings() -> Settings:
parsed_args = _get_parser().parse_args()
- return _Settings(
+ return Settings(
config_file_path=parsed_args.config_file,
output_dir=parsed_args.output_dir,
timeout=parsed_args.timeout,
- verbose=(parsed_args.verbose == "Y"),
- skip_setup=(parsed_args.skip_setup == "Y"),
+ verbose=parsed_args.verbose,
+ skip_setup=parsed_args.skip_setup,
dpdk_tarball_path=Path(
- DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
- )
- if not os.path.exists(parsed_args.tarball)
- else Path(parsed_args.tarball),
+ Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
+ if not os.path.exists(parsed_args.tarball)
+ else Path(parsed_args.tarball)
+ ),
compile_timeout=parsed_args.compile_timeout,
- test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+ test_cases=(
+ parsed_args.test_cases.split(",") if parsed_args.test_cases else []
+ ),
re_run=parsed_args.re_run,
)
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index f0fbe80f6f..603e18872c 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -254,7 +254,7 @@ def add_build_target(
self._inner_results.append(build_target_result)
return build_target_result
- def add_sut_info(self, sut_info: NodeInfo):
+ def add_sut_info(self, sut_info: NodeInfo) -> None:
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
@@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
self._inner_results.append(execution_result)
return execution_result
- def add_error(self, error) -> None:
+ def add_error(self, error: Exception) -> None:
self._errors.append(error)
def process(self) -> None:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 3b890c0451..d53553bf34 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -11,7 +11,7 @@
import re
from ipaddress import IPv4Interface, IPv6Interface, ip_interface
from types import MethodType
-from typing import Union
+from typing import Any, Union
from scapy.layers.inet import IP # type: ignore[import]
from scapy.layers.l2 import Ether # type: ignore[import]
@@ -26,8 +26,7 @@
from .logger import DTSLOG, getLogger
from .settings import SETTINGS
from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
-from .testbed_model import SutNode, TGNode
-from .testbed_model.hw.port import Port, PortLink
+from .testbed_model import Port, PortLink, SutNode, TGNode
from .utils import get_packet_summaries
@@ -453,7 +452,7 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
- def is_test_suite(object) -> bool:
+ def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
return True
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 5cbb859e47..8ced05653b 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,15 +9,9 @@
# pylama:ignore=W0611
-from .hw import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
- VirtualDevice,
- lcore_filter,
-)
+from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
from .node import Node
+from .port import Port, PortLink
from .sut_node import SutNode
from .tg_node import TGNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/testbed_model/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
)
return filtered_lcores
+
+
+def lcore_filter(
+ core_list: list[LogicalCore],
+ filter_specifier: LogicalCoreCount | LogicalCoreList,
+ ascending: bool,
+) -> LogicalCoreFilter:
+ if isinstance(filter_specifier, LogicalCoreList):
+ return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+ elif isinstance(filter_specifier, LogicalCoreCount):
+ return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+ else:
+ raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
- core_list: list[LogicalCore],
- filter_specifier: LogicalCoreCount | LogicalCoreList,
- ascending: bool,
-) -> LogicalCoreFilter:
- if isinstance(filter_specifier, LogicalCoreList):
- return LogicalCoreListFilter(core_list, filter_specifier, ascending)
- elif isinstance(filter_specifier, LogicalCoreCount):
- return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
- else:
- raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
similarity index 97%
rename from dts/framework/remote_session/linux_session.py
rename to dts/framework/testbed_model/linux_session.py
index a3f1a6bf3b..f472bb8f0f 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -9,10 +9,10 @@
from typing_extensions import NotRequired
from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
from framework.utils import expand_range
+from .cpu import LogicalCore
+from .port import Port
from .posix_session import PosixSession
@@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
lcores.append(LogicalCore(lcore, core, socket, node))
return lcores
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return dpdk_prefix
def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index fc01e0bf8e..7571e7b98d 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,23 +12,26 @@
from typing import Any, Callable, Type, Union
from framework.config import (
+ OS,
BuildTargetConfiguration,
ExecutionConfiguration,
NodeConfiguration,
)
+from framework.exception import ConfigurationError
from framework.logger import DTSLOG, getLogger
-from framework.remote_session import InteractiveShellType, OSSession, create_session
from framework.settings import SETTINGS
-from .hw import (
+from .cpu import (
LogicalCore,
LogicalCoreCount,
LogicalCoreList,
LogicalCoreListFilter,
- VirtualDevice,
lcore_filter,
)
-from .hw.port import Port
+from .linux_session import LinuxSession
+from .os_session import InteractiveShellType, OSSession
+from .port import Port
+from .virtual_device import VirtualDevice
class Node(ABC):
@@ -69,6 +72,7 @@ def __init__(self, node_config: NodeConfiguration):
def _init_ports(self) -> None:
self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
self.main_session.update_ports(self.ports)
+
for port in self.ports:
self.configure_port_state(port)
@@ -172,9 +176,9 @@ def create_interactive_shell(
return self.main_session.create_interactive_shell(
shell_cls,
- app_args,
timeout,
privileged,
+ app_args,
)
def filter_lcores(
@@ -205,7 +209,7 @@ def _get_remote_cpus(self) -> None:
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
- def _setup_hugepages(self):
+ def _setup_hugepages(self) -> None:
"""
Setup hugepages on the Node. Different architectures can supply different
amounts of memory for hugepages and numa-based hugepage allocation may need
@@ -249,3 +253,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
return lambda *args: None
else:
return func
+
+
+def create_session(
+ node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> OSSession:
+ match node_config.os:
+ case OS.linux:
+ return LinuxSession(node_config, name, logger)
+ case _:
+ raise ConfigurationError(f"Unsupported OS {node_config.os}")
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
similarity index 95%
rename from dts/framework/remote_session/os_session.py
rename to dts/framework/testbed_model/os_session.py
index 8a709eac1c..76e595a518 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -10,19 +10,19 @@
from framework.config import Architecture, NodeConfiguration, NodeInfo
from framework.logger import DTSLOG
-from framework.remote_session.remote import InteractiveShell
-from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
-from framework.utils import MesonArgs
-
-from .remote import (
+from framework.remote_session import (
CommandResult,
InteractiveRemoteSession,
+ InteractiveShell,
RemoteSession,
create_interactive_session,
create_remote_session,
)
+from framework.settings import SETTINGS
+from framework.utils import MesonArgs
+
+from .cpu import LogicalCore
+from .port import Port
InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
@@ -85,9 +85,9 @@ def send_command(
def create_interactive_shell(
self,
shell_cls: Type[InteractiveShellType],
- eal_parameters: str,
timeout: float,
privileged: bool,
+ app_args: str,
) -> InteractiveShellType:
"""
See "create_interactive_shell" in SutNode
@@ -96,7 +96,7 @@ def create_interactive_shell(
self.interactive_session.session,
self._logger,
self._get_privileged_command if privileged else None,
- eal_parameters,
+ app_args,
timeout,
)
@@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
"""
@abstractmethod
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
"""
Try to find DPDK remote dir in remote_dir.
"""
@@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
"""
@abstractmethod
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
"""
Get the DPDK file prefix that will be used when running DPDK apps.
"""
diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
similarity index 100%
rename from dts/framework/testbed_model/hw/port.py
rename to dts/framework/testbed_model/port.py
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
similarity index 98%
rename from dts/framework/remote_session/posix_session.py
rename to dts/framework/testbed_model/posix_session.py
index 5da0516e05..1d1d5b1b26 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
@@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
for dpdk_runtime_dir in dpdk_runtime_dirs:
self.remove_remote_dir(dpdk_runtime_dir)
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return ""
def get_compiler_version(self, compiler_name: str) -> str:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 202aebfd06..4e33cf02ea 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,12 +15,14 @@
NodeInfo,
SutNodeConfiguration,
)
-from framework.remote_session import CommandResult, InteractiveShellType, OSSession
+from framework.remote_session import CommandResult
from framework.settings import SETTINGS
from framework.utils import MesonArgs
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
+from .cpu import LogicalCoreCount, LogicalCoreList
from .node import Node
+from .os_session import InteractiveShellType, OSSession
+from .virtual_device import VirtualDevice
class EalParameters(object):
@@ -289,7 +291,7 @@ def create_eal_parameters(
prefix: str = "dpdk",
append_prefix_timestamp: bool = True,
no_pci: bool = False,
- vdevs: list[VirtualDevice] = None,
+ vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
"""
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 27025cfa31..166eb8430e 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -16,16 +16,11 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.config import (
- ScapyTrafficGeneratorConfig,
- TGNodeConfiguration,
- TrafficGeneratorType,
-)
-from framework.exception import ConfigurationError
-
-from .capturing_traffic_generator import CapturingTrafficGenerator
-from .hw.port import Port
+from framework.config import TGNodeConfiguration
+
from .node import Node
+from .port import Port
+from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
class TGNode(Node):
@@ -80,20 +75,3 @@ def close(self) -> None:
"""Free all resources used by the node"""
self.traffic_generator.close()
super(TGNode, self).close()
-
-
-def create_traffic_generator(
- tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
-) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
-
- from .scapy import ScapyTrafficGenerator
-
- match traffic_generator_config.traffic_generator_type:
- case TrafficGeneratorType.SCAPY:
- return ScapyTrafficGenerator(tg_node, traffic_generator_config)
- case _:
- raise ConfigurationError(
- "Unknown traffic generator: "
- f"{traffic_generator_config.traffic_generator_type}"
- )
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
new file mode 100644
index 0000000000..11bfa1ee0f
--- /dev/null
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -0,0 +1,24 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
+from framework.exception import ConfigurationError
+from framework.testbed_model.node import Node
+
+from .capturing_traffic_generator import CapturingTrafficGenerator
+from .scapy import ScapyTrafficGenerator
+
+
+def create_traffic_generator(
+ tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
+) -> CapturingTrafficGenerator:
+ """A factory function for creating traffic generator object from user config."""
+
+ match traffic_generator_config.traffic_generator_type:
+ case TrafficGeneratorType.SCAPY:
+ return ScapyTrafficGenerator(tg_node, traffic_generator_config)
+ case _:
+ raise ConfigurationError(
+ "Unknown traffic generator: "
+ f"{traffic_generator_config.traffic_generator_type}"
+ )
diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
similarity index 96%
rename from dts/framework/testbed_model/capturing_traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index ab98987f8e..e521211ef0 100644
--- a/dts/framework/testbed_model/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -16,9 +16,9 @@
from scapy.packet import Packet # type: ignore[import]
from framework.settings import SETTINGS
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
from .traffic_generator import TrafficGenerator
@@ -130,7 +130,9 @@ def _send_packets_and_capture(
for the specified duration. It must be able to handle no received packets.
"""
- def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
+ def _write_capture_from_packets(
+ self, capture_name: str, packets: list[Packet]
+ ) -> None:
file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
self._logger.debug(f"Writing packets to {file_name}.")
scapy.utils.wrpcap(file_name, packets)
diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
similarity index 95%
rename from dts/framework/testbed_model/scapy.py
rename to dts/framework/testbed_model/traffic_generator/scapy.py
index af0d4dbb25..51864b6e6b 100644
--- a/dts/framework/testbed_model/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -24,16 +24,15 @@
from scapy.packet import Packet # type: ignore[import]
from framework.config import OS, ScapyTrafficGeneratorConfig
-from framework.logger import DTSLOG, getLogger
from framework.remote_session import PythonShell
from framework.settings import SETTINGS
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from .capturing_traffic_generator import (
CapturingTrafficGenerator,
_get_default_capture_name,
)
-from .hw.port import Port
-from .tg_node import TGNode
"""
========= BEGIN RPC FUNCTIONS =========
@@ -146,7 +145,7 @@ def quit(self) -> None:
self._BaseServer__shutdown_request = True
return None
- def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
+ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
"""Add a function to the server.
This is meant to be executed remotely.
@@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
session: PythonShell
rpc_server_proxy: xmlrpc.client.ServerProxy
_config: ScapyTrafficGeneratorConfig
- _tg_node: TGNode
- _logger: DTSLOG
-
- def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
- self._config = config
- self._tg_node = tg_node
- self._logger = getLogger(
- f"{self._tg_node.name} {self._config.traffic_generator_type}"
- )
+
+ def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ super().__init__(tg_node, config)
assert (
self._tg_node.config.os == OS.linux
@@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
function_bytes = marshal.dumps(function.__code__)
self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
- def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
+ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
# load the source of the function
src = inspect.getsource(QuittableXMLRPCServer)
# Lines with only whitespace break the repl if in the middle of a function
@@ -280,7 +273,7 @@ def _send_packets_and_capture(
scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
return scapy_packets
- def close(self):
+ def close(self) -> None:
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
similarity index 80%
rename from dts/framework/testbed_model/traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 28c35d3ce4..ea7c3963da 100644
--- a/dts/framework/testbed_model/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -12,11 +12,12 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.logger import DTSLOG
+from framework.config import TrafficGeneratorConfig
+from framework.logger import DTSLOG, getLogger
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
-
class TrafficGenerator(ABC):
"""The base traffic generator.
@@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
Defines the few basic methods that each traffic generator must implement.
"""
+ _config: TrafficGeneratorConfig
+ _tg_node: Node
_logger: DTSLOG
+ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ self._config = config
+ self._tg_node = tg_node
+ self._logger = getLogger(
+ f"{self._tg_node.name} {self._config.traffic_generator_type}"
+ )
+
def send_packet(self, packet: Packet, port: Port) -> None:
"""Send a packet and block until it is fully sent.
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index d27c2c5b5f..f0c916471c 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -7,7 +7,6 @@
import json
import os
import subprocess
-import sys
from enum import Enum
from pathlib import Path
from subprocess import SubprocessError
@@ -16,35 +15,7 @@
from .exception import ConfigurationError
-
-class StrEnum(Enum):
- @staticmethod
- def _generate_next_value_(
- name: str, start: int, count: int, last_values: object
- ) -> str:
- return name
-
- def __str__(self) -> str:
- return self.name
-
-
-REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
-
-
-def check_dts_python_version() -> None:
- if sys.version_info.major < 3 or (
- sys.version_info.major == 3 and sys.version_info.minor < 10
- ):
- print(
- RED(
- (
- "WARNING: DTS execution node's python version is lower than"
- "python 3.10, is deprecated and will not work in future releases."
- )
- ),
- file=sys.stderr,
- )
- print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
def expand_range(range_str: str) -> list[int]:
@@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
return expanded_range
-def get_packet_summaries(packets: list[Packet]):
+def get_packet_summaries(packets: list[Packet]) -> str:
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
return f"Packet contents: \n{packet_summaries}"
-def RED(text: str) -> str:
- return f"\u001B[31;1m{str(text)}\u001B[0m"
+class StrEnum(Enum):
+ @staticmethod
+ def _generate_next_value_(
+ name: str, start: int, count: int, last_values: object
+ ) -> str:
+ return name
+
+ def __str__(self) -> str:
+ return self.name
class MesonArgs(object):
@@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
if self._tarball_path and os.path.exists(self._tarball_path):
os.remove(self._tarball_path)
- def __fspath__(self):
+ def __fspath__(self) -> str:
return str(self._tarball_path)
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..5d4714b0c3 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,17 @@
import logging
-from framework import dts
+from framework import settings
def main() -> None:
+ """Set DTS settings, then run DTS.
+
+ The DTS settings are taken from the command line arguments and the environment variables.
+ """
+ settings.SETTINGS = settings.get_settings()
+ from framework import dts
+
dts.run_all()
--
2.34.1
* [PATCH v6 02/23] dts: add docstring checker
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 03/23] dts: add basic developer docs Juraj Linkeš
` (20 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Python docstrings are the in-code way to document the code. The
docstring checker of choice is pydocstyle, which we're executing from
Pylama, but the current latest versions are not compatible due to [0],
so pin the pydocstyle version to the latest working version.
[0] https://github.com/klen/pylama/issues/232
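For context, the end state configured by this patch (taken from the diff below) registers pydocstyle as a pylama linter with the Google convention and pins its version; the relevant pyproject.toml fragments amount to:

```toml
[tool.poetry.dependencies]
# Pinned: newer pydocstyle releases are incompatible with pylama, see [0]
pydocstyle = "6.1.1"

[tool.pylama]
linters = "mccabe,pycodestyle,pydocstyle,pyflakes"

[tool.pylama.linter.pydocstyle]
convention = "google"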
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 12 ++++++------
dts/pyproject.toml | 6 +++++-
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index f7b3b6d602..a734fa71f0 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -489,20 +489,20 @@ files = [
[[package]]
name = "pydocstyle"
-version = "6.3.0"
+version = "6.1.1"
description = "Python docstring style checker"
optional = false
python-versions = ">=3.6"
files = [
- {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
- {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+ {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
+ {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
]
[package.dependencies]
-snowballstemmer = ">=2.2.0"
+snowballstemmer = "*"
[package.extras]
-toml = ["tomli (>=1.2.3)"]
+toml = ["toml"]
[[package]]
name = "pyflakes"
@@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
+content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6762edfa6b..3943c87c87 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -25,6 +25,7 @@ PyYAML = "^6.0"
types-PyYAML = "^6.0.8"
fabric = "^2.7.1"
scapy = "^2.5.0"
+pydocstyle = "6.1.1"
[tool.poetry.group.dev.dependencies]
mypy = "^0.961"
@@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
[tool.pylama]
-linters = "mccabe,pycodestyle,pyflakes"
+linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
format = "pylint"
max_line_length = 88 # https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html#line-length
+[tool.pylama.linter.pydocstyle]
+convention = "google"
+
[tool.mypy]
python_version = "3.10"
enable_error_code = ["ignore-without-code"]
--
2.34.1
* [PATCH v6 03/23] dts: add basic developer docs
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 02/23] dts: add docstring checker Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 04/23] dts: exceptions docstring update Juraj Linkeš
` (19 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Expand the framework contribution guidelines and add how to document the
code with Python docstrings.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
1 file changed, 73 insertions(+)
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 32c18ee472..cd771a428c 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
The results contain basic statistics of passed/failed test cases and DPDK version.
+Contributing to DTS
+-------------------
+
+There are two areas of contribution: The DTS framework and DTS test suites.
+
+The framework contains the logic needed to run test cases, such as connecting to nodes,
+running DPDK apps and collecting results.
+
+The test cases call APIs from the framework to test their scenarios. Adding test cases may
+require adding code to the framework as well.
+
+
+Framework Coding Guidelines
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When adding code to the DTS framework, pay attention to the rest of the code
+and try not to diverge much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
+warnings when some of the basics are not met.
+
+The code must be properly documented with docstrings. The style must conform to
+the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
+See an example of the style
+`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
+For cases which are not covered by the Google style, refer
+to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
+the two style guides, where we deviate or where some additional clarification is helpful:
+
+ * The __init__() methods of classes are documented separately from the docstring of the class
+ itself.
+ * The docstrings of implemented abstract methods should refer to the superclass's definition
+ if there's no deviation.
+ * Instance variables/attributes should be documented in the docstring of the class
+ in the ``Attributes:`` section.
+ * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
+ attributes which result in instance variables/attributes should also be recorded
+ in the ``Attributes:`` section.
+ * Class variables/attributes, on the other hand, should be documented with ``#:`` above
+ the type annotated line. The description may be omitted if the meaning is obvious.
+ * The Enum and TypedDict also process the attributes in particular ways and should be documented
+ with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
+ * When referencing a parameter of a function or a method in their docstring, don't use
+ any articles and put the parameter into single backticks. This mimics the style of
+ `Python's documentation <https://docs.python.org/3/index.html>`_.
+ * When specifying a value, use double backticks::
+
+ def foo(greet: bool) -> None:
+ """Demonstration of single and double backticks.
+
+ `greet` controls whether ``Hello World`` is printed.
+
+ Args:
+ greet: Whether to print the ``Hello World`` message.
+ """
+ if greet:
+ print(f"Hello World")
+
+ * The docstring maximum line length is the same as the code maximum line length.
+
+
How To Write a Test Suite
-------------------------
@@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
| These methods don't need to be implemented if there's no need for them in a test suite.
In that case, nothing will happen when they're executed.
+#. **Configuration, traffic and other logic**
+
+ The ``TestSuite`` class contains a variety of methods for anything that
+ a test suite setup, a teardown, or a test case may need to do.
+
+ The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
+ and use the interactive shell instances directly.
+
+ These are the two main ways to call the framework logic in test suites. If there's any
+ functionality or logic missing from the framework, it should be implemented so that
+ the test suites can use one of these two ways.
+
#. **Test case verification**
Test case verification should be done with the ``verify`` method, which records the result.
@@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
and used by the test suite via the ``sut_node`` field.
+.. _dts_dev_tools:
+
DTS Developer Tools
-------------------
--
2.34.1
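The conventions listed in the guidelines above (``Attributes:`` sections, ``#:`` markers for class and Enum attributes, single vs. double backticks) can be shown together in one small sketch; the class and function names here are illustrative and not part of the DTS framework:

```python
from dataclasses import dataclass
from enum import IntEnum


class Outcome(IntEnum):
    """Illustrative result of a check; ``#:`` lines let autodoc show the assigned values."""

    #:
    PASSED = 0
    #:
    FAILED = 1


@dataclass
class CheckResult:
    """Hold one illustrative check result.

    Attributes:
        name: The name of the check.
        outcome: How the check ended.
    """

    #:
    name: str
    #:
    outcome: Outcome = Outcome.PASSED


def verify(name: str, condition: bool) -> CheckResult:
    """Record the result of `condition`.

    Args:
        name: The name of the check.
        condition: When ``True``, the result is ``Outcome.PASSED``.

    Returns:
        The recorded result.
    """
    return CheckResult(name, Outcome.PASSED if condition else Outcome.FAILED)
```

Note how `condition` (a parameter) uses single backticks while the value ``Outcome.PASSED`` uses double backticks, per the guidelines.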
* [PATCH v6 04/23] dts: exceptions docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 02/23] dts: add docstring checker Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 03/23] dts: add basic developer docs Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 05/23] dts: settings " Juraj Linkeš
` (18 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/__init__.py | 12 ++++-
dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
2 files changed, 83 insertions(+), 35 deletions(-)
diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
index d551ad4bf0..662e6ccad2 100644
--- a/dts/framework/__init__.py
+++ b/dts/framework/__init__.py
@@ -1,3 +1,13 @@
# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+
+"""Libraries and utilities for running DPDK Test Suite (DTS).
+
+The various modules in the DTS framework offer:
+
+* Connections to nodes, both interactive and non-interactive,
+* A straightforward way to add support for different operating systems of remote nodes,
+* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
+* Pre-test suite setup and post-test suite teardown.
+"""
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 7489c03570..ee1562c672 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -3,8 +3,10 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-User-defined exceptions used across the framework.
+"""DTS exceptions.
+
+The exceptions all have different severities expressed as an integer.
+The highest severity of all raised exceptions is used as the exit code of DTS.
"""
from enum import IntEnum, unique
@@ -13,59 +15,79 @@
@unique
class ErrorSeverity(IntEnum):
- """
- The severity of errors that occur during DTS execution.
+ """The severity of errors that occur during DTS execution.
+
All exceptions are caught and the most severe error is used as return code.
"""
+ #:
NO_ERR = 0
+ #:
GENERIC_ERR = 1
+ #:
CONFIG_ERR = 2
+ #:
REMOTE_CMD_EXEC_ERR = 3
+ #:
SSH_ERR = 4
+ #:
DPDK_BUILD_ERR = 10
+ #:
TESTCASE_VERIFY_ERR = 20
+ #:
BLOCKING_TESTSUITE_ERR = 25
class DTSError(Exception):
- """
- The base exception from which all DTS exceptions are derived.
- Stores error severity.
+ """The base exception from which all DTS exceptions are subclassed.
+
+ Do not use this exception, only use subclassed exceptions.
"""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
class SSHTimeoutError(DTSError):
- """
- Command execution timeout.
- """
+ """The SSH execution of a command timed out."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_command: str
def __init__(self, command: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ command: The executed command.
+ """
self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self._command}"
+ """Add some context to the string representation."""
+ return f"{self._command} execution timed out."
class SSHConnectionError(DTSError):
- """
- SSH connection error.
- """
+ """An unsuccessful SSH connection."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
_errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ host: The hostname to which we're trying to connect.
+ errors: Any errors that occurred during the connection attempt.
+ """
self._host = host
self._errors = [] if errors is None else errors
def __str__(self) -> str:
+ """Include the errors in the string representation."""
message = f"Error trying to connect with {self._host}."
if self._errors:
message += f" Errors encountered while retrying: {', '.join(self._errors)}"
@@ -74,43 +96,53 @@ def __str__(self) -> str:
class SSHSessionDeadError(DTSError):
- """
- SSH session is not alive.
- It can no longer be used.
- """
+ """The SSH session is no longer alive."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
def __init__(self, host: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ host: The hostname of the disconnected node.
+ """
self._host = host
def __str__(self) -> str:
- return f"SSH session with {self._host} has died"
+ """Add some context to the string representation."""
+ return f"SSH session with {self._host} has died."
class ConfigurationError(DTSError):
- """
- Raised when an invalid configuration is encountered.
- """
+ """An invalid configuration."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
class RemoteCommandExecutionError(DTSError):
- """
- Raised when a command executed on a Node returns a non-zero exit status.
- """
+ """An unsuccessful execution of a remote command."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ #: The executed command.
command: str
_command_return_code: int
def __init__(self, command: str, command_return_code: int):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ command: The executed command.
+ command_return_code: The return code of the executed command.
+ """
self.command = command
self._command_return_code = command_return_code
def __str__(self) -> str:
+ """Include both the command and return code in the string representation."""
return (
f"Command {self.command} returned a non-zero exit code: "
f"{self._command_return_code}"
@@ -118,35 +150,41 @@ def __str__(self) -> str:
class RemoteDirectoryExistsError(DTSError):
- """
- Raised when a remote directory to be created already exists.
- """
+ """A directory that exists on a remote node."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
class DPDKBuildError(DTSError):
- """
- Raised when DPDK build fails for any reason.
- """
+ """A DPDK build failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
class TestCaseVerifyError(DTSError):
- """
- Used in test cases to verify the expected behavior.
- """
+ """A test case failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
class BlockingTestSuiteError(DTSError):
+ """A failure in a blocking test suite."""
+
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
_suite_name: str
def __init__(self, suite_name: str) -> None:
+ """Define the meaning of the first argument.
+
+ Args:
+ suite_name: The blocking test suite.
+ """
self._suite_name = suite_name
def __str__(self) -> str:
+ """Add some context to the string representation."""
return f"Blocking suite {self._suite_name} failed."
--
2.34.1
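The updated module docstring says that the highest severity among raised exceptions becomes the DTS exit code. A minimal sketch of that mechanism, reusing the class shapes from the patch (the ``exit_code`` helper is illustrative, not the actual runner code):

```python
from enum import IntEnum, unique
from typing import ClassVar


@unique
class ErrorSeverity(IntEnum):
    #:
    NO_ERR = 0
    #:
    GENERIC_ERR = 1
    #:
    SSH_ERR = 4


class DTSError(Exception):
    #:
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR


class SSHTimeoutError(DTSError):
    #:
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR


def exit_code(caught: list[DTSError]) -> int:
    # The most severe caught error wins; an empty list means a clean run.
    return max((e.severity for e in caught), default=ErrorSeverity.NO_ERR)
```

Because ``ErrorSeverity`` is an ``IntEnum``, the result can be returned directly as the process exit status.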
* [PATCH v6 05/23] dts: settings docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (2 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 04/23] dts: exceptions docstring update Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 16:17 ` Yoan Picchi
2023-11-08 12:53 ` [PATCH v6 06/23] dts: logger and " Juraj Linkeš
` (17 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/settings.py | 101 +++++++++++++++++++++++++++++++++++++-
1 file changed, 100 insertions(+), 1 deletion(-)
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 7f5841d073..787db7c198 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -3,6 +3,70 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+"""Environment variables and command line arguments parsing.
+
+This is a simple module utilizing the built-in argparse module to parse command line arguments,
+augment them with values from environment variables and make them available across the framework.
+
+The command line value takes precedence, followed by the environment variable value,
+followed by the default value defined in this module.
+
+The command line arguments along with the supported environment variables are:
+
+.. option:: --config-file
+.. envvar:: DTS_CFG_FILE
+
+ The path to the YAML test run configuration file.
+
+.. option:: --output-dir, --output
+.. envvar:: DTS_OUTPUT_DIR
+
+ The directory where DTS logs and results are saved.
+
+.. option:: --compile-timeout
+.. envvar:: DTS_COMPILE_TIMEOUT
+
+ The timeout for compiling DPDK.
+
+.. option:: -t, --timeout
+.. envvar:: DTS_TIMEOUT
+
+ The timeout for all DTS operations except for compiling DPDK.
+
+.. option:: -v, --verbose
+.. envvar:: DTS_VERBOSE
+
+ Set to any value to enable logging everything to the console.
+
+.. option:: -s, --skip-setup
+.. envvar:: DTS_SKIP_SETUP
+
+ Set to any value to skip building DPDK.
+
+.. option:: --tarball, --snapshot, --git-ref
+.. envvar:: DTS_DPDK_TARBALL
+
+ The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
+
+.. option:: --test-cases
+.. envvar:: DTS_TESTCASES
+
+ A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
+
+.. option:: --re-run, --re_run
+.. envvar:: DTS_RERUN
+
+ Re-run each test case this many times in case of a failure.
+
+Attributes:
+ SETTINGS: The module level variable storing framework-wide DTS settings.
+
+Typical usage example::
+
+ from framework.settings import SETTINGS
+ foo = SETTINGS.foo
+"""
+
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
@@ -16,6 +80,23 @@
def _env_arg(env_var: str) -> Any:
+ """A helper method augmenting the argparse Action with environment variables.
+
+ If the supplied environment variable is defined, then the default value
+ of the argument is modified. This satisfies the priority order of
+ command line argument > environment variable > default value.
+
+ Arguments with no values (flags) should be defined using the const keyword argument
+ (True or False). When the argument is specified, it will be set to const, if not specified,
+ the default will be stored (possibly modified by the corresponding environment variable).
+
+ Other arguments work the same as default argparse arguments, that is using
+ the default 'store' action.
+
+ Returns:
+ The modified argparse.Action.
+ """
+
class _EnvironmentArgument(argparse.Action):
def __init__(
self,
@@ -68,14 +149,28 @@ def __call__(
@dataclass(slots=True)
class Settings:
+ """Default framework-wide user settings.
+
+ The defaults may be modified at the start of the run.
+ """
+
+ #:
config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ #:
output_dir: str = "output"
+ #:
timeout: float = 15
+ #:
verbose: bool = False
+ #:
skip_setup: bool = False
+ #:
dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ #:
compile_timeout: float = 1200
+ #:
test_cases: list[str] = field(default_factory=list)
+ #:
re_run: int = 0
@@ -169,7 +264,7 @@ def _get_parser() -> argparse.ArgumentParser:
action=_env_arg("DTS_RERUN"),
default=SETTINGS.re_run,
type=int,
- help="[DTS_RERUN] Re-run each test case the specified amount of times "
+ help="[DTS_RERUN] Re-run each test case the specified number of times "
"if a test failure occurs",
)
@@ -177,6 +272,10 @@ def _get_parser() -> argparse.ArgumentParser:
def get_settings() -> Settings:
+ """Create new settings with inputs from the user.
+
+ The inputs are taken from the command line and from environment variables.
+ """
parsed_args = _get_parser().parse_args()
return Settings(
config_file_path=parsed_args.config_file,
--
2.34.1
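The priority order documented in the settings module (command line value > environment variable > default) can be reproduced with plain argparse by computing the default from the environment. The option and variable names below mirror the patch, but the helper itself is a simplified sketch rather than the ``_env_arg`` Action:

```python
import argparse


def make_parser(env: dict[str, str]) -> argparse.ArgumentParser:
    """Build a parser whose defaults may be overridden by environment variables."""
    parser = argparse.ArgumentParser()
    # The environment variable only replaces the default, so an explicit
    # command line value still takes precedence over it.
    parser.add_argument(
        "-t",
        "--timeout",
        type=float,
        default=float(env.get("DTS_TIMEOUT", 15)),
        help="[DTS_TIMEOUT] The timeout for all DTS operations except for compiling DPDK.",
    )
    return parser
```

Passing the environment in as a dict keeps the sketch testable; the real module reads ``os.environ`` inside its Action subclass.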
* Re: [PATCH v6 05/23] dts: settings docstring update
2023-11-08 12:53 ` [PATCH v6 05/23] dts: settings " Juraj Linkeš
@ 2023-11-08 16:17 ` Yoan Picchi
2023-11-15 10:09 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-08 16:17 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek
Cc: dev
On 11/8/23 12:53, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/settings.py | 101 +++++++++++++++++++++++++++++++++++++-
> 1 file changed, 100 insertions(+), 1 deletion(-)
>
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index 7f5841d073..787db7c198 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -3,6 +3,70 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022 University of New Hampshire
>
> +"""Environment variables and command line arguments parsing.
> +
> +This is a simple module utilizing the built-in argparse module to parse command line arguments,
> +augment them with values from environment variables and make them available across the framework.
> +
> +The command line value takes precedence, followed by the environment variable value,
> +followed by the default value defined in this module.
> +
> +The command line arguments along with the supported environment variables are:
> +
> +.. option:: --config-file
> +.. envvar:: DTS_CFG_FILE
> +
> + The path to the YAML test run configuration file.
> +
> +.. option:: --output-dir, --output
> +.. envvar:: DTS_OUTPUT_DIR
> +
> + The directory where DTS logs and results are saved.
> +
> +.. option:: --compile-timeout
> +.. envvar:: DTS_COMPILE_TIMEOUT
> +
> + The timeout for compiling DPDK.
> +
> +.. option:: -t, --timeout
> +.. envvar:: DTS_TIMEOUT
> +
> + The timeout for all DTS operations except for compiling DPDK.
> +
> +.. option:: -v, --verbose
> +.. envvar:: DTS_VERBOSE
> +
> + Set to any value to enable logging everything to the console.
> +
> +.. option:: -s, --skip-setup
> +.. envvar:: DTS_SKIP_SETUP
> +
> + Set to any value to skip building DPDK.
> +
> +.. option:: --tarball, --snapshot, --git-ref
> +.. envvar:: DTS_DPDK_TARBALL
> +
> + The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
> +
> +.. option:: --test-cases
> +.. envvar:: DTS_TESTCASES
> +
> + A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
> +
> +.. option:: --re-run, --re_run
> +.. envvar:: DTS_RERUN
> +
> + Re-run each test case this many times in case of a failure.
> +
> +Attributes:
> + SETTINGS: The module level variable storing framework-wide DTS settings.
In the generated doc, "Attributes" doesn't appear. It ends up looking
like SETTINGS is just another environment variable, with no separation
with the above list.
> +
> +Typical usage example::
> +
> + from framework.settings import SETTINGS
> + foo = SETTINGS.foo
> +"""
> +
> import argparse
> import os
> from collections.abc import Callable, Iterable, Sequence
> @@ -16,6 +80,23 @@
>
>
> def _env_arg(env_var: str) -> Any:
> + """A helper method augmenting the argparse Action with environment variable > +
> + If the supplied environment variable is defined, then the default value
> + of the argument is modified. This satisfies the priority order of
> + command line argument > environment variable > default value.
> +
> + Arguments with no values (flags) should be defined using the const keyword argument
> + (True or False). When the argument is specified, it will be set to const, if not specified,
> + the default will be stored (possibly modified by the corresponding environment variable).
> +
> + Other arguments work the same as default argparse arguments, that is using
> + the default 'store' action.
> +
> + Returns:
> + The modified argparse.Action.
> + """
> +
> class _EnvironmentArgument(argparse.Action):
> def __init__(
> self,
> @@ -68,14 +149,28 @@ def __call__(
>
> @dataclass(slots=True)
> class Settings:
> + """Default framework-wide user settings.
> +
> + The defaults may be modified at the start of the run.
> + """
> +
> + #:
> config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
> + #:
> output_dir: str = "output"
> + #:
> timeout: float = 15
> + #:
> verbose: bool = False
> + #:
> skip_setup: bool = False
> + #:
> dpdk_tarball_path: Path | str = "dpdk.tar.xz"
> + #:
> compile_timeout: float = 1200
> + #:
> test_cases: list[str] = field(default_factory=list)
> + #:
> re_run: int = 0
For some reason in the doc, __init__ also appears :
__init__(config_file_path: ~pathlib.Path = PosixPath('/ho...
>
>
> @@ -169,7 +264,7 @@ def _get_parser() -> argparse.ArgumentParser:
> action=_env_arg("DTS_RERUN"),
> default=SETTINGS.re_run,
> type=int,
> - help="[DTS_RERUN] Re-run each test case the specified amount of times "
> + help="[DTS_RERUN] Re-run each test case the specified number of times "
> "if a test failure occurs",
> )
>
> @@ -177,6 +272,10 @@ def _get_parser() -> argparse.ArgumentParser:
>
>
> def get_settings() -> Settings:
> + """Create new settings with inputs from the user.
> +
> + The inputs are taken from the command line and from environment variables.
> + """
> parsed_args = _get_parser().parse_args()
> return Settings(
> config_file_path=parsed_args.config_file,
* Re: [PATCH v6 05/23] dts: settings docstring update
2023-11-08 16:17 ` Yoan Picchi
@ 2023-11-15 10:09 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 10:09 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, dev
On Wed, Nov 8, 2023 at 5:17 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/8/23 12:53, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/settings.py | 101 +++++++++++++++++++++++++++++++++++++-
> > 1 file changed, 100 insertions(+), 1 deletion(-)
> >
> > diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> > index 7f5841d073..787db7c198 100644
> > --- a/dts/framework/settings.py
> > +++ b/dts/framework/settings.py
> > @@ -3,6 +3,70 @@
> > # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022 University of New Hampshire
> >
> > +"""Environment variables and command line arguments parsing.
> > +
> > +This is a simple module utilizing the built-in argparse module to parse command line arguments,
> > +augment them with values from environment variables and make them available across the framework.
> > +
> > +The command line value takes precedence, followed by the environment variable value,
> > +followed by the default value defined in this module.
> > +
> > +The command line arguments along with the supported environment variables are:
> > +
> > +.. option:: --config-file
> > +.. envvar:: DTS_CFG_FILE
> > +
> > + The path to the YAML test run configuration file.
> > +
> > +.. option:: --output-dir, --output
> > +.. envvar:: DTS_OUTPUT_DIR
> > +
> > + The directory where DTS logs and results are saved.
> > +
> > +.. option:: --compile-timeout
> > +.. envvar:: DTS_COMPILE_TIMEOUT
> > +
> > + The timeout for compiling DPDK.
> > +
> > +.. option:: -t, --timeout
> > +.. envvar:: DTS_TIMEOUT
> > +
> > + The timeout for all DTS operations except for compiling DPDK.
> > +
> > +.. option:: -v, --verbose
> > +.. envvar:: DTS_VERBOSE
> > +
> > + Set to any value to enable logging everything to the console.
> > +
> > +.. option:: -s, --skip-setup
> > +.. envvar:: DTS_SKIP_SETUP
> > +
> > + Set to any value to skip building DPDK.
> > +
> > +.. option:: --tarball, --snapshot, --git-ref
> > +.. envvar:: DTS_DPDK_TARBALL
> > +
> > + The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
> > +
> > +.. option:: --test-cases
> > +.. envvar:: DTS_TESTCASES
> > +
> > + A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
> > +
> > +.. option:: --re-run, --re_run
> > +.. envvar:: DTS_RERUN
> > +
> > + Re-run each test case this many times in case of a failure.
> > +
> > +Attributes:
> > + SETTINGS: The module level variable storing framework-wide DTS settings.
>
> In the generated doc, "Attributes" doesn't appear. It ends up looking
> like SETTINGS is just another environment variable, with no separation
> with the above list.
>
Yes, the Attributes: section is just a syntactical way to tell the
parser to render the attributes in a certain way.
We could add some delimiter or an extra paragraph explaining that what
comes next are module attributes. I'll try to add something.
> > +
> > +Typical usage example::
> > +
> > + from framework.settings import SETTINGS
> > + foo = SETTINGS.foo
> > +"""
> > +
> > import argparse
> > import os
> > from collections.abc import Callable, Iterable, Sequence
> > @@ -16,6 +80,23 @@
> >
> >
> > def _env_arg(env_var: str) -> Any:
> > + """A helper method augmenting the argparse Action with environment variable
> > +
> > + If the supplied environment variable is defined, then the default value
> > + of the argument is modified. This satisfies the priority order of
> > + command line argument > environment variable > default value.
> > +
> > + Arguments with no values (flags) should be defined using the const keyword argument
> > + (True or False). When the argument is specified, it will be set to const, if not specified,
> > + the default will be stored (possibly modified by the corresponding environment variable).
> > +
> > + Other arguments work the same as default argparse arguments, that is using
> > + the default 'store' action.
> > +
> > + Returns:
> > + The modified argparse.Action.
> > + """
> > +
> > class _EnvironmentArgument(argparse.Action):
> > def __init__(
> > self,
> > @@ -68,14 +149,28 @@ def __call__(
> >
> > @dataclass(slots=True)
> > class Settings:
> > + """Default framework-wide user settings.
> > +
> > + The defaults may be modified at the start of the run.
> > + """
> > +
> > + #:
> > config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
> > + #:
> > output_dir: str = "output"
> > + #:
> > timeout: float = 15
> > + #:
> > verbose: bool = False
> > + #:
> > skip_setup: bool = False
> > + #:
> > dpdk_tarball_path: Path | str = "dpdk.tar.xz"
> > + #:
> > compile_timeout: float = 1200
> > + #:
> > test_cases: list[str] = field(default_factory=list)
> > + #:
> > re_run: int = 0
>
> For some reason in the doc, __init__ also appears :
> __init__(config_file_path: ~pathlib.Path = PosixPath('/ho...
>
Yes, the @dataclass decorator adds the constructor so it gets
documented. This is useful so that we see the default values.
> >
> >
> > @@ -169,7 +264,7 @@ def _get_parser() -> argparse.ArgumentParser:
> > action=_env_arg("DTS_RERUN"),
> > default=SETTINGS.re_run,
> > type=int,
> > - help="[DTS_RERUN] Re-run each test case the specified amount of times "
> > + help="[DTS_RERUN] Re-run each test case the specified number of times "
> > "if a test failure occurs",
> > )
> >
> > @@ -177,6 +272,10 @@ def _get_parser() -> argparse.ArgumentParser:
> >
> >
> > def get_settings() -> Settings:
> > + """Create new settings with inputs from the user.
> > +
> > + The inputs are taken from the command line and from environment variables.
> > + """
> > parsed_args = _get_parser().parse_args()
> > return Settings(
> > config_file_path=parsed_args.config_file,
>
* [PATCH v6 06/23] dts: logger and settings docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (3 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 05/23] dts: settings " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 17:14 ` Yoan Picchi
2023-11-08 12:53 ` [PATCH v6 07/23] dts: dts runner and main " Juraj Linkeš
` (16 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/logger.py | 72 +++++++++++++++++++++----------
dts/framework/utils.py | 96 ++++++++++++++++++++++++++++++-----------
2 files changed, 121 insertions(+), 47 deletions(-)
diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index bb2991e994..d3eb75a4e4 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -3,9 +3,9 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-DTS logger module with several log level. DTS framework and TestSuite logs
-are saved in different log files.
+"""DTS logger module.
+
+DTS framework and TestSuite logs are saved in different log files.
"""
import logging
@@ -18,19 +18,21 @@
stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
-class LoggerDictType(TypedDict):
- logger: "DTSLOG"
- name: str
- node: str
-
+class DTSLOG(logging.LoggerAdapter):
+ """DTS logger adapter class for framework and testsuites.
-# List for saving all using loggers
-Loggers: list[LoggerDictType] = []
+ The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
+ variable control the verbosity of output. If enabled, all messages will be emitted to the
+ console.
+ The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+ variable modify the directory where the logs will be stored.
-class DTSLOG(logging.LoggerAdapter):
- """
- DTS log class for framework and testsuite.
+ Attributes:
+ node: The additional identifier. Currently unused.
+ sh: The handler which emits logs to console.
+ fh: The handler which emits logs to a file.
+ verbose_fh: Just like fh, but logs with a different, more verbose format.
"""
_logger: logging.Logger
@@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
verbose_fh: logging.FileHandler
def __init__(self, logger: logging.Logger, node: str = "suite"):
+ """Extend the constructor with additional handlers.
+
+ One handler logs to the console, the other one to a file, with either a regular or verbose
+ format.
+
+ Args:
+ logger: The logger from which to create the logger adapter.
+ node: An additional identifier. Currently unused.
+ """
self._logger = logger
# 1 means log everything, this will be used by file handlers if their level
# is not set
@@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
def logger_exit(self) -> None:
- """
- Remove stream handler and logfile handler.
- """
+ """Remove the stream handler and the logfile handler."""
for handler in (self.sh, self.fh, self.verbose_fh):
handler.flush()
self._logger.removeHandler(handler)
+class _LoggerDictType(TypedDict):
+ logger: DTSLOG
+ name: str
+ node: str
+
+
+# List for saving all loggers in use
+_Loggers: list[_LoggerDictType] = []
+
+
def getLogger(name: str, node: str = "suite") -> DTSLOG:
+ """Get DTS logger adapter identified by name and node.
+
+ An existing logger will be returned if one with the exact name and node already exists.
+ A new one will be created and stored otherwise.
+
+ Args:
+ name: The name of the logger.
+ node: An additional identifier for the logger.
+
+ Returns:
+ A logger uniquely identified by both name and node.
"""
- Get logger handler and if there's no handler for specified Node will create one.
- """
- global Loggers
+ global _Loggers
# return saved logger
- logger: LoggerDictType
- for logger in Loggers:
+ logger: _LoggerDictType
+ for logger in _Loggers:
if logger["name"] == name and logger["node"] == node:
return logger["logger"]
# return new logger
dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
- Loggers.append({"logger": dts_logger, "name": name, "node": node})
+ _Loggers.append({"logger": dts_logger, "name": name, "node": node})
return dts_logger
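As an aside for readers of the patch, the name-and-node keyed caching that `getLogger` implements can be shown with a standalone sketch (the names below are illustrative stand-ins, not the DTS API):

```python
import logging

# Hypothetical stand-in for the DTS logger cache: adapters are keyed by
# (name, node); a repeated request returns the stored adapter instead of
# creating a new one.
_loggers: dict[tuple[str, str], logging.LoggerAdapter] = {}


def get_logger(name: str, node: str = "suite") -> logging.LoggerAdapter:
    key = (name, node)
    if key not in _loggers:
        _loggers[key] = logging.LoggerAdapter(
            logging.getLogger(name), dict(node=node)
        )
    return _loggers[key]
```

The same name with a different node yields a distinct logger, which is what makes the pair the unique identifier.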
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index f0c916471c..0613adf7ad 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -3,6 +3,16 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Various utility classes and functions.
+
+These are used in multiple modules across the framework. They're here because
+they provide some non-specific functionality, greatly simplify imports or just don't
+fit elsewhere.
+
+Attributes:
+ REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
+"""
+
import atexit
import json
import os
@@ -19,12 +29,20 @@
def expand_range(range_str: str) -> list[int]:
- """
- Process range string into a list of integers. There are two possible formats:
- n - a single integer
- n-m - a range of integers
+ """Process `range_str` into a list of integers.
+
+ There are two possible formats of `range_str`:
+
+ * ``n`` - a single integer,
+ * ``n-m`` - a range of integers.
- The returned range includes both n and m. Empty string returns an empty list.
+ The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
+
+ Args:
+ range_str: The range to expand.
+
+ Returns:
+ All the numbers from the range.
"""
expanded_range: list[int] = []
if range_str:
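For illustration, a self-contained version consistent with the documented `n` / `n-m` formats (a sketch mirroring the docstring, since the rest of the function body is elided in the hunk):

```python
def expand_range(range_str: str) -> list[int]:
    """Expand "n" or "n-m" (both ends inclusive) into a list; "" -> []."""
    expanded_range: list[int] = []
    if range_str:
        range_boundaries = range_str.split("-")
        # One element (["n"]) or two (["n", "m"]); [-1] covers both cases.
        expanded_range.extend(
            range(int(range_boundaries[0]), int(range_boundaries[-1]) + 1)
        )
    return expanded_range
```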
@@ -39,6 +57,14 @@ def expand_range(range_str: str) -> list[int]:
def get_packet_summaries(packets: list[Packet]) -> str:
+ """Format a string summary from `packets`.
+
+ Args:
+ packets: The packets to format.
+
+ Returns:
+ The summary of `packets`.
+ """
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -49,6 +75,8 @@ def get_packet_summaries(packets: list[Packet]) -> str:
class StrEnum(Enum):
+ """Enum with members stored as strings."""
+
@staticmethod
def _generate_next_value_(
name: str, start: int, count: int, last_values: object
@@ -56,22 +84,29 @@ def _generate_next_value_(
return name
def __str__(self) -> str:
+ """The string representation is the name of the member."""
return self.name
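The string-valued enum pattern above can be demonstrated standalone; `Compiler` below is an illustrative subclass for the example, not DTS code:

```python
from enum import Enum, auto


class StrEnum(Enum):
    # With this hook, auto() produces the member's own name as its value.
    @staticmethod
    def _generate_next_value_(
        name: str, start: int, count: int, last_values: object
    ) -> str:
        return name

    def __str__(self) -> str:
        return self.name


class Compiler(StrEnum):  # hypothetical subclass for demonstration
    gcc = auto()
    clang = auto()
```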
class MesonArgs(object):
- """
- Aggregate the arguments needed to build DPDK:
- default_library: Default library type, Meson allows "shared", "static" and "both".
- Defaults to None, in which case the argument won't be used.
- Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
- Do not use -D with them, for example:
- meson_args = MesonArgs(enable_kmods=True).
- """
+ """Aggregate the arguments needed to build DPDK."""
_default_library: str
def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
+ """Initialize the meson arguments.
+
+ Args:
+ default_library: The default library type, Meson supports ``shared``, ``static`` and
+ ``both``. Defaults to :data:`None`, in which case the argument won't be used.
+ dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Example:
+ ::
+
+ meson_args = MesonArgs(enable_kmods=True)
+ """
self._default_library = (
f"--default-library={default_library}" if default_library else ""
)
@@ -83,6 +118,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
)
def __str__(self) -> str:
+ """The actual args."""
return " ".join(f"{self._default_library} {self._dpdk_args}".split())
@@ -93,35 +129,33 @@ class _TarCompressionFormat(StrEnum):
and Enum values are the associated file extensions.
"""
+ #:
gzip = "gz"
+ #:
compress = "Z"
+ #:
bzip2 = "bz2"
+ #:
lzip = "lz"
+ #:
lzma = "lzma"
+ #:
lzop = "lzo"
+ #:
xz = "xz"
+ #:
zstd = "zst"
class DPDKGitTarball(object):
- """Create a compressed tarball of DPDK from the repository.
-
- The DPDK version is specified with git object git_ref.
- The tarball will be compressed with _TarCompressionFormat,
- which must be supported by the DTS execution environment.
- The resulting tarball will be put into output_dir.
+ """Compressed tarball of DPDK from the repository.
- The class supports the os.PathLike protocol,
+ The class supports the :class:`os.PathLike` protocol,
which is used to get the Path of the tarball::
from pathlib import Path
tarball = DPDKGitTarball("HEAD", "output")
tarball_path = Path(tarball)
-
- Arguments:
- git_ref: A git commit ID, tag ID or tree ID.
- output_dir: The directory where to put the resulting tarball.
- tar_compression_format: The compression format to use.
"""
_git_ref: str
@@ -136,6 +170,17 @@ def __init__(
output_dir: str,
tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
):
+ """Create the tarball during initialization.
+
+ The DPDK version is specified with `git_ref`. The tarball will be compressed with
+ `tar_compression_format`, which must be supported by the DTS execution environment.
+ The resulting tarball will be put into `output_dir`.
+
+ Args:
+ git_ref: A git commit ID, tag ID or tree ID.
+ output_dir: The directory where to put the resulting tarball.
+ tar_compression_format: The compression format to use.
+ """
self._git_ref = git_ref
self._tar_compression_format = tar_compression_format
@@ -204,4 +249,5 @@ def _delete_tarball(self) -> None:
os.remove(self._tarball_path)
def __fspath__(self) -> str:
+ """The os.PathLike protocol implementation."""
return str(self._tarball_path)
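To illustrate the `os.PathLike` protocol the docstring refers to, here is a minimal implementer (not the DTS class itself); anything that calls `os.fspath()` internally, such as `open()` or `pathlib.Path`, accepts the object directly:

```python
import os
from pathlib import Path


class Tarball:
    """Hypothetical os.PathLike implementer for demonstration."""

    def __init__(self, path: str):
        self._tarball_path = path

    def __fspath__(self) -> str:
        # os.fspath(), open(), Path(), etc. call this to obtain the path.
        return str(self._tarball_path)


tarball = Tarball("dpdk.tar.xz")
```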
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v6 06/23] dts: logger and settings docstring update
2023-11-08 12:53 ` [PATCH v6 06/23] dts: logger and " Juraj Linkeš
@ 2023-11-08 17:14 ` Yoan Picchi
2023-11-15 10:11 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-08 17:14 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek
Cc: dev
On 11/8/23 12:53, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/logger.py | 72 +++++++++++++++++++++----------
> dts/framework/utils.py | 96 ++++++++++++++++++++++++++++++-----------
> 2 files changed, 121 insertions(+), 47 deletions(-)
>
<snip>
> @@ -93,35 +129,33 @@ class _TarCompressionFormat(StrEnum):
> and Enum values are the associated file extensions.
> """
>
> + #:
> gzip = "gz"
> + #:
> compress = "Z"
> + #:
> bzip2 = "bz2"
> + #:
> lzip = "lz"
> + #:
> lzma = "lzma"
> + #:
> lzop = "lzo"
> + #:
> xz = "xz"
> + #:
> zstd = "zst"
Just to be sure, _TarCompressionFormat doesn't appear in the doc
(framework.utils.html). I believe that's intended (because of the _) but
then I don't think the #: are used for anything.
<snip>
* Re: [PATCH v6 06/23] dts: logger and settings docstring update
2023-11-08 17:14 ` Yoan Picchi
@ 2023-11-15 10:11 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 10:11 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, dev
On Wed, Nov 8, 2023 at 6:14 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/8/23 12:53, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/logger.py | 72 +++++++++++++++++++++----------
> > dts/framework/utils.py | 96 ++++++++++++++++++++++++++++++-----------
> > 2 files changed, 121 insertions(+), 47 deletions(-)
> >
<snip>
> > diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> > index f0c916471c..0613adf7ad 100644
> > --- a/dts/framework/utils.py
> > +++ b/dts/framework/utils.py
<snip>
> > @@ -93,35 +129,33 @@ class _TarCompressionFormat(StrEnum):
> > and Enum values are the associated file extensions.
> > """
> >
> > + #:
> > gzip = "gz"
> > + #:
> > compress = "Z"
> > + #:
> > bzip2 = "bz2"
> > + #:
> > lzip = "lz"
> > + #:
> > lzma = "lzma"
> > + #:
> > lzop = "lzo"
> > + #:
> > xz = "xz"
> > + #:
> > zstd = "zst"
>
> Just to be sure, _TarCompressionFormat doesn't appear in the doc
> (framework.utils.html). I believe that's intended (because of the _) but
> then I don't think the #: are used for anything.
>
Good point, I'll remove the comments as they're just clutter when not
in doc generation.
* [PATCH v6 07/23] dts: dts runner and main docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (4 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 06/23] dts: logger and " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 08/23] dts: test suite " Juraj Linkeš
` (15 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
dts/main.py | 8 ++-
2 files changed, 112 insertions(+), 24 deletions(-)
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 4c7fb0c40a..331fed7dc4 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,33 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+r"""Test suite runner module.
+
+A DTS run is split into stages:
+
+ #. Execution stage,
+ #. Build target stage,
+ #. Test suite stage,
+ #. Test case stage.
+
+The module is responsible for running tests on testbeds defined in the test run configuration.
+Each setup or teardown of each stage is recorded in a :class:`~framework.test_result.DTSResult` or
+one of its subclasses. The test case results are also recorded.
+
+If an error occurs, the current stage is aborted, the error is recorded and the run continues in
+the next iteration of the same stage. The return code is the highest `severity` of all
+:class:`~framework.exception.DTSError`\s.
+
+Example:
+ An error occurs in a build target setup. The current build target is aborted and the run
+ continues with the next build target. If the errored build target was the last one in the given
+ execution, the next execution begins.
+
+Attributes:
+ dts_logger: The logger instance used in this module.
+ result: The top level result used in the module.
+"""
+
import sys
from .config import (
@@ -23,9 +50,38 @@
def run_all() -> None:
- """
- The main process of DTS. Runs all build targets in all executions from the main
- config file.
+ """Run all build targets in all executions from the test run configuration.
+
+ Before running test suites, executions and build targets are first set up.
+ The executions and build targets defined in the test run configuration are iterated over.
+ The executions define which tests to run and where to run them and build targets define
+ the DPDK build setup.
+
+ The test suites are set up for each execution/build target tuple and each scheduled
+ test case within the test suite is set up, executed and torn down. After all test cases
+ have been executed, the test suite is torn down and the next build target will be tested.
+
+ All the nested steps look like this:
+
+ #. Execution setup
+
+ #. Build target setup
+
+ #. Test suite setup
+
+ #. Test case setup
+ #. Test case logic
+ #. Test case teardown
+
+ #. Test suite teardown
+
+ #. Build target teardown
+
+ #. Execution teardown
+
+ The test cases are filtered according to the specification in the test run configuration and
+ the :option:`--test-cases` command line argument or
+ the :envvar:`DTS_TESTCASES` environment variable.
"""
global dts_logger
global result
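The abort-record-continue flow and severity-based return code described in the module docstring can be sketched standalone; the `Severity` levels and `StageError` below are hypothetical stand-ins for the DTSError hierarchy, not the framework's actual classes:

```python
from enum import IntEnum
from typing import Callable


class Severity(IntEnum):
    # Hypothetical severities; the real DTSError subclasses define their own.
    NO_ERR = 0
    GENERIC_ERR = 1
    BLOCKING_ERR = 2


class StageError(Exception):
    def __init__(self, severity: Severity):
        self.severity = severity


def run_stages(stages: list[Callable[[], None]]) -> int:
    """Run every stage; an erroring stage is aborted and recorded,
    the run continues, and the return code is the highest severity seen."""
    errors: list[StageError] = []
    for stage in stages:
        try:
            stage()
        except StageError as e:
            errors.append(e)  # record and move on to the next stage
    return int(max((e.severity for e in errors), default=Severity.NO_ERR))
```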
@@ -87,6 +143,8 @@ def run_all() -> None:
def _check_dts_python_version() -> None:
+ """Check the required Python version - v3.10."""
+
def RED(text: str) -> str:
return f"\u001B[31;1m{str(text)}\u001B[0m"
@@ -111,9 +169,16 @@ def _run_execution(
execution: ExecutionConfiguration,
result: DTSResult,
) -> None:
- """
- Run the given execution. This involves running the execution setup as well as
- running all build targets in the given execution.
+ """Run the given execution.
+
+ This involves running the execution setup as well as running all build targets
+ in the given execution. After that, execution teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: An execution's test run configuration.
+ result: The top level result object.
"""
dts_logger.info(
f"Running execution with SUT '{execution.system_under_test_node.name}'."
@@ -150,8 +215,18 @@ def _run_build_target(
execution: ExecutionConfiguration,
execution_result: ExecutionResult,
) -> None:
- """
- Run the given build target.
+ """Run the given build target.
+
+ This involves running the build target setup as well as running all test suites
+ in the given execution the build target is defined in.
+ After that, build target teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ build_target: A build target's test run configuration.
+ execution: The build target's execution's test run configuration.
+ execution_result: The execution level result object associated with the execution.
"""
dts_logger.info(f"Running build target '{build_target.name}'.")
build_target_result = execution_result.add_build_target(build_target)
@@ -183,10 +258,17 @@ def _run_all_suites(
execution: ExecutionConfiguration,
build_target_result: BuildTargetResult,
) -> None:
- """
- Use the given build_target to run execution's test suites
- with possibly only a subset of test cases.
- If no subset is specified, run all test cases.
+ """Run the execution's (possibly a subset) test suites using the current build_target.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
"""
end_build_target = False
if not execution.skip_smoke_tests:
@@ -215,16 +297,22 @@ def _run_single_suite(
build_target_result: BuildTargetResult,
test_suite_config: TestSuiteConfig,
) -> None:
- """Runs a single test suite.
+ """Run all test suite in a single test suite module.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
Args:
- sut_node: Node to run tests on.
- execution: Execution the test case belongs to.
- build_target_result: Build target configuration test case is run on
- test_suite_config: Test suite configuration
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
+ test_suite_config: Test suite test run configuration specifying the test suite module
+ and possibly a subset of test cases of test suites in that module.
Raises:
- BlockingTestSuiteError: If a test suite that was marked as blocking fails.
+ BlockingTestSuiteError: If a blocking test suite fails.
"""
try:
full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
@@ -248,9 +336,7 @@ def _run_single_suite(
def _exit_dts() -> None:
- """
- Process all errors and exit with the proper exit code.
- """
+ """Process all errors and exit with the proper exit code."""
result.process()
if dts_logger:
diff --git a/dts/main.py b/dts/main.py
index 5d4714b0c3..f703615d11 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -4,9 +4,7 @@
# Copyright(c) 2022 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
-"""
-A test framework for testing DPDK.
-"""
+"""The DTS executable."""
import logging
@@ -17,6 +15,10 @@ def main() -> None:
"""Set DTS settings, then run DTS.
The DTS settings are taken from the command line arguments and the environment variables.
+ The settings object is stored in the module-level variable settings.SETTINGS which the entire
+ framework uses. After importing the module (or the variable), any changes to the variable are
+ not going to be reflected without a re-import. This means that the SETTINGS variable must
+ be modified before the settings module is imported anywhere else in the framework.
"""
settings.SETTINGS = settings.get_settings()
from framework import dts
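The import-order caveat in the docstring can be demonstrated with a stand-in `settings` module built on the fly (not the real framework module): a `from … import` binds the value at import time, so later assignments to the module-level variable are only visible through attribute access.

```python
import sys
import types

# Hypothetical stand-in for dts.framework.settings with a module-level variable.
settings = types.ModuleType("settings")
settings.SETTINGS = None
sys.modules["settings"] = settings

# "from settings import SETTINGS" snapshots the *current* value...
from settings import SETTINGS as snapshot

# ...so an assignment made afterwards, as main() does, is not reflected
# in the snapshot, only in settings.SETTINGS itself.
settings.SETTINGS = {"verbose": True}
```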
--
2.34.1
* [PATCH v6 08/23] dts: test suite docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (5 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 07/23] dts: dts runner and main " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 09/23] dts: test result " Juraj Linkeš
` (14 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_suite.py | 223 +++++++++++++++++++++++++++---------
1 file changed, 168 insertions(+), 55 deletions(-)
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index d53553bf34..8daac35818 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -2,8 +2,19 @@
# Copyright(c) 2010-2014 Intel Corporation
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Base class for creating DTS test cases.
+"""Features common to all test suites.
+
+The module defines the :class:`TestSuite` class which doesn't contain any test cases, and as such
+must be extended by subclasses which add test cases. The :class:`TestSuite` contains the basics
+needed by subclasses:
+
+ * Test suite and test case execution flow,
+ * Testbed (SUT, TG) configuration,
+ * Packet sending and verification,
+ * Test case verification.
+
+The module also defines a function, :func:`get_test_suites`,
+for gathering test suites from a Python module.
"""
import importlib
@@ -31,25 +42,44 @@
class TestSuite(object):
- """
- The base TestSuite class provides methods for handling basic flow of a test suite:
- * test case filtering and collection
- * test suite setup/cleanup
- * test setup/cleanup
- * test case execution
- * error handling and results storage
- Test cases are implemented by derived classes. Test cases are all methods
- starting with test_, further divided into performance test cases
- (starting with test_perf_) and functional test cases (all other test cases).
- By default, all test cases will be executed. A list of testcase str names
- may be specified in conf.yaml or on the command line
- to filter which test cases to run.
- The methods named [set_up|tear_down]_[suite|test_case] should be overridden
- in derived classes if the appropriate suite/test case fixtures are needed.
+ """The base class with methods for handling the basic flow of a test suite.
+
+ * Test case filtering and collection,
+ * Test suite setup/cleanup,
+ * Test setup/cleanup,
+ * Test case execution,
+ * Error handling and results storage.
+
+ Test cases are implemented by subclasses. Test cases are all methods starting with ``test_``,
+ further divided into performance test cases (starting with ``test_perf_``)
+ and functional test cases (all other test cases).
+
+ By default, all test cases will be executed. A list of testcase names may be specified
+ in the YAML test run configuration file and in the :option:`--test-cases` command line argument
+ or in the :envvar:`DTS_TESTCASES` environment variable to filter which test cases to run.
+ The union of both lists will be used. Any unknown test cases from the latter lists
+ will be silently ignored.
+
+ If the :option:`--re-run` command line argument or the :envvar:`DTS_RERUN` environment variable
+ is set, in case of a test case failure, the test case will be executed again until it passes
+ or it fails that many times in addition to the first failure.
+
+ The methods named ``[set_up|tear_down]_[suite|test_case]`` should be overridden in subclasses
+ if the appropriate test suite/test case fixtures are needed.
+
+ The test suite is aware of the testbed (the SUT and TG) it's running on. From this, it can
+ properly choose the IP addresses and other configuration that must be tailored to the testbed.
+
+ Attributes:
+ sut_node: The SUT node where the test suite is running.
+ tg_node: The TG node where the test suite is running.
+ is_blocking: Whether the test suite is blocking. A failure of a blocking test suite
+ will block the execution of all subsequent test suites in the current build target.
"""
sut_node: SutNode
- is_blocking = False
+ tg_node: TGNode
+ is_blocking: bool = False
_logger: DTSLOG
_test_cases_to_run: list[str]
_func: bool
@@ -72,6 +102,19 @@ def __init__(
func: bool,
build_target_result: BuildTargetResult,
):
+ """Initialize the test suite testbed information and basic configuration.
+
+ Process what test cases to run, create the associated :class:`TestSuiteResult`,
+ find links between ports and set up default IP addresses to be used when configuring them.
+
+ Args:
+ sut_node: The SUT node where the test suite will run.
+ tg_node: The TG node where the test suite will run.
+ test_cases: The list of test cases to execute.
+ If empty, all test cases will be executed.
+ func: Whether to run functional tests.
+ build_target_result: The build target result this test suite is run in.
+ """
self.sut_node = sut_node
self.tg_node = tg_node
self._logger = getLogger(self.__class__.__name__)
@@ -95,6 +138,7 @@ def __init__(
self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
def _process_links(self) -> None:
+ """Construct links between SUT and TG ports."""
for sut_port in self.sut_node.ports:
for tg_port in self.tg_node.ports:
if (sut_port.identifier, sut_port.peer) == (
@@ -106,27 +150,42 @@ def _process_links(self) -> None:
)
def set_up_suite(self) -> None:
- """
- Set up test fixtures common to all test cases; this is done before
- any test case is run.
+ """Set up test fixtures common to all test cases.
+
+ This is done before any test case has been run.
"""
def tear_down_suite(self) -> None:
- """
- Tear down the previously created test fixtures common to all test cases.
+ """Tear down the previously created test fixtures common to all test cases.
+
+ This is done after all tests have been run.
"""
def set_up_test_case(self) -> None:
- """
- Set up test fixtures before each test case.
+ """Set up test fixtures before each test case.
+
+ This is done before *each* test case.
"""
def tear_down_test_case(self) -> None:
- """
- Tear down the previously created test fixtures after each test case.
+ """Tear down the previously created test fixtures after each test case.
+
+ This is done after *each* test case.
"""
def configure_testbed_ipv4(self, restore: bool = False) -> None:
+ """Configure IPv4 addresses on all testbed ports.
+
+ The configured ports are:
+
+ * SUT ingress port,
+ * SUT egress port,
+ * TG ingress port,
+ * TG egress port.
+
+ Args:
+ restore: If :data:`True`, will remove the configuration instead.
+ """
delete = True if restore else False
enable = False if restore else True
self._configure_ipv4_forwarding(enable)
@@ -153,11 +212,13 @@ def _configure_ipv4_forwarding(self, enable: bool) -> None:
def send_packet_and_capture(
self, packet: Packet, duration: float = 1
) -> list[Packet]:
- """
- Send a packet through the appropriate interface and
- receive on the appropriate interface.
- Modify the packet with l3/l2 addresses corresponding
- to the testbed and desired traffic.
+ """Send and receive `packet` using the associated TG.
+
+ Send `packet` through the appropriate interface and receive on the appropriate interface.
+ Modify the packet with l3/l2 addresses corresponding to the testbed and desired traffic.
+
+ Returns:
+ A list of received packets.
"""
packet = self._adjust_addresses(packet)
return self.tg_node.send_packet_and_capture(
@@ -165,13 +226,25 @@ def send_packet_and_capture(
)
def get_expected_packet(self, packet: Packet) -> Packet:
+ """Inject the proper L2/L3 addresses into `packet`.
+
+ Args:
+ packet: The packet to modify.
+
+ Returns:
+ `packet` with injected L2/L3 addresses.
+ """
return self._adjust_addresses(packet, expected=True)
def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
- """
+ """L2 and L3 address additions in both directions.
+
Assumptions:
- Two links between SUT and TG, one link is TG -> SUT,
- the other SUT -> TG.
+ Two links between SUT and TG, one link is TG -> SUT, the other SUT -> TG.
+
+ Args:
+ packet: The packet to modify.
+ expected: If :data:`True`, the direction is SUT -> TG, otherwise the direction is TG -> SUT.
"""
if expected:
# The packet enters the TG from SUT
@@ -197,6 +270,19 @@ def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
return Ether(packet.build())
def verify(self, condition: bool, failure_description: str) -> None:
+ """Verify `condition` and handle failures.
+
+ When `condition` is :data:`False`, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ condition: The condition to check.
+ failure_description: A short description of the failure
+ that will be stored in the raised exception.
+
+ Raises:
+ TestCaseVerifyError: `condition` is :data:`False`.
+ """
if not condition:
self._fail_test_case_verify(failure_description)
@@ -216,6 +302,19 @@ def _fail_test_case_verify(self, failure_description: str) -> None:
def verify_packets(
self, expected_packet: Packet, received_packets: list[Packet]
) -> None:
+ """Verify that `expected_packet` has been received.
+
+ Go through `received_packets` and check that `expected_packet` is among them.
+ If not, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ expected_packet: The packet we're expecting to receive.
+ received_packets: The packets where we're looking for `expected_packet`.
+
+ Raises:
+ TestCaseVerifyError: `expected_packet` is not among `received_packets`.
+ """
for received_packet in received_packets:
if self._compare_packets(expected_packet, received_packet):
break
@@ -303,10 +402,14 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
return True
def run(self) -> None:
- """
- Setup, execute and teardown the whole suite.
- Suite execution consists of running all test cases scheduled to be executed.
- A test cast run consists of setup, execution and teardown of said test case.
+ """Set up, execute and tear down the whole suite.
+
+ Test suite execution consists of running all test cases scheduled to be executed.
+ A test case run consists of setup, execution and teardown of said test case.
+
+ Record the setup and the teardown and handle failures.
+
+ The list of scheduled test cases is constructed when creating the :class:`TestSuite` object.
"""
test_suite_name = self.__class__.__name__
@@ -338,9 +441,7 @@ def run(self) -> None:
raise BlockingTestSuiteError(test_suite_name)
def _execute_test_suite(self) -> None:
- """
- Execute all test cases scheduled to be executed in this suite.
- """
+ """Execute all test cases scheduled to be executed in this suite."""
if self._func:
for test_case_method in self._get_functional_test_cases():
test_case_name = test_case_method.__name__
@@ -357,14 +458,18 @@ def _execute_test_suite(self) -> None:
self._run_test_case(test_case_method, test_case_result)
def _get_functional_test_cases(self) -> list[MethodType]:
- """
- Get all functional test cases.
+ """Get all functional test cases defined in this TestSuite.
+
+ Returns:
+ The list of functional test cases of this TestSuite.
"""
return self._get_test_cases(r"test_(?!perf_)")
def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
- """
- Return a list of test cases matching test_case_regex.
+ """Return a list of test cases matching test_case_regex.
+
+ Returns:
+ The list of test cases matching test_case_regex of this TestSuite.
"""
self._logger.debug(f"Searching for test cases in {self.__class__.__name__}.")
filtered_test_cases = []
@@ -378,9 +483,7 @@ def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
return filtered_test_cases
def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool:
- """
- Check whether the test case should be executed.
- """
+ """Check whether the test case should be scheduled to be executed."""
match = bool(re.match(test_case_regex, test_case_name))
if self._test_cases_to_run:
return match and test_case_name in self._test_cases_to_run
@@ -390,9 +493,9 @@ def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool
def _run_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Setup, execute and teardown a test case in this suite.
- Exceptions are caught and recorded in logs and results.
+ """Setup, execute and teardown a test case in this suite.
+
+ Record the result of the setup and the teardown and handle failures.
"""
test_case_name = test_case_method.__name__
@@ -427,9 +530,7 @@ def _run_test_case(
def _execute_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Execute one test case and handle failures.
- """
+ """Execute one test case, record the result and handle failures."""
test_case_name = test_case_method.__name__
try:
self._logger.info(f"Starting test case execution: {test_case_name}")
@@ -452,6 +553,18 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
+ r"""Find all :class:`TestSuite`\s in a Python module.
+
+ Args:
+ testsuite_module_path: The path to the Python module.
+
+ Returns:
+ The list of :class:`TestSuite`\s found within the Python module.
+
+ Raises:
+ ConfigurationError: The test suite module was not found.
+ """
+
def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
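[Editor's note: the functional/performance split documented in the patch above comes down to one negative-lookahead regex — `_get_functional_test_cases` passes `r"test_(?!perf_)"` to `_get_test_cases`. A standalone sketch of that filtering, with made-up test case names (the helper and names are illustrative, not upstream code):]

```python
import re

# The patch divides methods starting with "test_" into performance cases
# (those starting with "test_perf_") and functional cases (all other
# "test_" methods). The negative lookahead (?!perf_) matches "test_"
# only when it is NOT immediately followed by "perf_".
FUNC_PATTERN = r"test_(?!perf_)"
PERF_PATTERN = r"test_perf_"


def split_test_cases(names: list[str]) -> tuple[list[str], list[str]]:
    """Return (functional, performance) test case names from `names`."""
    functional = [n for n in names if re.match(FUNC_PATTERN, n)]
    performance = [n for n in names if re.match(PERF_PATTERN, n)]
    return functional, performance


# Hypothetical method names; fixtures like set_up_suite never match.
names = ["test_basic_link", "test_perf_throughput", "set_up_suite"]
func, perf = split_test_cases(names)
print(func)  # ['test_basic_link']
print(perf)  # ['test_perf_throughput']
```

Note that `re.match` anchors at the start of the string, so a method such as `my_test_case` would not be collected — only names with the `test_` prefix are considered test cases at all.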
* [PATCH v6 09/23] dts: test result docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (6 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 08/23] dts: test suite " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 10/23] dts: config " Juraj Linkeš
` (13 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_result.py | 292 ++++++++++++++++++++++++++++-------
1 file changed, 234 insertions(+), 58 deletions(-)
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 603e18872c..f553948454 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -2,8 +2,25 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-Generic result container and reporters
+r"""Record and process DTS results.
+
+The results are recorded in a hierarchical manner:
+
+ * :class:`DTSResult` contains
+ * :class:`ExecutionResult` contains
+ * :class:`BuildTargetResult` contains
+ * :class:`TestSuiteResult` contains
+ * :class:`TestCaseResult`
+
+Each result may contain multiple lower level results, e.g. there are multiple
+:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
+The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
+which also defines some common behaviors in its methods.
+
+Each result class has its own idiosyncrasies which they implement in overridden methods.
+
+The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+variable modify the directory where the files with results will be stored.
"""
import os.path
@@ -26,26 +43,34 @@
class Result(Enum):
- """
- An Enum defining the possible states that
- a setup, a teardown or a test case may end up in.
- """
+ """The possible states that a setup, a teardown or a test case may end up in."""
+ #:
PASS = auto()
+ #:
FAIL = auto()
+ #:
ERROR = auto()
+ #:
SKIP = auto()
def __bool__(self) -> bool:
+ """Only PASS is True."""
return self is self.PASS
class FixtureResult(object):
- """
- A record that stored the result of a setup or a teardown.
- The default is FAIL because immediately after creating the object
- the setup of the corresponding stage will be executed, which also guarantees
- the execution of teardown.
+ """A record that stores the result of a setup or a teardown.
+
+ FAIL is a sensible default since it prevents false positives
+ (which could happen if the default was :data:`True`).
+
+ Preventing false positives or other false results is preferable since a failure
+ is most likely to be investigated (the other false results may not be investigated at all).
+
+ Attributes:
+ result: The associated result.
+ error: The error in case of a failure.
"""
result: Result
@@ -56,21 +81,32 @@ def __init__(
result: Result = Result.FAIL,
error: Exception | None = None,
):
+ """Initialize the constructor with the fixture result and store a possible error.
+
+ Args:
+ result: The result to store.
+ error: The error which happened when a failure occurred.
+ """
self.result = result
self.error = error
def __bool__(self) -> bool:
+ """A wrapper around the stored :class:`Result`."""
return bool(self.result)
class Statistics(dict):
- """
- A helper class used to store the number of test cases by its result
- along a few other basic information.
- Using a dict provides a convenient way to format the data.
+ """How many test cases ended in which result state along some other basic information.
+
+ Subclassing :class:`dict` provides a convenient way to format the data.
"""
def __init__(self, dpdk_version: str | None):
+ """Extend the constructor with relevant keys.
+
+ Args:
+ dpdk_version: The version of tested DPDK.
+ """
super(Statistics, self).__init__()
for result in Result:
self[result.name] = 0
@@ -78,8 +114,17 @@ def __init__(self, dpdk_version: str | None):
self["DPDK VERSION"] = dpdk_version
def __iadd__(self, other: Result) -> "Statistics":
- """
- Add a Result to the final count.
+ """Add a Result to the final count.
+
+ Example:
+ stats: Statistics = Statistics(dpdk_version) # empty Statistics
+ stats += Result.PASS # add a Result to `stats`
+
+ Args:
+ other: The Result to add to this statistics object.
+
+ Returns:
+ The modified statistics object.
"""
self[other.name] += 1
self["PASS RATE"] = (
@@ -90,9 +135,7 @@ def __iadd__(self, other: Result) -> "Statistics":
return self
def __str__(self) -> str:
- """
- Provide a string representation of the data.
- """
+ """Each line contains the formatted key = value pair."""
stats_str = ""
for key, value in self.items():
stats_str += f"{key:<12} = {value}\n"
@@ -102,10 +145,16 @@ def __str__(self) -> str:
class BaseResult(object):
- """
- The Base class for all results. Stores the results of
- the setup and teardown portions of the corresponding stage
- and a list of results from each inner stage in _inner_results.
+ """Common data and behavior of DTS results.
+
+ Stores the results of the setup and teardown portions of the corresponding stage.
+ The hierarchical nature of DTS results is captured recursively in an internal list.
+ A stage is each level in this particular hierarchy (pre-execution or the top-most level,
+ execution, build target, test suite and test case).
+
+ Attributes:
+ setup_result: The result of the setup of the particular stage.
+ teardown_result: The results of the teardown of the particular stage.
"""
setup_result: FixtureResult
@@ -113,15 +162,28 @@ class BaseResult(object):
_inner_results: MutableSequence["BaseResult"]
def __init__(self):
+ """Initialize the constructor."""
self.setup_result = FixtureResult()
self.teardown_result = FixtureResult()
self._inner_results = []
def update_setup(self, result: Result, error: Exception | None = None) -> None:
+ """Store the setup result.
+
+ Args:
+ result: The result of the setup.
+ error: The error that occurred in case of a failure.
+ """
self.setup_result.result = result
self.setup_result.error = error
def update_teardown(self, result: Result, error: Exception | None = None) -> None:
+ """Store the teardown result.
+
+ Args:
+ result: The result of the teardown.
+ error: The error that occurred in case of a failure.
+ """
self.teardown_result.result = result
self.teardown_result.error = error
@@ -141,27 +203,55 @@ def _get_inner_errors(self) -> list[Exception]:
]
def get_errors(self) -> list[Exception]:
+ """Compile errors from the whole result hierarchy.
+
+ Returns:
+ The errors from setup, teardown and all errors found in the whole result hierarchy.
+ """
return self._get_setup_teardown_errors() + self._get_inner_errors()
def add_stats(self, statistics: Statistics) -> None:
+ """Collate stats from the whole result hierarchy.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be collated.
+ """
for inner_result in self._inner_results:
inner_result.add_stats(statistics)
class TestCaseResult(BaseResult, FixtureResult):
- """
- The test case specific result.
- Stores the result of the actual test case.
- Also stores the test case name.
+ r"""The test case specific result.
+
+ Stores the result of the actual test case. This is done by adding an extra superclass
+ in :class:`FixtureResult`. The setup and teardown results are :class:`FixtureResult`\s and
+ the class is itself a record of the test case.
+
+ Attributes:
+ test_case_name: The test case name.
"""
test_case_name: str
def __init__(self, test_case_name: str):
+ """Extend the constructor with `test_case_name`.
+
+ Args:
+ test_case_name: The test case's name.
+ """
super(TestCaseResult, self).__init__()
self.test_case_name = test_case_name
def update(self, result: Result, error: Exception | None = None) -> None:
+ """Update the test case result.
+
+ This updates the result of the test case itself and doesn't affect
+ the results of the setup and teardown steps in any way.
+
+ Args:
+ result: The result of the test case.
+ error: The error that occurred in case of a failure.
+ """
self.result = result
self.error = error
@@ -171,38 +261,66 @@ def _get_inner_errors(self) -> list[Exception]:
return []
def add_stats(self, statistics: Statistics) -> None:
+ r"""Add the test case result to statistics.
+
+ The base method goes through the hierarchy recursively and this method is here to stop
+ the recursion, as the :class:`TestCaseResult`\s are the leaves of the hierarchy tree.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be added.
+ """
statistics += self.result
def __bool__(self) -> bool:
+ """The test case passed only if setup, teardown and the test case itself passed."""
return (
bool(self.setup_result) and bool(self.teardown_result) and bool(self.result)
)
class TestSuiteResult(BaseResult):
- """
- The test suite specific result.
- The _inner_results list stores results of test cases in a given test suite.
- Also stores the test suite name.
+ """The test suite specific result.
+
+ The internal list stores the results of all test cases in a given test suite.
+
+ Attributes:
+ suite_name: The test suite name.
"""
suite_name: str
def __init__(self, suite_name: str):
+ """Extend the constructor with `suite_name`.
+
+ Args:
+ suite_name: The test suite's name.
+ """
super(TestSuiteResult, self).__init__()
self.suite_name = suite_name
def add_test_case(self, test_case_name: str) -> TestCaseResult:
+ """Add and return the inner result (test case).
+
+ Returns:
+ The test case's result.
+ """
test_case_result = TestCaseResult(test_case_name)
self._inner_results.append(test_case_result)
return test_case_result
class BuildTargetResult(BaseResult):
- """
- The build target specific result.
- The _inner_results list stores results of test suites in a given build target.
- Also stores build target specifics, such as compiler used to build DPDK.
+ """The build target specific result.
+
+ The internal list stores the results of all test suites in a given build target.
+
+ Attributes:
+ arch: The DPDK build target architecture.
+ os: The DPDK build target operating system.
+ cpu: The DPDK build target CPU.
+ compiler: The DPDK build target compiler.
+ compiler_version: The DPDK build target compiler version.
+ dpdk_version: The built DPDK version.
"""
arch: Architecture
@@ -213,6 +331,11 @@ class BuildTargetResult(BaseResult):
dpdk_version: str | None
def __init__(self, build_target: BuildTargetConfiguration):
+ """Extend the constructor with the `build_target`'s build target config.
+
+ Args:
+ build_target: The build target's test run configuration.
+ """
super(BuildTargetResult, self).__init__()
self.arch = build_target.arch
self.os = build_target.os
@@ -222,20 +345,35 @@ def __init__(self, build_target: BuildTargetConfiguration):
self.dpdk_version = None
def add_build_target_info(self, versions: BuildTargetInfo) -> None:
+ """Add information about the build target gathered at runtime.
+
+ Args:
+ versions: The additional information.
+ """
self.compiler_version = versions.compiler_version
self.dpdk_version = versions.dpdk_version
def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
+ """Add and return the inner result (test suite).
+
+ Returns:
+ The test suite's result.
+ """
test_suite_result = TestSuiteResult(test_suite_name)
self._inner_results.append(test_suite_result)
return test_suite_result
class ExecutionResult(BaseResult):
- """
- The execution specific result.
- The _inner_results list stores results of build targets in a given execution.
- Also stores the SUT node configuration.
+ """The execution specific result.
+
+ The internal list stores the results of all build targets in a given execution.
+
+ Attributes:
+ sut_node: The SUT node used in the execution.
+ sut_os_name: The operating system of the SUT node.
+ sut_os_version: The operating system version of the SUT node.
+ sut_kernel_version: The operating system kernel version of the SUT node.
"""
sut_node: NodeConfiguration
@@ -244,36 +382,55 @@ class ExecutionResult(BaseResult):
sut_kernel_version: str
def __init__(self, sut_node: NodeConfiguration):
+ """Extend the constructor with the `sut_node`'s config.
+
+ Args:
+ sut_node: The SUT node's test run configuration used in the execution.
+ """
super(ExecutionResult, self).__init__()
self.sut_node = sut_node
def add_build_target(
self, build_target: BuildTargetConfiguration
) -> BuildTargetResult:
+ """Add and return the inner result (build target).
+
+ Args:
+ build_target: The build target's test run configuration.
+
+ Returns:
+ The build target's result.
+ """
build_target_result = BuildTargetResult(build_target)
self._inner_results.append(build_target_result)
return build_target_result
def add_sut_info(self, sut_info: NodeInfo) -> None:
+ """Add SUT information gathered at runtime.
+
+ Args:
+ sut_info: The additional SUT node information.
+ """
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
class DTSResult(BaseResult):
- """
- Stores environment information and test results from a DTS run, which are:
- * Execution level information, such as SUT and TG hardware.
- * Build target level information, such as compiler, target OS and cpu.
- * Test suite results.
- * All errors that are caught and recorded during DTS execution.
+ """Stores environment information and test results from a DTS run.
- The information is stored in nested objects.
+ * Execution level information, such as testbed and the test suite list,
+ * Build target level information, such as compiler, target OS and cpu,
+ * Test suite and test case results,
+ * All errors that are caught and recorded during DTS execution.
- The class is capable of computing the return code used to exit DTS with
- from the stored error.
+ The information is stored hierarchically. This is the first level of the hierarchy
+ and as such is where the data from the whole hierarchy is collated or processed.
- It also provides a brief statistical summary of passed/failed test cases.
+ The internal list stores the results of all executions.
+
+ Attributes:
+ dpdk_version: The DPDK version to record.
"""
dpdk_version: str | None
@@ -284,6 +441,11 @@ class DTSResult(BaseResult):
_stats_filename: str
def __init__(self, logger: DTSLOG):
+ """Extend the constructor with top-level specifics.
+
+ Args:
+ logger: The logger instance the whole result will use.
+ """
super(DTSResult, self).__init__()
self.dpdk_version = None
self._logger = logger
@@ -293,21 +455,33 @@ def __init__(self, logger: DTSLOG):
self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
+ """Add and return the inner result (execution).
+
+ Args:
+ sut_node: The SUT node's test run configuration.
+
+ Returns:
+ The execution's result.
+ """
execution_result = ExecutionResult(sut_node)
self._inner_results.append(execution_result)
return execution_result
def add_error(self, error: Exception) -> None:
+ """Record an error that occurred outside any execution.
+
+ Args:
+ error: The exception to record.
+ """
self._errors.append(error)
def process(self) -> None:
- """
- Process the data after a DTS run.
- The data is added to nested objects during runtime and this parent object
- is not updated at that time. This requires us to process the nested data
- after it's all been gathered.
+ """Process the data after a whole DTS run.
+
+ The data is added to inner objects during runtime and this object is not updated
+ at that time. This requires us to process the inner data after it's all been gathered.
- The processing gathers all errors and the result statistics of test cases.
+ The processing gathers all errors and the statistics of test case results.
"""
self._errors += self.get_errors()
if self._errors and self._logger:
@@ -321,8 +495,10 @@ def process(self) -> None:
stats_file.write(str(self._stats_result))
def get_return_code(self) -> int:
- """
- Go through all stored Exceptions and return the highest error code found.
+ """Go through all stored Exceptions and return the final DTS error code.
+
+ Returns:
+ The highest error code found.
"""
for error in self._errors:
error_return_code = ErrorSeverity.GENERIC_ERR
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v6 10/23] dts: config docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (7 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 09/23] dts: test result " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 11/23] dts: remote session " Juraj Linkeš
` (12 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 371 ++++++++++++++++++++++++++-----
dts/framework/config/types.py | 132 +++++++++++
2 files changed, 446 insertions(+), 57 deletions(-)
create mode 100644 dts/framework/config/types.py
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 2044c82611..0aa149a53d 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -3,8 +3,34 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Yaml config parsing methods
+"""Testbed configuration and test suite specification.
+
+This package offers classes that hold real-time information about the testbed, hold test run
+configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
+the YAML test run configuration file
+and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+
+The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
+this package. The allowed keys and types inside this dictionary are defined in
+the :doc:`types <framework.config.types>` module.
+
+The test run configuration has two main sections:
+
+ * The :class:`ExecutionConfiguration` which defines what tests are going to be run
+ and how DPDK will be built. It also references the testbed where these tests and DPDK
+ are going to be run,
+ * The nodes of the testbed are defined in the other section,
+ a :class:`list` of :class:`NodeConfiguration` objects.
+
+The real-time information about the testbed is supposed to be gathered at runtime.
+
+The classes defined in this package make heavy use of :mod:`dataclasses`.
+All of them use slots and are frozen:
+
+ * Slots enable some optimizations by pre-allocating space for the defined
+ attributes in the underlying data structure,
+ * Frozen makes the object immutable. This enables further optimizations,
+ and makes it thread safe should we ever want to move in that direction.
"""
import json
@@ -12,11 +38,20 @@
import pathlib
from dataclasses import dataclass
from enum import auto, unique
-from typing import Any, TypedDict, Union
+from typing import Union
import warlock # type: ignore[import]
import yaml
+from framework.config.types import (
+ BuildTargetConfigDict,
+ ConfigurationDict,
+ ExecutionConfigDict,
+ NodeConfigDict,
+ PortConfigDict,
+ TestSuiteConfigDict,
+ TrafficGeneratorConfigDict,
+)
from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -24,55 +59,97 @@
@unique
class Architecture(StrEnum):
+ r"""The supported architectures of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
i686 = auto()
+ #:
x86_64 = auto()
+ #:
x86_32 = auto()
+ #:
arm64 = auto()
+ #:
ppc64le = auto()
@unique
class OS(StrEnum):
+ r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
linux = auto()
+ #:
freebsd = auto()
+ #:
windows = auto()
@unique
class CPUType(StrEnum):
+ r"""The supported CPUs of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
native = auto()
+ #:
armv8a = auto()
+ #:
dpaa2 = auto()
+ #:
thunderx = auto()
+ #:
xgene1 = auto()
@unique
class Compiler(StrEnum):
+ r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
gcc = auto()
+ #:
clang = auto()
+ #:
icc = auto()
+ #:
msvc = auto()
@unique
class TrafficGeneratorType(StrEnum):
+ """The supported traffic generators."""
+
+ #:
SCAPY = auto()
-# Slots enables some optimizations, by pre-allocating space for the defined
-# attributes in the underlying data structure.
-#
-# Frozen makes the object immutable. This enables further optimizations,
-# and makes it thread safe should we every want to move in that direction.
@dataclass(slots=True, frozen=True)
class HugepageConfiguration:
+ r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ amount: The number of hugepages.
+ force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
+ """
+
amount: int
force_first_numa: bool
@dataclass(slots=True, frozen=True)
class PortConfig:
+ r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ node: The :class:`~framework.testbed_model.node.Node` where this port exists.
+ pci: The PCI address of the port.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK.
+ os_driver: The operating system driver name when the operating system controls the port.
+ peer_node: The :class:`~framework.testbed_model.node.Node` of the port
+ connected to this port.
+ peer_pci: The PCI address of the port connected to this port.
+ """
+
node: str
pci: str
os_driver_for_dpdk: str
@@ -81,18 +158,44 @@ class PortConfig:
peer_pci: str
@staticmethod
- def from_dict(node: str, d: dict) -> "PortConfig":
+ def from_dict(node: str, d: PortConfigDict) -> "PortConfig":
+ """A convenience method that creates the object from fewer inputs.
+
+ Args:
+ node: The node where this port exists.
+ d: The configuration dictionary.
+
+ Returns:
+ The port configuration instance.
+ """
return PortConfig(node=node, **d)
@dataclass(slots=True, frozen=True)
class TrafficGeneratorConfig:
+ """The configuration of traffic generators.
+
+ The class will be expanded when more configuration is needed.
+
+ Attributes:
+ traffic_generator_type: The type of the traffic generator.
+ """
+
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
- # This looks useless now, but is designed to allow expansion to traffic
- # generators that require more configuration later.
+ def from_dict(d: TrafficGeneratorConfigDict) -> "ScapyTrafficGeneratorConfig":
+ """A convenience method that produces traffic generator config of the proper type.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The traffic generator configuration instance.
+
+ Raises:
+ ConfigurationError: An unknown traffic generator type was encountered.
+ """
match TrafficGeneratorType(d["type"]):
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGeneratorConfig(
@@ -106,11 +209,31 @@ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
@dataclass(slots=True, frozen=True)
class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+ """Scapy traffic generator specific configuration."""
+
pass
@dataclass(slots=True, frozen=True)
class NodeConfiguration:
+ r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ name: The name of the :class:`~framework.testbed_model.node.Node`.
+ hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
+ Can be an IP or a domain name.
+ user: The name of the user used to connect to
+ the :class:`~framework.testbed_model.node.Node`.
+ password: The password of the user. The use of passwords is heavily discouraged.
+ Please use keys instead.
+ arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
+ os: The operating system of the :class:`~framework.testbed_model.node.Node`.
+ lcores: A comma delimited list of logical cores to use when running DPDK.
+ use_first_core: If :data:`True`, the first logical core won't be used.
+ hugepages: An optional hugepage configuration.
+ ports: The ports that can be used in testing.
+ """
+
name: str
hostname: str
user: str
@@ -123,57 +246,91 @@ class NodeConfiguration:
ports: list[PortConfig]
@staticmethod
- def from_dict(d: dict) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
- hugepage_config = d.get("hugepages")
- if hugepage_config:
- if "force_first_numa" not in hugepage_config:
- hugepage_config["force_first_numa"] = False
- hugepage_config = HugepageConfiguration(**hugepage_config)
-
- common_config = {
- "name": d["name"],
- "hostname": d["hostname"],
- "user": d["user"],
- "password": d.get("password"),
- "arch": Architecture(d["arch"]),
- "os": OS(d["os"]),
- "lcores": d.get("lcores", "1"),
- "use_first_core": d.get("use_first_core", False),
- "hugepages": hugepage_config,
- "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
- }
-
+ def from_dict(
+ d: NodeConfigDict,
+ ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
+ """A convenience method that processes the inputs before creating a specialized instance.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ Either an SUT or TG configuration instance.
+ """
+ hugepage_config = None
+ if "hugepages" in d:
+ hugepage_config_dict = d["hugepages"]
+ if "force_first_numa" not in hugepage_config_dict:
+ hugepage_config_dict["force_first_numa"] = False
+ hugepage_config = HugepageConfiguration(**hugepage_config_dict)
+
+ # The calls below contain duplicated code, which is needed because Mypy
+ # doesn't properly support dictionary unpacking with TypedDicts.
if "traffic_generator" in d:
return TGNodeConfiguration(
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
traffic_generator=TrafficGeneratorConfig.from_dict(
d["traffic_generator"]
),
- **common_config,
)
else:
return SutNodeConfiguration(
- memory_channels=d.get("memory_channels", 1), **common_config
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
+ memory_channels=d.get("memory_channels", 1),
)
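As the code above shows, whether a node dictionary becomes a TG or an SUT configuration hinges solely on the presence of the ``traffic_generator`` key. A tiny sketch of that decision (helper name is hypothetical):

```python
def node_config_kind(d: dict) -> str:
    """Mirror the dispatch in NodeConfiguration.from_dict: a node dict with
    a 'traffic_generator' key is a TG node; any other node is an SUT node."""
    return "TG" if "traffic_generator" in d else "SUT"


assert node_config_kind({"name": "tg0", "traffic_generator": {"type": "SCAPY"}}) == "TG"
assert node_config_kind({"name": "sut0", "memory_channels": 4}) == "SUT"
```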
@dataclass(slots=True, frozen=True)
class SutNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+
+ Attributes:
+ memory_channels: The number of memory channels to use when running DPDK.
+ """
+
memory_channels: int
@dataclass(slots=True, frozen=True)
class TGNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
+
+ Attributes:
+ traffic_generator: The configuration of the traffic generator present on the TG node.
+ """
+
traffic_generator: ScapyTrafficGeneratorConfig
@dataclass(slots=True, frozen=True)
class NodeInfo:
- """Class to hold important versions within the node.
-
- This class, unlike the NodeConfiguration class, cannot be generated at the start.
- This is because we need to initialize a connection with the node before we can
- collect the information needed in this class. Therefore, it cannot be a part of
- the configuration class above.
+ """Supplemental node information.
+
+ Attributes:
+ os_name: The name of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ os_version: The version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ kernel_version: The kernel version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
"""
os_name: str
@@ -183,6 +340,20 @@ class NodeInfo:
@dataclass(slots=True, frozen=True)
class BuildTargetConfiguration:
+ """DPDK build configuration.
+
+ The configuration used for building DPDK.
+
+ Attributes:
+ arch: The target architecture to build for.
+ os: The target os to build for.
+ cpu: The target CPU to build for.
+ compiler: The compiler executable to use.
+ compiler_wrapper: This string will be put in front of the compiler when
+ executing the build. Useful for adding wrapper commands, such as ``ccache``.
+ name: The name of the compiler.
+ """
+
arch: Architecture
os: OS
cpu: CPUType
@@ -191,7 +362,18 @@ class BuildTargetConfiguration:
name: str
@staticmethod
- def from_dict(d: dict) -> "BuildTargetConfiguration":
+ def from_dict(d: BuildTargetConfigDict) -> "BuildTargetConfiguration":
+ r"""A convenience method that processes the inputs before creating an instance.
+
+ `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
+ `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The build target configuration instance.
+ """
return BuildTargetConfiguration(
arch=Architecture(d["arch"]),
os=OS(d["os"]),
@@ -204,23 +386,29 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
@dataclass(slots=True, frozen=True)
class BuildTargetInfo:
- """Class to hold important versions within the build target.
+ """Various versions and other information about a build target.
- This is very similar to the NodeInfo class, it just instead holds information
- for the build target.
+ Attributes:
+ dpdk_version: The DPDK version that was built.
+ compiler_version: The version of the compiler used to build DPDK.
"""
dpdk_version: str
compiler_version: str
-class TestSuiteConfigDict(TypedDict):
- suite: str
- cases: list[str]
-
-
@dataclass(slots=True, frozen=True)
class TestSuiteConfig:
+ """Test suite configuration.
+
+ Information about a single test suite to be executed.
+
+ Attributes:
+ test_suite: The name of the test suite module without the starting ``TestSuite_``.
+ test_cases: The names of test cases from this test suite to execute.
+ If empty, all test cases will be executed.
+ """
+
test_suite: str
test_cases: list[str]
@@ -228,6 +416,14 @@ class TestSuiteConfig:
def from_dict(
entry: str | TestSuiteConfigDict,
) -> "TestSuiteConfig":
+ """Create an instance from either of two input types.
+
+ Args:
+ entry: Either a suite name or a dictionary containing the config.
+
+ Returns:
+ The test suite configuration instance.
+ """
if isinstance(entry, str):
return TestSuiteConfig(test_suite=entry, test_cases=[])
elif isinstance(entry, dict):
@@ -238,19 +434,49 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class ExecutionConfiguration:
+ """The configuration of an execution.
+
+ The configuration contains testbed information, what tests to execute
+ and with what DPDK build.
+
+ Attributes:
+ build_targets: A list of DPDK builds to test.
+ perf: Whether to run performance tests.
+ func: Whether to run functional tests.
+ skip_smoke_tests: Whether to skip smoke tests.
+ test_suites: The names of test suites and/or test cases to execute.
+ system_under_test_node: The SUT node to use in this execution.
+ traffic_generator_node: The TG node to use in this execution.
+ vdevs: The names of virtual devices to test.
+ """
+
build_targets: list[BuildTargetConfiguration]
perf: bool
func: bool
+ skip_smoke_tests: bool
test_suites: list[TestSuiteConfig]
system_under_test_node: SutNodeConfiguration
traffic_generator_node: TGNodeConfiguration
vdevs: list[str]
- skip_smoke_tests: bool
@staticmethod
def from_dict(
- d: dict, node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]]
+ d: ExecutionConfigDict,
+ node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]],
) -> "ExecutionConfiguration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The build target and test suite configs are transformed into their respective objects.
+ The SUT and TG configurations are taken from `node_map`. The other (:class:`bool`)
+ attributes are stored as they are.
+
+ Args:
+ d: The configuration dictionary.
+ node_map: A dictionary mapping node names to their config objects.
+
+ Returns:
+ The execution configuration instance.
+ """
build_targets: list[BuildTargetConfiguration] = list(
map(BuildTargetConfiguration.from_dict, d["build_targets"])
)
@@ -291,10 +517,31 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class Configuration:
+ """DTS testbed and test configuration.
+
+ The node configuration is not stored in this object. Rather, all used node configurations
+ are stored inside the execution configuration where the nodes are actually used.
+
+ Attributes:
+ executions: Execution configurations.
+ """
+
executions: list[ExecutionConfiguration]
@staticmethod
- def from_dict(d: dict) -> "Configuration":
+ def from_dict(d: ConfigurationDict) -> "Configuration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The build target and test suite configs are transformed into their respective objects.
+ The SUT and TG configurations are built from the node configurations in `d`.
+ The other (:class:`bool`) attributes are stored as they are.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The whole configuration instance.
+ """
nodes: list[Union[SutNodeConfiguration | TGNodeConfiguration]] = list(
map(NodeConfiguration.from_dict, d["nodes"])
)
@@ -313,9 +560,17 @@ def from_dict(d: dict) -> "Configuration":
def load_config() -> Configuration:
- """
- Loads the configuration file and the configuration file schema,
- validates the configuration file, and creates a configuration object.
+ """Load DTS test run configuration from a file.
+
+ Load the YAML test run configuration file
+ and :download:`the configuration file schema <conf_yaml_schema.json>`,
+ validate the test run configuration file, and create a test run configuration object.
+
+ The YAML test run configuration file is specified in the :option:`--config-file` command line
+ argument or the :envvar:`DTS_CFG_FILE` environment variable.
+
+ Returns:
+ The parsed test run configuration.
"""
with open(SETTINGS.config_file_path, "r") as f:
config_data = yaml.safe_load(f)
@@ -326,6 +581,8 @@ def load_config() -> Configuration:
with open(schema_path, "r") as f:
schema = json.load(f)
- config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
- config_obj: Configuration = Configuration.from_dict(dict(config))
+ config = warlock.model_factory(schema, name="_Config")(config_data)
+ config_obj: Configuration = Configuration.from_dict(
+ dict(config) # type: ignore[arg-type]
+ )
return config_obj
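The load-and-validate flow of ``load_config`` can be sketched without the YAML and warlock dependencies. In this sketch, JSON and a hand-rolled key check stand in for ``yaml.safe_load`` and schema validation; those substitutions are assumptions made only to keep the example self-contained:

```python
import json
import tempfile


def load_config_sketch(path: str) -> dict:
    """Parse the config file and run a minimal stand-in for schema validation."""
    with open(path, "r") as f:
        config_data = json.load(f)
    # The real code validates against conf_yaml_schema.json via warlock;
    # here we only check the top-level keys Configuration.from_dict needs.
    for key in ("nodes", "executions"):
        if key not in config_data:
            raise ValueError(f"config is missing required key: {key}")
    return config_data


with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump({"nodes": [], "executions": []}, f)
    path = f.name

config = load_config_sketch(path)
assert set(config) == {"nodes", "executions"}
```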
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
new file mode 100644
index 0000000000..1927910d88
--- /dev/null
+++ b/dts/framework/config/types.py
@@ -0,0 +1,132 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+"""Configuration dictionary contents specification.
+
+These type definitions serve as documentation of the configuration dictionary contents.
+
+The definitions use the built-in :class:`~typing.TypedDict` construct.
+"""
+
+from typing import TypedDict
+
+
+class PortConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ pci: str
+ #:
+ os_driver_for_dpdk: str
+ #:
+ os_driver: str
+ #:
+ peer_node: str
+ #:
+ peer_pci: str
+
+
+class TrafficGeneratorConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ type: str
+
+
+class HugepageConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ amount: int
+ #:
+ force_first_numa: bool
+
+
+class NodeConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ hugepages: HugepageConfigurationDict
+ #:
+ name: str
+ #:
+ hostname: str
+ #:
+ user: str
+ #:
+ password: str
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ lcores: str
+ #:
+ use_first_core: bool
+ #:
+ ports: list[PortConfigDict]
+ #:
+ memory_channels: int
+ #:
+ traffic_generator: TrafficGeneratorConfigDict
+
+
+class BuildTargetConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ cpu: str
+ #:
+ compiler: str
+ #:
+ compiler_wrapper: str
+
+
+class TestSuiteConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ suite: str
+ #:
+ cases: list[str]
+
+
+class ExecutionSUTConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ node_name: str
+ #:
+ vdevs: list[str]
+
+
+class ExecutionConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ build_targets: list[BuildTargetConfigDict]
+ #:
+ perf: bool
+ #:
+ func: bool
+ #:
+ skip_smoke_tests: bool
+ #:
+ test_suites: list[TestSuiteConfigDict]
+ #:
+ system_under_test_node: ExecutionSUTConfigDict
+ #:
+ traffic_generator_node: str
+
+
+class ConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ nodes: list[NodeConfigDict]
+ #:
+ executions: list[ExecutionConfigDict]
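As the module docstring notes, these ``TypedDict`` definitions are documentation plus static-checker input; at runtime they are ordinary dicts with no enforcement. A short sketch using ``PortConfigDict`` (the PCI addresses and driver names are made-up placeholders):

```python
from typing import TypedDict


class PortConfigDict(TypedDict):
    pci: str
    os_driver_for_dpdk: str
    os_driver: str
    peer_node: str
    peer_pci: str


port: PortConfigDict = {
    "pci": "0000:00:08.0",
    "os_driver_for_dpdk": "vfio-pci",
    "os_driver": "i40e",
    "peer_node": "TG 1",
    "peer_pci": "0000:00:08.1",
}

# No runtime enforcement: a TypedDict instance is a plain dict, so it is
# Mypy (not Python) that flags a missing or misspelled key.
assert isinstance(port, dict)
assert port["os_driver_for_dpdk"] == "vfio-pci"
```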
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v6 11/23] dts: remote session docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (8 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 10/23] dts: config " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 12/23] dts: interactive " Juraj Linkeš
` (11 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/__init__.py | 39 +++++-
.../remote_session/remote_session.py | 128 +++++++++++++-----
dts/framework/remote_session/ssh_session.py | 16 +--
3 files changed, 135 insertions(+), 48 deletions(-)
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 5e7ddb2b05..51a01d6b5e 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -2,12 +2,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-The package provides modules for managing remote connections to a remote host (node),
-differentiated by OS.
-The package provides a factory function, create_session, that returns the appropriate
-remote connection based on the passed configuration. The differences are in the
-underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
+"""Remote interactive and non-interactive sessions.
+
+This package provides modules for managing remote connections to a remote host (node).
+
+The non-interactive sessions send commands and return their output and exit code.
+
+The interactive sessions open an interactive shell which is continuously open,
+allowing it to send and receive data within that particular shell.
"""
# pylama:ignore=W0611
@@ -26,10 +28,35 @@
def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> RemoteSession:
+ """Factory for non-interactive remote sessions.
+
+ The function returns an SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The SSH remote session.
+ """
return SSHSession(node_config, name, logger)
def create_interactive_session(
node_config: NodeConfiguration, logger: DTSLOG
) -> InteractiveRemoteSession:
+ """Factory for interactive remote sessions.
+
+ The function returns an interactive SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The interactive SSH remote session.
+ """
return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
index 0647d93de4..629c2d7b9c 100644
--- a/dts/framework/remote_session/remote_session.py
+++ b/dts/framework/remote_session/remote_session.py
@@ -3,6 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Base remote session.
+
+This module contains the abstract base class for remote sessions and defines
+the structure of the result of a command execution.
+"""
+
+
import dataclasses
from abc import ABC, abstractmethod
from pathlib import PurePath
@@ -15,8 +22,14 @@
@dataclasses.dataclass(slots=True, frozen=True)
class CommandResult:
- """
- The result of remote execution of a command.
+ """The result of remote execution of a command.
+
+ Attributes:
+ name: The name of the session that executed the command.
+ command: The executed command.
+ stdout: The standard output the command produced.
+ stderr: The standard error output the command produced.
+ return_code: The return code the command exited with.
"""
name: str
@@ -26,6 +39,7 @@ class CommandResult:
return_code: int
def __str__(self) -> str:
+ """Format the command outputs."""
return (
f"stdout: '{self.stdout}'\n"
f"stderr: '{self.stderr}'\n"
@@ -34,13 +48,24 @@ def __str__(self) -> str:
class RemoteSession(ABC):
- """
- The base class for defining which methods must be implemented in order to connect
- to a remote host (node) and maintain a remote session. The derived classes are
- supposed to implement/use some underlying transport protocol (e.g. SSH) to
- implement the methods. On top of that, it provides some basic services common to
- all derived classes, such as keeping history and logging what's being executed
- on the remote node.
+ """Non-interactive remote session.
+
+ The abstract methods must be implemented in order to connect to a remote host (node)
+ and maintain a remote session.
+ The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
+ to implement the methods. On top of that, it provides some basic services common to all
+ subclasses, such as keeping history and logging what's being executed on the remote node.
+
+ Attributes:
+ name: The name of the session.
+ hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
+ or a domain name.
+ ip: The IP address of the node or a domain name, whichever was used in `hostname`.
+ port: The port of the node, if given in `hostname`.
+ username: The username used in the connection.
+ password: The password used in the connection. Most frequently empty,
+ as the use of passwords is discouraged.
+ history: The executed commands during this session.
"""
name: str
@@ -59,6 +84,16 @@ def __init__(
session_name: str,
logger: DTSLOG,
):
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ session_name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Raises:
+ SSHConnectionError: If the connection to the node was not successful.
+ """
self._node_config = node_config
self.name = session_name
@@ -79,8 +114,13 @@ def __init__(
@abstractmethod
def _connect(self) -> None:
- """
- Create connection to assigned node.
+ """Create a connection to the node.
+
+ The implementation must assign the established session to self.session.
+
+ The implementation must catch all exceptions and convert them to an SSHConnectionError.
+
+ The implementation may optionally implement retry attempts.
"""
def send_command(
@@ -90,11 +130,24 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- Send a command to the connected node using optional env vars
- and return CommandResult.
- If verify is True, check the return code of the executed command
- and raise a RemoteCommandExecutionError if the command failed.
+ """Send `command` to the connected node.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds to execute `command`.
+ verify: If :data:`True`, will check the exit code of `command`.
+ env: A dictionary with environment variables to be used with `command` execution.
+
+ Raises:
+ SSHSessionDeadError: If the session isn't alive when sending `command`.
+ SSHTimeoutError: If `command` execution timed out.
+ RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
+
+ Returns:
+ The output of the command along with the return code.
"""
self._logger.info(
f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else "")
@@ -115,29 +168,36 @@ def send_command(
def _send_command(
self, command: str, timeout: float, env: dict | None
) -> CommandResult:
- """
- Use the underlying protocol to execute the command using optional env vars
- and return CommandResult.
+ """Send a command to the connected node.
+
+ The implementation must execute the command remotely with `env` environment variables
+ and return the result.
+
+ The implementation must catch all exceptions and raise an SSHSessionDeadError if
+ the session is not alive and an SSHTimeoutError if the command execution times out.
"""
def close(self, force: bool = False) -> None:
- """
- Close the remote session and free all used resources.
+ """Close the remote session and free all used resources.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
"""
self._logger.logger_exit()
self._close(force)
@abstractmethod
def _close(self, force: bool = False) -> None:
- """
- Execute protocol specific steps needed to close the session properly.
+ """Protocol specific steps needed to close the session properly.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
+ This doesn't have to be implemented in the overriding method.
"""
@abstractmethod
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the remote session is still responding."""
@abstractmethod
def copy_from(
@@ -147,12 +207,12 @@ def copy_from(
) -> None:
"""Copy a file from the remote Node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote Node associated with this remote session
+ to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
- destination_file: a file or directory path on the local filesystem.
+ source_file: The file on the remote Node.
+ destination_file: A file or directory path on the local filesystem.
"""
@abstractmethod
@@ -163,10 +223,10 @@ def copy_to(
) -> None:
"""Copy a file from local filesystem to the remote Node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file` on the remote Node
+ associated with this remote session.
Args:
- source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ source_file: The file on the local filesystem.
+ destination_file: A file or directory path on the remote Node.
"""
diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
index cee11d14d6..7186490a9a 100644
--- a/dts/framework/remote_session/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -1,6 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""SSH remote session."""
+
import socket
import traceback
from pathlib import PurePath
@@ -26,13 +28,8 @@
class SSHSession(RemoteSession):
"""A persistent SSH connection to a remote Node.
- The connection is implemented with the Fabric Python library.
-
- Args:
- node_config: The configuration of the Node to connect to.
- session_name: The name of the session.
- logger: The logger used for logging.
- This should be passed from the parent OSSession.
+ The connection is implemented with
+ `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
Attributes:
session: The underlying Fabric SSH connection.
@@ -80,6 +77,7 @@ def _connect(self) -> None:
raise SSHConnectionError(self.hostname, errors)
def is_alive(self) -> bool:
+ """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
return self.session.is_connected
def _send_command(
@@ -89,7 +87,7 @@ def _send_command(
Args:
command: The command to execute.
- timeout: Wait at most this many seconds for the execution to complete.
+ timeout: Wait at most this long in seconds to execute the command.
env: Extra environment variables that will be used in command execution.
Raises:
@@ -118,6 +116,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
self.session.get(str(source_file), str(destination_file))
def copy_to(
@@ -125,6 +124,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
self.session.put(str(source_file), str(destination_file))
def _close(self, force: bool = False) -> None:
--
2.34.1
* [PATCH v6 12/23] dts: interactive remote session docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (9 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 11/23] dts: remote session " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 13/23] dts: port and virtual device " Juraj Linkeš
` (10 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../interactive_remote_session.py | 36 +++----
.../remote_session/interactive_shell.py | 99 +++++++++++--------
dts/framework/remote_session/python_shell.py | 26 ++++-
dts/framework/remote_session/testpmd_shell.py | 61 +++++++++---
4 files changed, 150 insertions(+), 72 deletions(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 9085a668e8..c1bf30ac61 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -22,27 +22,23 @@
class InteractiveRemoteSession:
"""SSH connection dedicated to interactive applications.
- This connection is created using paramiko and is a persistent connection to the
- host. This class defines methods for connecting to the node and configures this
- connection to send "keep alive" packets every 30 seconds. Because paramiko attempts
- to use SSH keys to establish a connection first, providing a password is optional.
- This session is utilized by InteractiveShells and cannot be interacted with
- directly.
-
- Arguments:
- node_config: Configuration class for the node you are connecting to.
- _logger: Desired logger for this session to use.
+ The connection is created using `paramiko <https://docs.paramiko.org/en/latest/>`_
+ and is a persistent connection to the host. This class defines the methods for connecting
+ to the node and configures the connection to send "keep alive" packets every 30 seconds.
+ Because paramiko attempts to use SSH keys to establish a connection first, providing
+ a password is optional. This session is utilized by InteractiveShells
+ and cannot be interacted with directly.
Attributes:
- hostname: Hostname that will be used to initialize a connection to the node.
- ip: A subsection of hostname that removes the port for the connection if there
+ hostname: The hostname that will be used to initialize a connection to the node.
+ ip: A subsection of `hostname` that removes the port for the connection if there
is one. If there is no port, this will be the same as hostname.
- port: Port to use for the ssh connection. This will be extracted from the
- hostname if there is a port included, otherwise it will default to 22.
+ port: Port to use for the ssh connection. This will be extracted from `hostname`
+ if there is a port included, otherwise it will default to ``22``.
username: User to connect to the node with.
password: Password of the user connecting to the host. This will default to an
empty string if a password is not provided.
- session: Underlying paramiko connection.
+ session: The underlying paramiko connection.
Raises:
SSHConnectionError: There is an error creating the SSH connection.
@@ -58,9 +54,15 @@ class InteractiveRemoteSession:
_node_config: NodeConfiguration
_transport: Transport | None
- def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
+ def __init__(self, node_config: NodeConfiguration, logger: DTSLOG) -> None:
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+ """
self._node_config = node_config
- self._logger = _logger
+ self._logger = logger
self.hostname = node_config.hostname
self.username = node_config.user
self.password = node_config.password if node_config.password else ""
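The ``hostname``/``ip``/``port`` attributes described above imply splitting an optional ``:port`` suffix off the configured hostname. A sketch of that parsing (the helper name is hypothetical, and IPv6 literals would need extra care):

```python
def split_host_port(hostname: str, default_port: int = 22) -> tuple[str, int]:
    """Split an optional ':port' suffix off a hostname; default to SSH's 22."""
    if ":" in hostname:
        host, port = hostname.rsplit(":", 1)
        return host, int(port)
    return hostname, default_port


assert split_host_port("10.0.0.1:2222") == ("10.0.0.1", 2222)
assert split_host_port("sut.example.com") == ("sut.example.com", 22)
```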
diff --git a/dts/framework/remote_session/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
index c24376b2a8..a98a822e91 100644
--- a/dts/framework/remote_session/interactive_shell.py
+++ b/dts/framework/remote_session/interactive_shell.py
@@ -3,18 +3,20 @@
"""Common functionality for interactive shell handling.
-This base class, InteractiveShell, is meant to be extended by other classes that
-contain functionality specific to that shell type. These derived classes will often
-modify things like the prompt to expect or the arguments to pass into the application,
-but still utilize the same method for sending a command and collecting output. How
-this output is handled however is often application specific. If an application needs
-elevated privileges to start it is expected that the method for gaining those
-privileges is provided when initializing the class.
+The base class, :class:`InteractiveShell`, is meant to be extended by subclasses that contain
+functionality specific to that shell type. These subclasses will often modify things like
+the prompt to expect or the arguments to pass into the application, but still utilize
+the same method for sending a command and collecting output. How this output is handled, however,
+is often application-specific. If an application needs elevated privileges to start, it is expected
+that the method for gaining those privileges is provided when initializing the class.
+
+The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+environment variable configure the timeout of getting the output from command execution.
"""
from abc import ABC
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from paramiko import Channel, SSHClient, channel # type: ignore[import]
@@ -30,28 +32,6 @@ class InteractiveShell(ABC):
and collecting input until reaching a certain prompt. All interactive applications
will use the same SSH connection, but each will create their own channel on that
session.
-
- Arguments:
- interactive_session: The SSH session dedicated to interactive shells.
- logger: Logger used for displaying information in the console.
- get_privileged_command: Method for modifying a command to allow it to use
- elevated privileges. If this is None, the application will not be started
- with elevated privileges.
- app_args: Command line arguments to be passed to the application on startup.
- timeout: Timeout used for the SSH channel that is dedicated to this interactive
- shell. This timeout is for collecting output, so if reading from the buffer
- and no output is gathered within the timeout, an exception is thrown.
-
- Attributes
- _default_prompt: Prompt to expect at the end of output when sending a command.
- This is often overridden by derived classes.
- _command_extra_chars: Extra characters to add to the end of every command
- before sending them. This is often overridden by derived classes and is
- most commonly an additional newline character.
- path: Path to the executable to start the interactive application.
- dpdk_app: Whether this application is a DPDK app. If it is, the build
- directory for DPDK on the node will be prepended to the path to the
- executable.
"""
_interactive_session: SSHClient
@@ -61,10 +41,22 @@ class InteractiveShell(ABC):
_logger: DTSLOG
_timeout: float
_app_args: str
- _default_prompt: str = ""
- _command_extra_chars: str = ""
- path: PurePath
- dpdk_app: bool = False
+
+ #: Prompt to expect at the end of output when sending a command.
+ #: This is often overridden by subclasses.
+ _default_prompt: ClassVar[str] = ""
+
+ #: Extra characters to add to the end of every command
+ #: before sending them. This is often overridden by subclasses and is
+ #: most commonly an additional newline character.
+ _command_extra_chars: ClassVar[str] = ""
+
+ #: Path to the executable to start the interactive application.
+ path: ClassVar[PurePath]
+
+ #: Whether this application is a DPDK app. If it is, the build directory
+ #: for DPDK on the node will be prepended to the path to the executable.
+ dpdk_app: ClassVar[bool] = False
def __init__(
self,
@@ -74,6 +66,19 @@ def __init__(
app_args: str = "",
timeout: float = SETTINGS.timeout,
) -> None:
+ """Create an SSH channel during initialization.
+
+ Args:
+ interactive_session: The SSH session dedicated to interactive shells.
+ logger: The logger instance this session will use.
+ get_privileged_command: A method for modifying a command to allow it to use
+ elevated privileges. If :data:`None`, the application will not be started
+ with elevated privileges.
+ app_args: The command line arguments to be passed to the application on startup.
+ timeout: The timeout used for the SSH channel that is dedicated to this interactive
+ shell. This timeout is for collecting output, so if reading from the buffer
+ and no output is gathered within the timeout, an exception is thrown.
+ """
self._interactive_session = interactive_session
self._ssh_channel = self._interactive_session.invoke_shell()
self._stdin = self._ssh_channel.makefile_stdin("w")
@@ -92,6 +97,10 @@ def _start_application(
This method is often overridden by subclasses as their process for
starting may look different.
+
+ Args:
+ get_privileged_command: A function (but could be any callable) that produces
+ the version of the command with elevated privileges.
"""
start_command = f"{self.path} {self._app_args}"
if get_privileged_command is not None:
@@ -99,16 +108,24 @@ def _start_application(
self.send_command(start_command)
def send_command(self, command: str, prompt: str | None = None) -> str:
- """Send a command and get all output before the expected ending string.
+ """Send `command` and get all output before the expected ending string.
Lines that expect input are not included in the stdout buffer, so they cannot
- be used for expect. For example, if you were prompted to log into something
- with a username and password, you cannot expect "username:" because it won't
- yet be in the stdout buffer. A workaround for this could be consuming an
- extra newline character to force the current prompt into the stdout buffer.
+ be used for expect.
+
+ Example:
+ If you were prompted to log into something with a username and password,
+ you cannot expect ``username:`` because it won't yet be in the stdout buffer.
+ A workaround for this could be consuming an extra newline character to force
+ the current `prompt` into the stdout buffer.
+
+ Args:
+ command: The command to send.
+ prompt: After sending the command, `send_command` will be expecting this string.
+ If :data:`None`, will use the class's default prompt.
Returns:
- All output in the buffer before expected string
+ All output in the buffer before expected string.
"""
self._logger.info(f"Sending: '{command}'")
if prompt is None:
@@ -126,8 +143,10 @@ def send_command(self, command: str, prompt: str | None = None) -> str:
return out
def close(self) -> None:
+ """Properly free all resources."""
self._stdin.close()
self._ssh_channel.close()
def __del__(self) -> None:
+ """Make sure the session is properly closed before deleting the object."""
self.close()
diff --git a/dts/framework/remote_session/python_shell.py b/dts/framework/remote_session/python_shell.py
index cc3ad48a68..c8e5957ef7 100644
--- a/dts/framework/remote_session/python_shell.py
+++ b/dts/framework/remote_session/python_shell.py
@@ -1,12 +1,32 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Python interactive shell.
+
+Typical usage example in a TestSuite::
+
+ from framework.remote_session import PythonShell
+ python_shell = self.tg_node.create_interactive_shell(
+ PythonShell, timeout=5, privileged=True
+ )
+ python_shell.send_command("print('Hello World')")
+ python_shell.close()
+"""
+
from pathlib import PurePath
+from typing import ClassVar
from .interactive_shell import InteractiveShell
class PythonShell(InteractiveShell):
- _default_prompt: str = ">>>"
- _command_extra_chars: str = "\n"
- path: PurePath = PurePath("python3")
+ """Python interactive shell."""
+
+ #: Python's prompt.
+ _default_prompt: ClassVar[str] = ">>>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
+
+ #: The Python executable.
+ path: ClassVar[PurePath] = PurePath("python3")
diff --git a/dts/framework/remote_session/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
index 1455b5a199..2632515d74 100644
--- a/dts/framework/remote_session/testpmd_shell.py
+++ b/dts/framework/remote_session/testpmd_shell.py
@@ -1,45 +1,82 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Testpmd interactive shell.
+
+Typical usage example in a TestSuite::
+
+ testpmd_shell = self.sut_node.create_interactive_shell(
+ TestPmdShell, privileged=True
+ )
+ devices = testpmd_shell.get_devices()
+ for device in devices:
+ print(device)
+ testpmd_shell.close()
+"""
+
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from .interactive_shell import InteractiveShell
class TestPmdDevice(object):
+ """The data of a device that testpmd can recognize.
+
+ Attributes:
+ pci_address: The PCI address of the device.
+ """
+
pci_address: str
def __init__(self, pci_address_line: str):
+ """Initialize the device from the testpmd output line string.
+
+ Args:
+ pci_address_line: A line of testpmd output that contains a device.
+ """
self.pci_address = pci_address_line.strip().split(": ")[1].strip()
def __str__(self) -> str:
+ """The PCI address captures what the device is."""
return self.pci_address
class TestPmdShell(InteractiveShell):
- path: PurePath = PurePath("app", "dpdk-testpmd")
- dpdk_app: bool = True
- _default_prompt: str = "testpmd>"
- _command_extra_chars: str = (
- "\n" # We want to append an extra newline to every command
- )
+ """Testpmd interactive shell.
+
+ Users of the testpmd shell should never use
+ the :meth:`~framework.remote_session.interactive_shell.InteractiveShell.send_command` method
+ directly, but rather call specialized methods. If there isn't one that satisfies a need,
+ it should be added.
+ """
+
+ #: The path to the testpmd executable.
+ path: ClassVar[PurePath] = PurePath("app", "dpdk-testpmd")
+
+ #: Flag this as a DPDK app so that it's clear this is not a system app and
+ #: needs to be looked for in a specific path.
+ dpdk_app: ClassVar[bool] = True
+
+ #: The testpmd's prompt.
+ _default_prompt: ClassVar[str] = "testpmd>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
def _start_application(
self, get_privileged_command: Callable[[str], str] | None
) -> None:
- """See "_start_application" in InteractiveShell."""
self._app_args += " -- -i"
super()._start_application(get_privileged_command)
def get_devices(self) -> list[TestPmdDevice]:
- """Get a list of device names that are known to testpmd
+ """Get a list of device names that are known to testpmd.
- Uses the device info listed in testpmd and then parses the output to
- return only the names of the devices.
+ Uses the device info listed in testpmd and then parses the output.
Returns:
- A list of strings representing device names (e.g. 0000:14:00.1)
+ A list of devices.
"""
dev_info: str = self.send_command("show device info all")
dev_list: list[TestPmdDevice] = []
--
2.34.1
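The expect-style collection loop that the updated `send_command` docstring describes (consume lines from the channel's stdout buffer until the configured prompt appears, return everything before it) can be sketched as a rough, stand-alone illustration. The helper name and the use of a plain file-like object in place of a paramiko channel are assumptions for the sketch, not the framework's actual implementation:

```python
import io
from typing import IO


def read_until_prompt(stdout: IO[str], prompt: str) -> str:
    """Collect output until a line containing `prompt` is read.

    Mirrors the behavior documented for send_command(): lines are
    consumed from the stdout buffer and everything before the expected
    prompt is returned; the prompt line itself ends the collection.
    """
    out = ""
    for line in stdout:
        if prompt in line:
            break
        out += line
    return out


# Simulate a channel buffer after a command was sent to testpmd.
fake_buffer = io.StringIO("device info line 1\ndevice info line 2\ntestpmd> ")
output = read_until_prompt(fake_buffer, "testpmd>")
```

This also shows why a prompt that hasn't yet reached the stdout buffer (e.g. a `username:` login prompt) can't be expected, as the docstring's example warns.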
* [PATCH v6 13/23] dts: port and virtual device docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (10 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 12/23] dts: interactive " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 14/23] dts: cpu " Juraj Linkeš
` (9 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/__init__.py | 16 ++++--
dts/framework/testbed_model/port.py | 53 +++++++++++++++----
dts/framework/testbed_model/virtual_device.py | 17 +++++-
3 files changed, 71 insertions(+), 15 deletions(-)
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 8ced05653b..a02be1f2d9 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -2,9 +2,19 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-This package contains the classes used to model the physical traffic generator,
-system under test and any other components that need to be interacted with.
+"""Testbed modelling.
+
+This package defines the testbed elements DTS works with:
+
+ * A system under test node: :class:`SutNode`,
+ * A traffic generator node: :class:`TGNode`,
+ * The ports of network interface cards (NICs) present on nodes: :class:`Port`,
+ * The logical cores of CPUs present on nodes: :class:`LogicalCore`,
+ * The virtual devices that can be created on nodes: :class:`VirtualDevice`,
+ * The operating systems running on nodes: :class:`LinuxSession` and :class:`PosixSession`.
+
+DTS needs to be able to connect to nodes and understand some of the hardware present on these nodes
+to properly build and test DPDK.
"""
# pylama:ignore=W0611
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 680c29bfe3..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -2,6 +2,13 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""NIC port model.
+
+Basic port information, such as location (ports are identified by their PCI address on a node),
+drivers and addresses.
+"""
+
+
from dataclasses import dataclass
from framework.config import PortConfig
@@ -9,24 +16,35 @@
@dataclass(slots=True, frozen=True)
class PortIdentifier:
+ """The port identifier.
+
+ Attributes:
+ node: The node where the port resides.
+ pci: The PCI address of the port on `node`.
+ """
+
node: str
pci: str
@dataclass(slots=True)
class Port:
- """
- identifier: The PCI address of the port on a node.
-
- os_driver: The driver used by this port when the OS is controlling it.
- Example: i40e
- os_driver_for_dpdk: The driver the device must be bound to for DPDK to use it,
- Example: vfio-pci.
+ """Physical port on a node.
- Note: os_driver and os_driver_for_dpdk may be the same thing.
- Example: mlx5_core
+ The ports are identified by the node they're on and their PCI addresses. The port on the other
+ side of the connection is also captured here.
+ Each port is serviced by a driver, which may be different for the operating system (`os_driver`)
+ and for DPDK (`os_driver_for_dpdk`). For some devices, they are the same, e.g.: ``mlx5_core``.
- peer: The identifier of a port this port is connected with.
+ Attributes:
+ identifier: The PCI address of the port on a node.
+ os_driver: The operating system driver name when the operating system controls the port,
+ e.g.: ``i40e``.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK, e.g.: ``vfio-pci``.
+ peer: The identifier of a port this port is connected with.
+ The `peer` is on a different node.
+ mac_address: The MAC address of the port.
+ logical_name: The logical name of the port. Must be discovered.
"""
identifier: PortIdentifier
@@ -37,6 +55,12 @@ class Port:
logical_name: str = ""
def __init__(self, node_name: str, config: PortConfig):
+ """Initialize the port from `node_name` and `config`.
+
+ Args:
+ node_name: The name of the port's node.
+ config: The test run configuration of the port.
+ """
self.identifier = PortIdentifier(
node=node_name,
pci=config.pci,
@@ -47,14 +71,23 @@ def __init__(self, node_name: str, config: PortConfig):
@property
def node(self) -> str:
+ """The node where the port resides."""
return self.identifier.node
@property
def pci(self) -> str:
+ """The PCI address of the port."""
return self.identifier.pci
@dataclass(slots=True, frozen=True)
class PortLink:
+ """The physical, cabled connection between the ports.
+
+ Attributes:
+ sut_port: The port on the SUT node connected to `tg_port`.
+ tg_port: The port on the TG node connected to `sut_port`.
+ """
+
sut_port: Port
tg_port: Port
diff --git a/dts/framework/testbed_model/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
index eb664d9f17..e9b5e9c3be 100644
--- a/dts/framework/testbed_model/virtual_device.py
+++ b/dts/framework/testbed_model/virtual_device.py
@@ -1,16 +1,29 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Virtual devices model.
+
+Alongside support for physical hardware, DPDK can create various virtual devices.
+"""
+
class VirtualDevice(object):
- """
- Base class for virtual devices used by DPDK.
+ """Base class for virtual devices used by DPDK.
+
+ Attributes:
+ name: The name of the virtual device.
"""
name: str
def __init__(self, name: str):
+ """Initialize the virtual device.
+
+ Args:
+ name: The name of the virtual device.
+ """
self.name = name
def __str__(self) -> str:
+ """This corresponds to the name used for DPDK devices."""
return self.name
--
2.34.1
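The pattern this patch documents, an immutable identifier dataclass plus convenience properties on the owning object, can be illustrated with a minimal sketch. The field values are made up and `slots=True` (which the patch uses) is omitted here for brevity:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PortIdentifier:
    """Immutable identifier: the node name plus the port's PCI address."""

    node: str
    pci: str


@dataclass
class Port:
    """A port exposing its identifier's fields as convenience properties."""

    identifier: PortIdentifier

    @property
    def node(self) -> str:
        return self.identifier.node

    @property
    def pci(self) -> str:
        return self.identifier.pci


port = Port(PortIdentifier(node="sut1", pci="0000:14:00.1"))
```

Freezing the identifier keeps the (node, PCI) pair hashable and safe to share, while the properties spare callers from reaching through `identifier` directly.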
* [PATCH v6 14/23] dts: cpu docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (11 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 13/23] dts: port and virtual device " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 15/23] dts: os session " Juraj Linkeš
` (8 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/cpu.py | 196 +++++++++++++++++++++--------
1 file changed, 144 insertions(+), 52 deletions(-)
diff --git a/dts/framework/testbed_model/cpu.py b/dts/framework/testbed_model/cpu.py
index 8fe785dfe4..4edeb4a7c2 100644
--- a/dts/framework/testbed_model/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -1,6 +1,22 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""CPU core representation and filtering.
+
+This module provides a unified representation of logical CPU cores along
+with filtering capabilities.
+
+When symmetric multiprocessing (SMP or multithreading) is enabled on a server,
+the physical CPU cores are split into logical CPU cores with different IDs.
+
+:class:`LogicalCoreCountFilter` filters by the number of logical cores. It's possible to specify
+the socket from which to filter the number of logical cores. It's also possible to not use all
+logical CPU cores from each physical core (e.g. only the first logical core of each physical core).
+
+:class:`LogicalCoreListFilter` filters by logical core IDs. This mostly checks that
+the logical cores are actually present on the server.
+"""
+
import dataclasses
from abc import ABC, abstractmethod
from collections.abc import Iterable, ValuesView
@@ -11,9 +27,17 @@
@dataclass(slots=True, frozen=True)
class LogicalCore(object):
- """
- Representation of a CPU core. A physical core is represented in OS
- by multiple logical cores (lcores) if CPU multithreading is enabled.
+ """Representation of a logical CPU core.
+
+ A physical core is represented in OS by multiple logical cores (lcores)
+ if CPU multithreading is enabled. When multithreading is disabled, their IDs are the same.
+
+ Attributes:
+ lcore: The logical core ID of a CPU core. It's the same as `core` with
+ disabled multithreading.
+ core: The physical core ID of a CPU core.
+ socket: The physical socket ID where the CPU resides.
+ node: The NUMA node ID where the CPU resides.
"""
lcore: int
@@ -22,27 +46,36 @@ class LogicalCore(object):
node: int
def __int__(self) -> int:
+ """The CPU is best represented by the logical core, as that's what we configure in EAL."""
return self.lcore
class LogicalCoreList(object):
- """
- Convert these options into a list of logical core ids.
- lcore_list=[LogicalCore1, LogicalCore2] - a list of LogicalCores
- lcore_list=[0,1,2,3] - a list of int indices
- lcore_list=['0','1','2-3'] - a list of str indices; ranges are supported
- lcore_list='0,1,2-3' - a comma delimited str of indices; ranges are supported
-
- The class creates a unified format used across the framework and allows
- the user to use either a str representation (using str(instance) or directly
- in f-strings) or a list representation (by accessing instance.lcore_list).
- Empty lcore_list is allowed.
+ r"""A unified way to store :class:`LogicalCore`\s.
+
+ Create a unified format used across the framework and allow the user to use
+ either a :class:`str` representation (using ``str(instance)`` or directly in f-strings)
+ or a :class:`list` representation (by accessing the `lcore_list` property,
+ which stores logical core IDs).
"""
_lcore_list: list[int]
_lcore_str: str
def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
+ """Process `lcore_list`, then sort.
+
+ There are four supported logical core list formats::
+
+ lcore_list=[LogicalCore1, LogicalCore2] # a list of LogicalCores
+ lcore_list=[0,1,2,3] # a list of int indices
+ lcore_list=['0','1','2-3'] # a list of str indices; ranges are supported
+ lcore_list='0,1,2-3' # a comma delimited str of indices; ranges are supported
+
+ Args:
+ lcore_list: Various ways to represent multiple logical cores.
+ Empty `lcore_list` is allowed.
+ """
self._lcore_list = []
if isinstance(lcore_list, str):
lcore_list = lcore_list.split(",")
@@ -60,6 +93,7 @@ def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
@property
def lcore_list(self) -> list[int]:
+ """The logical core IDs."""
return self._lcore_list
def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
@@ -89,28 +123,30 @@ def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
return formatted_core_list
def __str__(self) -> str:
+ """The consecutive ranges of logical core IDs."""
return self._lcore_str
@dataclasses.dataclass(slots=True, frozen=True)
class LogicalCoreCount(object):
- """
- Define the number of logical cores to use.
- If sockets is not None, socket_count is ignored.
- """
+ """Define the number of logical cores per physical cores per sockets."""
+ #: Use this many logical cores per each physical core.
lcores_per_core: int = 1
+ #: Use this many physical cores per each socket.
cores_per_socket: int = 2
+ #: Use this many sockets.
socket_count: int = 1
+ #: Use exactly these sockets. This takes precedence over `socket_count`,
+ #: so when `sockets` is not :data:`None`, `socket_count` is ignored.
sockets: list[int] | None = None
class LogicalCoreFilter(ABC):
- """
- Filter according to the input filter specifier. Each filter needs to be
- implemented in a derived class.
- This class only implements operations common to all filters, such as sorting
- the list to be filtered beforehand.
+ """Common filtering class.
+
+ Each filter needs to be implemented in a subclass. This base class sorts the list of cores
+ and defines the filtering method, which must be implemented by subclasses.
"""
_filter_specifier: LogicalCoreCount | LogicalCoreList
@@ -122,6 +158,17 @@ def __init__(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
):
+ """Filter according to the input filter specifier.
+
+ The input `lcore_list` is copied and sorted by physical core before filtering.
+ The list is copied so that the original is left intact.
+
+ Args:
+ lcore_list: The logical CPU cores to filter.
+ filter_specifier: Filter cores from `lcore_list` according to this filter.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+ """
self._filter_specifier = filter_specifier
# sorting by core is needed in case hyperthreading is enabled
@@ -132,31 +179,45 @@ def __init__(
@abstractmethod
def filter(self) -> list[LogicalCore]:
- """
- Use self._filter_specifier to filter self._lcores_to_filter
- and return the list of filtered LogicalCores.
- self._lcores_to_filter is a sorted copy of the original list,
- so it may be modified.
+ r"""Filter the cores.
+
+ Use `self._filter_specifier` to filter `self._lcores_to_filter` and return
+ the filtered :class:`LogicalCore`\s.
+ `self._lcores_to_filter` is a sorted copy of the original list, so it may be modified.
+
+ Returns:
+ The filtered cores.
"""
class LogicalCoreCountFilter(LogicalCoreFilter):
- """
+ """Filter cores by specified counts.
+
Filter the input list of LogicalCores according to specified rules:
- Use cores from the specified number of sockets or from the specified socket ids.
- If sockets is specified, it takes precedence over socket_count.
- From each of those sockets, use only cores_per_socket of cores.
- And for each core, use lcores_per_core of logical cores. Hypertheading
- must be enabled for this to take effect.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+
+ * The input `filter_specifier` is :class:`LogicalCoreCount`,
+ * Use cores from the specified number of sockets or from the specified socket ids,
+ * If `sockets` is specified, it takes precedence over `socket_count`,
+ * From each of those sockets, use only `cores_per_socket` of cores,
+ * And for each core, use `lcores_per_core` of logical cores. Hyperthreading
+ must be enabled for this to take effect.
"""
_filter_specifier: LogicalCoreCount
def filter(self) -> list[LogicalCore]:
+ """Filter the cores according to :class:`LogicalCoreCount`.
+
+ Start by filtering the allowed sockets. The cores matching the allowed socket are returned.
+ The cores of each socket are stored in separate lists.
+
+ Then filter the allowed physical cores from those lists of cores per socket. When filtering
+ physical cores, store the desired number of logical cores per physical core which then
+ together constitute the final filtered list.
+
+ Returns:
+ The filtered cores.
+ """
sockets_to_filter = self._filter_sockets(self._lcores_to_filter)
filtered_lcores = []
for socket_to_filter in sockets_to_filter:
@@ -166,24 +227,37 @@ def filter(self) -> list[LogicalCore]:
def _filter_sockets(
self, lcores_to_filter: Iterable[LogicalCore]
) -> ValuesView[list[LogicalCore]]:
- """
- Remove all lcores that don't match the specified socket(s).
- If self._filter_specifier.sockets is not None, keep lcores from those sockets,
- otherwise keep lcores from the first
- self._filter_specifier.socket_count sockets.
+ """Filter a list of cores per each allowed socket.
+
+ The sockets may be specified in two ways, either a number or a specific list of sockets.
+ In case of a specific list, we just need to return the cores from those sockets.
+ If filtering by a number of sockets, we need to go through all cores and note which sockets
+ appear and only filter from the first n that appear.
+
+ Args:
+ lcores_to_filter: The cores to filter. These must be sorted by the physical core.
+
+ Returns:
+ A list of lists of logical CPU cores. Each list contains cores from one socket.
"""
allowed_sockets: set[int] = set()
socket_count = self._filter_specifier.socket_count
if self._filter_specifier.sockets:
+ # when sockets in filter is specified, the sockets are already set
socket_count = len(self._filter_specifier.sockets)
allowed_sockets = set(self._filter_specifier.sockets)
+ # filter socket_count sockets from all sockets by checking the socket of each CPU
filtered_lcores: dict[int, list[LogicalCore]] = {}
for lcore in lcores_to_filter:
if not self._filter_specifier.sockets:
+ # this is when sockets is not set, so we do the actual filtering
+ # when it is set, allowed_sockets is already defined and can't be changed
if len(allowed_sockets) < socket_count:
+ # allowed_sockets is a set, so adding an existing socket won't re-add it
allowed_sockets.add(lcore.socket)
if lcore.socket in allowed_sockets:
+ # separate sockets per socket; this makes it easier in further processing
if lcore.socket in filtered_lcores:
filtered_lcores[lcore.socket].append(lcore)
else:
@@ -200,12 +274,13 @@ def _filter_sockets(
def _filter_cores_from_socket(
self, lcores_to_filter: Iterable[LogicalCore]
) -> list[LogicalCore]:
- """
- Keep only the first self._filter_specifier.cores_per_socket cores.
- In multithreaded environments, keep only
- the first self._filter_specifier.lcores_per_core lcores of those cores.
- """
+ """Filter a list of cores from the given socket.
+
+ Go through the cores and note how many logical cores per physical core have been filtered.
+ Returns:
+ The filtered logical CPU cores.
+ """
# no need to use ordered dict, from Python3.7 the dict
# insertion order is preserved (LIFO).
lcore_count_per_core_map: dict[int, int] = {}
@@ -248,15 +323,21 @@ def _filter_cores_from_socket(
class LogicalCoreListFilter(LogicalCoreFilter):
- """
- Filter the input list of Logical Cores according to the input list of
- lcore indices.
- An empty LogicalCoreList won't filter anything.
+ """Filter the logical CPU cores by logical CPU core IDs.
+
+ This is a simple filter that looks at logical CPU IDs and keeps only those that match.
+
+ The input filter is :class:`LogicalCoreList`. An empty LogicalCoreList won't filter anything.
"""
_filter_specifier: LogicalCoreList
def filter(self) -> list[LogicalCore]:
+ """Filter based on logical CPU core ID.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
if not len(self._filter_specifier.lcore_list):
return self._lcores_to_filter
@@ -279,6 +360,17 @@ def lcore_filter(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool,
) -> LogicalCoreFilter:
+ """Factory for using the right filter with `filter_specifier`.
+
+ Args:
+ core_list: The logical CPU cores to filter.
+ filter_specifier: The filter to use.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+
+ Returns:
+ The filter matching `filter_specifier`.
+ """
if isinstance(filter_specifier, LogicalCoreList):
return LogicalCoreListFilter(core_list, filter_specifier, ascending)
elif isinstance(filter_specifier, LogicalCoreCount):
--
2.34.1
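The four input formats that the new `LogicalCoreList.__init__` docstring lists all converge on a sorted list of integer core IDs. A rough sketch of the range-expansion step is shown below; the helper name is illustrative, and the `LogicalCore`-instance and empty-list cases the framework supports are left out:

```python
from __future__ import annotations


def expand_lcore_list(lcore_list: list[int] | list[str] | str) -> list[int]:
    """Expand the supported input formats into a sorted list of core IDs.

    Accepts a comma-delimited string, a list of ints, or a list of strings
    where each item is either an index or an inclusive 'low-high' range.
    """
    if isinstance(lcore_list, str):
        lcore_list = lcore_list.split(",")
    lcores: set[int] = set()
    for item in lcore_list:
        if isinstance(item, str) and "-" in item:
            low, high = map(int, item.split("-"))
            lcores.update(range(low, high + 1))  # ranges are inclusive
        else:
            lcores.add(int(item))
    return sorted(lcores)
```

For example, `'0,1,2-3'` and `['0', '1', '2-3']` both expand to the same IDs, which is what lets the class offer a unified `str` and `list` view.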
* [PATCH v6 15/23] dts: os session docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (12 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 14/23] dts: cpu " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 16/23] dts: posix and linux sessions " Juraj Linkeš
` (7 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/os_session.py | 275 ++++++++++++++++------
1 file changed, 208 insertions(+), 67 deletions(-)
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 76e595a518..bad75d52e7 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -2,6 +2,29 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""OS-aware remote session.
+
+DPDK runs on multiple operating systems. This module defines
+the common API that OS-unaware layers use and translates the API into
+OS-aware calls/utility usage.
+
+Note:
+ Running commands with administrative privileges requires OS awareness. This is the only layer
+ that's aware of OS differences, so this is where non-privileged commands get converted
+ to privileged commands.
+
+Example:
+ A user wishes to remove a directory on
+ a remote :class:`~framework.testbed_model.sut_node.SutNode`.
+ The :class:`~framework.testbed_model.sut_node.SutNode` object isn't aware of what OS the node
+ is running; it delegates the OS translation logic
+ to :attr:`~framework.testbed_model.node.Node.main_session`. The SUT node calls
+ :meth:`~OSSession.remove_remote_dir` with a generic, OS-unaware path and
+ the :attr:`~framework.testbed_model.node.Node.main_session` translates that
+ to ``rm -rf`` if the node's OS is Linux and other commands for other OSs.
+ It also translates the path to match the underlying OS.
+"""
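The delegation described in the docstring above can be sketched as a small class hierarchy. This is an illustration only; the class names `SketchOSSession` and `SketchLinuxSession` are invented here and are not part of DTS:

```python
from abc import ABC, abstractmethod


class SketchOSSession(ABC):
    """Illustrative stand-in for OSSession's translation role."""

    @abstractmethod
    def remove_remote_dir(self, remote_dir_path: str) -> str:
        """Return the OS-specific command that removes `remote_dir_path`."""


class SketchLinuxSession(SketchOSSession):
    def remove_remote_dir(self, remote_dir_path: str) -> str:
        # On Linux, the OS-unaware request translates to rm -rf.
        return f"rm -rf {remote_dir_path}"


# A node delegates to its main_session without knowing the node's OS:
main_session: SketchOSSession = SketchLinuxSession()
command = main_session.remove_remote_dir("/tmp/dpdk-dir")
```

The caller only ever sees the OS-unaware method; the concrete session picks the right utility.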
+
from abc import ABC, abstractmethod
from collections.abc import Iterable
from ipaddress import IPv4Interface, IPv6Interface
@@ -28,10 +51,16 @@
class OSSession(ABC):
- """
- The OS classes create a DTS node remote session and implement OS specific
+ """OS-unaware to OS-aware translation API definition.
+
+ The OSSession classes create a remote session to a DTS node and implement OS specific
behavior. There are a few control methods implemented by the base class; the rest need
- to be implemented by derived classes.
+ to be implemented by subclasses.
+
+ Attributes:
+ name: The name of the session.
+ remote_session: The remote session maintaining the connection to the node.
+ interactive_session: The interactive remote session maintaining the connection to the node.
"""
_config: NodeConfiguration
@@ -46,6 +75,15 @@ def __init__(
name: str,
logger: DTSLOG,
):
+ """Initialize the OS-aware session.
+
+ Connect to the node right away and also create an interactive remote session.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
self._config = node_config
self.name = name
self._logger = logger
@@ -53,15 +91,15 @@ def __init__(
self.interactive_session = create_interactive_session(node_config, logger)
def close(self, force: bool = False) -> None:
- """
- Close the remote session.
+ """Close the underlying remote session.
+
+ Args:
+ force: Force the closure of the connection.
"""
self.remote_session.close(force)
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the underlying remote session is still responding."""
return self.remote_session.is_alive()
def send_command(
@@ -72,10 +110,23 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- An all-purpose API in case the command to be executed is already
- OS-agnostic, such as when the path to the executed command has been
- constructed beforehand.
+ """An all-purpose API for OS-agnostic commands.
+
+ This can be used to execute a portable command that behaves the same way
+ on all operating systems, such as a ``python`` invocation.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds to execute the command.
+ privileged: Whether to run the command with administrative privileges.
+ verify: If :data:`True`, check the exit code of the command.
+ env: A dictionary with environment variables to be used with the command execution.
+
+ Raises:
+ RemoteCommandExecutionError: If `verify` is :data:`True` and the command failed.
"""
if privileged:
command = self._get_privileged_command(command)
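The privilege escalation step above is OS-specific. On Linux (shown later in this series) it wraps the command in `sudo`; a standalone sketch of that translation:

```python
def get_privileged_command(command: str) -> str:
    # The Linux translation from this patch series; other OSes would differ.
    return f"sudo -- sh -c '{command}'"


privileged = get_privileged_command("rm -rf /tmp/dpdk-dir")
```

The whole command runs under one shell, so pipes and redirections are also privileged.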
@@ -89,8 +140,20 @@ def create_interactive_shell(
privileged: bool,
app_args: str,
) -> InteractiveShellType:
- """
- See "create_interactive_shell" in SutNode
+ """Factory for interactive session handlers.
+
+ Instantiate `shell_cls` according to the remote OS specifics.
+
+ Args:
+ shell_cls: The class of the shell.
+ timeout: Timeout for reading output from the SSH channel. If no data is
+ received from the buffer within the timeout, an exception is raised.
+ privileged: Whether to run the shell with administrative privileges.
+ app_args: The arguments to be passed to the application.
+
+ Returns:
+ An instance of the desired interactive application shell.
"""
return shell_cls(
self.interactive_session.session,
@@ -114,27 +177,42 @@ def _get_privileged_command(command: str) -> str:
@abstractmethod
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
- """
- Try to find DPDK remote dir in remote_dir.
+ """Try to find DPDK directory in `remote_dir`.
+
+ The directory is the one which is created after the extraction of the tarball. The files
+ are usually extracted into a directory starting with ``dpdk-``.
+
+ Returns:
+ The absolute path of the DPDK remote directory, empty path if not found.
"""
@abstractmethod
def get_remote_tmp_dir(self) -> PurePath:
- """
- Get the path of the temporary directory of the remote OS.
+ """Get the path of the temporary directory of the remote OS.
+
+ Returns:
+ The absolute path of the temporary directory.
"""
@abstractmethod
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for the target architecture. Get
- information from the node if needed.
+ """Create extra environment variables needed for the target architecture.
+
+ Different architectures may require different configuration, such as setting 32-bit CFLAGS.
+
+ Returns:
+ A dictionary mapping environment variable names to their values.
"""
@abstractmethod
def join_remote_path(self, *args: str | PurePath) -> PurePath:
- """
- Join path parts using the path separator that fits the remote OS.
+ """Join path parts using the path separator that fits the remote OS.
+
+ Args:
+ args: Any number of paths to join.
+
+ Returns:
+ The resulting joined path.
"""
@abstractmethod
@@ -143,13 +221,13 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from the remote Node to the local filesystem.
+ """Copy a file from the remote node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote node associated with this remote
+ session to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
+ source_file: the file on the remote node.
destination_file: a file or directory path on the local filesystem.
"""
@@ -159,14 +237,14 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from local filesystem to the remote Node.
+ """Copy a file from local filesystem to the remote node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file`
+ on the remote node associated with this remote session.
Args:
source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ destination_file: a file or directory path on the remote node.
"""
@abstractmethod
@@ -176,8 +254,12 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
- """
- Remove remote directory, by default remove recursively and forcefully.
+ """Remove remote directory, by default remove recursively and forcefully.
+
+ Args:
+ remote_dir_path: The path of the directory to remove.
+ recursive: If :data:`True`, also remove all contents inside the directory.
+ force: If :data:`True`, ignore all warnings and try to remove at all costs.
"""
@abstractmethod
@@ -186,9 +268,12 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
- """
- Extract remote tarball in place. If expected_dir is a non-empty string, check
- whether the dir exists after extracting the archive.
+ """Extract remote tarball in its remote directory.
+
+ Args:
+ remote_tarball_path: The path of the tarball on the remote node.
+ expected_dir: If non-empty, check whether `expected_dir` exists after extracting
+ the archive.
"""
@abstractmethod
@@ -201,69 +286,119 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
- """
- Build DPDK in the input dir with specified environment variables and meson
- arguments.
+ """Build DPDK on the remote node.
+
+ An extracted DPDK tarball must be present on the node. The build consists of two steps::
+
+ meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
+ ninja -C remote_dpdk_build_dir
+
+ The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
+ environment variable configure the timeout of DPDK build.
+
+ Args:
+ env_vars: Use these environment variables when building DPDK.
+ meson_args: Use these meson arguments when building DPDK.
+ remote_dpdk_dir: The directory on the remote node where DPDK will be built.
+ remote_dpdk_build_dir: The target build directory on the remote node.
+ rebuild: If :data:`True`, do a subsequent build with ``meson configure`` instead
+ of ``meson setup``.
+ timeout: Wait at most this long in seconds for the build to execute.
"""
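The two-step flow from the docstring can be sketched as command construction. This is a simplification of `build_dpdk`; the `meson configure` argument order on rebuilds is an assumption here:

```python
def dpdk_build_commands(
    meson_args: str, dpdk_dir: str, build_dir: str, rebuild: bool
) -> list[str]:
    # Rebuilds reuse the existing build directory via `meson configure`;
    # fresh builds create it with `meson setup`.
    if rebuild:
        setup = f"meson configure {build_dir} {meson_args}"
    else:
        setup = f"meson setup {meson_args} {dpdk_dir} {build_dir}"
    return [setup, f"ninja -C {build_dir}"]


commands = dpdk_build_commands("-Dexamples=all", "dpdk", "dpdk/build", rebuild=False)
```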
@abstractmethod
def get_dpdk_version(self, version_path: str | PurePath) -> str:
- """
- Inspect DPDK version on the remote node from version_path.
+ """Inspect the DPDK version on the remote node.
+
+ Args:
+ version_path: The path to the VERSION file containing the DPDK version.
+
+ Returns:
+ The DPDK version.
"""
@abstractmethod
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
- """
- Compose a list of LogicalCores present on the remote node.
- If use_first_core is False, the first physical core won't be used.
+ r"""Get the list of :class:`~framework.testbed_model.cpu.LogicalCore`\s on the remote node.
+
+ Args:
+ use_first_core: If :data:`False`, the first physical core won't be used.
+
+ Returns:
+ The logical cores present on the node.
"""
@abstractmethod
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
- """
- Kill and cleanup all DPDK apps identified by dpdk_prefix_list. If
- dpdk_prefix_list is empty, attempt to find running DPDK apps to kill and clean.
+ """Kill and cleanup all DPDK apps.
+
+ Args:
+ dpdk_prefix_list: Kill all apps identified by `dpdk_prefix_list`.
+ If `dpdk_prefix_list` is empty, attempt to find running DPDK apps to kill and clean.
"""
@abstractmethod
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
- """
- Get the DPDK file prefix that will be used when running DPDK apps.
+ """Make OS-specific modification to the DPDK file prefix.
+
+ Args:
+ dpdk_prefix: The OS-unaware file prefix.
+
+ Returns:
+ The OS-specific file prefix.
"""
@abstractmethod
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
- """
- Get the node's Hugepage Size, configure the specified amount of hugepages
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Configure hugepages on the node.
+
+ Get the node's Hugepage Size, configure the specified count of hugepages
if needed and mount the hugepages if needed.
- If force_first_numa is True, configure hugepages just on the first socket.
+
+ Args:
+ hugepage_count: Configure this many hugepages.
+ force_first_numa: If :data:`True`, configure hugepages just on the first socket.
"""
@abstractmethod
def get_compiler_version(self, compiler_name: str) -> str:
- """
- Get installed version of compiler used for DPDK
+ """Get installed version of compiler used for DPDK.
+
+ Args:
+ compiler_name: The name of the compiler executable.
+
+ Returns:
+ The compiler's version.
"""
@abstractmethod
def get_node_info(self) -> NodeInfo:
- """
- Collect information about the node
+ """Collect additional information about the node.
+
+ Returns:
+ Node information.
"""
@abstractmethod
def update_ports(self, ports: list[Port]) -> None:
- """
- Get additional information about ports:
- Logical name (e.g. enp7s0) if applicable
- Mac address
+ """Get additional information about ports from the operating system and update them.
+
+ The additional information is:
+
+ * Logical name (e.g. ``enp7s0``) if applicable,
+ * Mac address.
+
+ Args:
+ ports: The ports to update.
"""
@abstractmethod
def configure_port_state(self, port: Port, enable: bool) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port` in the operating system.
+
+ Args:
+ port: The port to configure.
+ enable: If :data:`True`, enable the port, otherwise shut it down.
"""
@abstractmethod
@@ -273,12 +408,18 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
- """
- Configure (add or delete) an IP address of the input port.
+ """Configure an IP address on `port` in the operating system.
+
+ Args:
+ address: The address to configure.
+ port: The port to configure.
+ delete: If :data:`True`, remove the IP address, otherwise configure it.
"""
@abstractmethod
def configure_ipv4_forwarding(self, enable: bool) -> None:
- """
- Enable IPv4 forwarding in the underlying OS.
+ """Enable IPv4 forwarding in the operating system.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
"""
--
2.34.1
* [PATCH v6 16/23] dts: posix and linux sessions docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (13 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 15/23] dts: os session " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 17/23] dts: node " Juraj Linkeš
` (6 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/linux_session.py | 63 ++++++++++-----
dts/framework/testbed_model/posix_session.py | 81 +++++++++++++++++---
2 files changed, 113 insertions(+), 31 deletions(-)
diff --git a/dts/framework/testbed_model/linux_session.py b/dts/framework/testbed_model/linux_session.py
index f472bb8f0f..279954ff63 100644
--- a/dts/framework/testbed_model/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -2,6 +2,13 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""Linux OS translator.
+
+Translate OS-unaware calls into Linux calls/utilities. Most Linux distributions are largely
+POSIX compliant, so this module implements only the Linux-specific parts that the POSIX
+translation layer doesn't cover.
+"""
+
import json
from ipaddress import IPv4Interface, IPv6Interface
from typing import TypedDict, Union
@@ -17,43 +24,51 @@
class LshwConfigurationOutput(TypedDict):
+ """The relevant parts of ``lshw``'s ``configuration`` section."""
+
+ #:
link: str
class LshwOutput(TypedDict):
- """
- A model of the relevant information from json lshw output, e.g.:
- {
- ...
- "businfo" : "pci@0000:08:00.0",
- "logicalname" : "enp8s0",
- "version" : "00",
- "serial" : "52:54:00:59:e1:ac",
- ...
- "configuration" : {
- ...
- "link" : "yes",
- ...
- },
- ...
+ """A model of the relevant information from ``lshw``'s json output.
+
+ e.g.::
+
+ {
+ ...
+ "businfo" : "pci@0000:08:00.0",
+ "logicalname" : "enp8s0",
+ "version" : "00",
+ "serial" : "52:54:00:59:e1:ac",
+ ...
+ "configuration" : {
+ ...
+ "link" : "yes",
+ ...
+ },
+ ...
"""
+ #:
businfo: str
+ #:
logicalname: NotRequired[str]
+ #:
serial: NotRequired[str]
+ #:
configuration: LshwConfigurationOutput
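A minimal sketch of pulling the modeled fields out of such output, with sample values taken from the docstring above; since ``logicalname`` and ``serial`` are ``NotRequired``, reads should use ``.get()``:

```python
import json

# A trimmed sample of one device from `lshw -json`, as in the docstring above.
sample = """
{
    "businfo": "pci@0000:08:00.0",
    "logicalname": "enp8s0",
    "version": "00",
    "serial": "52:54:00:59:e1:ac",
    "configuration": {"link": "yes"}
}
"""
device = json.loads(sample)
# logicalname and serial may be absent, hence the defaults.
logical_name = device.get("logicalname", "")
mac_address = device.get("serial", "")
link_up = device["configuration"]["link"] == "yes"
```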
class LinuxSession(PosixSession):
- """
- The implementation of non-Posix compliant parts of Linux remote sessions.
- """
+ """The implementation of non-Posix compliant parts of Linux."""
@staticmethod
def _get_privileged_command(command: str) -> str:
return f"sudo -- sh -c '{command}'"
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_cpus`."""
cpu_info = self.send_command("lscpu -p=CPU,CORE,SOCKET,NODE|grep -v \\#").stdout
lcores = []
for cpu_line in cpu_info.splitlines():
@@ -65,18 +80,20 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
return lcores
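The ``lscpu`` parsing above can be sketched as follows; plain tuples stand in for `LogicalCore` objects, and the real method additionally honors `use_first_core`:

```python
# Sample `lscpu -p=CPU,CORE,SOCKET,NODE` output with the comment lines
# already filtered out by the grep in the command above.
cpu_info = "0,0,0,0\n1,1,0,0\n2,2,0,0\n3,3,0,0"

lcores = []
for cpu_line in cpu_info.splitlines():
    # Each line is lcore,core,socket,numa-node as integers.
    lcore, core, socket, node = map(int, cpu_line.split(","))
    lcores.append((lcore, core, socket, node))
```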
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return dpdk_prefix
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.setup_hugepages`."""
self._logger.info("Getting Hugepage information.")
hugepage_size = self._get_hugepage_size()
hugepages_total = self._get_hugepages_total()
self._numa_nodes = self._get_numa_nodes()
- if force_first_numa or hugepages_total != hugepage_amount:
+ if force_first_numa or hugepages_total != hugepage_count:
# when forcing numa, we need to clear existing hugepages regardless
# of size, so they can be moved to the first numa node
- self._configure_huge_pages(hugepage_amount, hugepage_size, force_first_numa)
+ self._configure_huge_pages(hugepage_count, hugepage_size, force_first_numa)
else:
self._logger.info("Hugepages already configured.")
self._mount_huge_pages()
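The reconfiguration decision above reduces to a small predicate. This is a sketch; the function name is illustrative, not part of DTS:

```python
def needs_hugepage_reconfiguration(
    hugepages_total: int, hugepage_count: int, force_first_numa: bool
) -> bool:
    # When forcing the first NUMA node, existing hugepages must be cleared
    # regardless of the current count so they can be moved to that node.
    return force_first_numa or hugepages_total != hugepage_count


reconfigure = needs_hugepage_reconfiguration(0, 256, False)
```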
@@ -140,6 +157,7 @@ def _configure_huge_pages(
)
def update_ports(self, ports: list[Port]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.update_ports`."""
self._logger.debug("Gathering port info.")
for port in ports:
assert (
@@ -178,6 +196,7 @@ def _update_port_attr(
)
def configure_port_state(self, port: Port, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_state`."""
state = "up" if enable else "down"
self.send_command(
f"ip link set dev {port.logical_name} {state}", privileged=True
@@ -189,6 +208,7 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_ip_address`."""
command = "del" if delete else "add"
self.send_command(
f"ip address {command} {address} dev {port.logical_name}",
@@ -197,5 +217,6 @@ def configure_port_ip_address(
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_ipv4_forwarding`."""
state = 1 if enable else 0
self.send_command(f"sysctl -w net.ipv4.ip_forward={state}", privileged=True)
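The port and forwarding translations in this diff all reduce to small command strings; a sketch mirroring `configure_port_state` and `configure_ipv4_forwarding` above:

```python
def ip_link_set_command(logical_name: str, enable: bool) -> str:
    # Mirrors configure_port_state: bring the interface up or down.
    state = "up" if enable else "down"
    return f"ip link set dev {logical_name} {state}"


def ipv4_forwarding_command(enable: bool) -> str:
    # Mirrors configure_ipv4_forwarding: toggle the sysctl knob.
    state = 1 if enable else 0
    return f"sysctl -w net.ipv4.ip_forward={state}"


up_command = ip_link_set_command("enp8s0", True)
forward_command = ipv4_forwarding_command(True)
```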
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 1d1d5b1b26..a4824aa274 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -2,6 +2,15 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""POSIX compliant OS translator.
+
+Translates OS-unaware calls into POSIX compliant calls/utilities. POSIX is a set of standards
+for portability between Unix operating systems which not all Linux distributions
+(or the tools most frequently bundled with said distributions) adhere to. Most of Linux
+distributions are mostly compliant though.
+This intermediate module implements the common parts of mostly POSIX compliant distributions.
+"""
+
import re
from collections.abc import Iterable
from pathlib import PurePath, PurePosixPath
@@ -15,13 +24,21 @@
class PosixSession(OSSession):
- """
- An intermediary class implementing the Posix compliant parts of
- Linux and other OS remote sessions.
- """
+ """An intermediary class implementing the POSIX standard."""
@staticmethod
def combine_short_options(**opts: bool) -> str:
+ """Combine shell options into one argument.
+
+ These are options such as ``-x``, ``-v``, ``-f`` which are combined into ``-xvf``.
+
+ Args:
+ opts: The keys are option names (usually one letter) and the bool values indicate
+ whether to include the option in the resulting argument.
+
+ Returns:
+ The options combined into one argument.
+ """
ret_opts = ""
for opt, include in opts.items():
if include:
@@ -33,17 +50,19 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
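A self-contained version of `combine_short_options`, reconstructed from the visible body for illustration (the real method is a `PosixSession` staticmethod):

```python
def combine_short_options(**opts: bool) -> str:
    # Collect the enabled one-letter options into a single argument.
    ret_opts = ""
    for opt, include in opts.items():
        if include:
            ret_opts = f"{ret_opts}{opt}"
    if ret_opts:
        ret_opts = f" -{ret_opts}"
    return ret_opts


# remove_remote_dir uses this to build e.g. `rm -rf <path>`:
rm_opts = combine_short_options(r=True, f=True)
```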
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.guess_dpdk_remote_dir`."""
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
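The glob construction above relies on POSIX path joining, which is also all that `join_remote_path` does; the remote directory used here is an invented example:

```python
from pathlib import PurePosixPath

# Join remote path parts the way join_remote_path does, then build the
# `ls -d ... | tail -1` guess command used above.
remote_guess = PurePosixPath("/remote/workdir", "dpdk-*")
list_command = f"ls -d {remote_guess} | tail -1"
```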
def get_remote_tmp_dir(self) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_tmp_dir`."""
return PurePosixPath("/tmp")
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for i686 arch build. Get information
- from the node if needed.
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_build_env_vars`.
+
+ Supported architecture: ``i686``.
"""
env_vars = {}
if arch == Architecture.i686:
@@ -63,6 +82,7 @@ def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
return env_vars
def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
return PurePosixPath(*args)
def copy_from(
@@ -70,6 +90,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_from`."""
self.remote_session.copy_from(source_file, destination_file)
def copy_to(
@@ -77,6 +98,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_to`."""
self.remote_session.copy_to(source_file, destination_file)
def remove_remote_dir(
@@ -85,6 +107,7 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
opts = PosixSession.combine_short_options(r=recursive, f=force)
self.send_command(f"rm{opts} {remote_dir_path}")
@@ -93,6 +116,7 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
self.send_command(
f"tar xfm {remote_tarball_path} "
f"-C {PurePosixPath(remote_tarball_path).parent}",
@@ -110,6 +134,7 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.build_dpdk`."""
try:
if rebuild:
# reconfigure, then build
@@ -140,12 +165,14 @@ def build_dpdk(
raise DPDKBuildError(f"DPDK build failed when doing '{e.command}'.")
def get_dpdk_version(self, build_dir: str | PurePath) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_version`."""
out = self.send_command(
f"cat {self.join_remote_path(build_dir, 'VERSION')}", verify=True
)
return out.stdout
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.kill_cleanup_dpdk_apps`."""
self._logger.info("Cleaning up DPDK apps.")
dpdk_runtime_dirs = self._get_dpdk_runtime_dirs(dpdk_prefix_list)
if dpdk_runtime_dirs:
@@ -159,6 +186,14 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
def _get_dpdk_runtime_dirs(
self, dpdk_prefix_list: Iterable[str]
) -> list[PurePosixPath]:
+ """Find runtime directories DPDK apps are currently using.
+
+ Args:
+ dpdk_prefix_list: The prefixes DPDK apps were started with.
+
+ Returns:
+ The paths of DPDK apps' runtime dirs.
+ """
prefix = PurePosixPath("/var", "run", "dpdk")
if not dpdk_prefix_list:
remote_prefixes = self._list_remote_dirs(prefix)
@@ -170,9 +205,13 @@ def _get_dpdk_runtime_dirs(
return [PurePosixPath(prefix, dpdk_prefix) for dpdk_prefix in dpdk_prefix_list]
def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
- """
- Return a list of directories of the remote_dir.
- If remote_path doesn't exist, return None.
+ """List the directories in `remote_path`.
+
+ Args:
+ remote_path: List the contents of this path.
+
+ Returns:
+ The directories in `remote_path`. If `remote_path` doesn't exist, return :data:`None`.
"""
out = self.send_command(
f"ls -l {remote_path} | awk '/^d/ {{print $NF}}'"
@@ -183,6 +222,17 @@ def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
return out.splitlines()
def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[int]:
+ """Find PIDs of running DPDK apps.
+
+ Look at each "config" file found in `dpdk_runtime_dirs` and find the PIDs of processes
+ that opened those files.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+
+ Returns:
+ The PIDs of running DPDK apps.
+ """
pids = []
pid_regex = r"p(\d+)"
for dpdk_runtime_dir in dpdk_runtime_dirs:
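The PID extraction can be sketched like this; the sample output imitates the `p<PID>` records that `lsof`-style field output emits, with the PIDs invented for illustration (the exact command isn't visible in this hunk):

```python
import re

# Sample process records in the `p<PID>` field format matched by pid_regex.
lsof_output = "p1234\np5678"

pid_regex = r"p(\d+)"
pids = [int(match.group(1)) for match in re.finditer(pid_regex, lsof_output)]
```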
@@ -203,6 +253,14 @@ def _remote_files_exists(self, remote_path: PurePath) -> bool:
def _check_dpdk_hugepages(
self, dpdk_runtime_dirs: Iterable[str | PurePath]
) -> None:
+ """Check there aren't any leftover hugepages.
+
+ If any hugepages are found, emit a warning. The hugepages are investigated in the
+ "hugepage_info" file of dpdk_runtime_dirs.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+ """
for dpdk_runtime_dir in dpdk_runtime_dirs:
hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
if self._remote_files_exists(hugepage_info):
@@ -220,9 +278,11 @@ def _remove_dpdk_runtime_dirs(
self.remove_remote_dir(dpdk_runtime_dir)
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return ""
def get_compiler_version(self, compiler_name: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_compiler_version`."""
match compiler_name:
case "gcc":
return self.send_command(
@@ -240,6 +300,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
raise ValueError(f"Unknown compiler {compiler_name}")
def get_node_info(self) -> NodeInfo:
+ """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
os_release_info = self.send_command(
"awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
SETTINGS.timeout,
--
2.34.1
* [PATCH v6 17/23] dts: node docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (14 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 16/23] dts: posix and linux sessions " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 18/23] dts: sut and tg nodes " Juraj Linkeš
` (5 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/node.py | 191 +++++++++++++++++++---------
1 file changed, 131 insertions(+), 60 deletions(-)
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 7571e7b98d..abf86793a7 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+A node is any host/server DTS connects to.
+
+The base class, :class:`Node`, provides functionality common to all nodes and is supposed
+to be extended by subclasses with functionality specific to each node type.
+The decorator :func:`Node.skip_setup` can be used without subclassing.
"""
from abc import ABC
@@ -35,10 +40,22 @@
class Node(ABC):
- """
- Basic class for node management. This class implements methods that
- manage a node, such as information gathering (of CPU/PCI/NIC) and
- environment setup.
+ """The base class for node management.
+
+ It shouldn't be instantiated, but rather subclassed.
+ It implements common methods to manage any node:
+
+ * Connection to the node,
+ * Hugepages setup.
+
+ Attributes:
+ main_session: The primary OS-aware remote session used to communicate with the node.
+ config: The node configuration.
+ name: The name of the node.
+ lcores: The list of logical cores that DTS can use on the node.
+ It's derived from logical cores present on the node and the test run configuration.
+ ports: The ports of this node specified in the test run configuration.
+ virtual_devices: The virtual devices used on the node.
"""
main_session: OSSession
@@ -52,6 +69,17 @@ class Node(ABC):
virtual_devices: list[VirtualDevice]
def __init__(self, node_config: NodeConfiguration):
+ """Connect to the node and gather info during initialization.
+
+ Extra gathered information:
+
+ * The list of available logical CPUs. This is then filtered by
+ the ``lcores`` configuration in the YAML test run configuration file,
+ * Information about ports from the YAML test run configuration file.
+
+ Args:
+ node_config: The node's test run configuration.
+ """
self.config = node_config
self.name = node_config.name
self._logger = getLogger(self.name)
@@ -60,7 +88,7 @@ def __init__(self, node_config: NodeConfiguration):
self._logger.info(f"Connected to node: {self.name}")
self._get_remote_cpus()
- # filter the node lcores according to user config
+ # filter the node lcores according to the test run configuration
self.lcores = LogicalCoreListFilter(
self.lcores, LogicalCoreList(self.config.lcores)
).filter()
@@ -77,9 +105,14 @@ def _init_ports(self) -> None:
self.configure_port_state(port)
def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- Perform the execution setup that will be done for each execution
- this node is part of.
+ """Execution setup steps.
+
+ Configure hugepages and call :meth:`_set_up_execution` where
+ the rest of the configuration steps (if any) are implemented.
+
+ Args:
+ execution_config: The execution test run configuration according to which
+ the setup steps will be taken.
"""
self._setup_hugepages()
self._set_up_execution(execution_config)
@@ -88,58 +121,74 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
self.virtual_devices.append(VirtualDevice(vdev))
def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution setup steps.
"""
def tear_down_execution(self) -> None:
- """
- Perform the execution teardown that will be done after each execution
- this node is part of concludes.
+ """Execution teardown steps.
+
+ There are currently no execution teardown steps common to all DTS node types.
"""
self.virtual_devices = []
self._tear_down_execution()
def _tear_down_execution(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution teardown steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution teardown steps.
"""
def set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Perform the build target setup that will be done for each build target
- tested on this node.
+ """Build target setup steps.
+
+ There are currently no build target setup steps common to all DTS node types.
+
+ Args:
+ build_target_config: The build target test run configuration according to which
+ the setup steps will be taken.
"""
self._set_up_build_target(build_target_config)
def _set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target setup steps for subclasses.
+
+ Subclasses should override this if they need additional build target setup steps.
"""
def tear_down_build_target(self) -> None:
- """
- Perform the build target teardown that will be done after each build target
- tested on this node.
+ """Build target teardown steps.
+
+ There are currently no build target teardown steps common to all DTS node types.
"""
self._tear_down_build_target()
def _tear_down_build_target(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target teardown steps for subclasses.
+
+ Subclasses should override this if they need additional build target teardown steps.
"""
def create_session(self, name: str) -> OSSession:
- """
- Create and return a new OSSession tailored to the remote OS.
+ """Create and return a new OS-aware remote session.
+
+ The returned session won't be used by the node creating it. The session must be used by
+ the caller. The session will be maintained for the entire lifecycle of the node object,
+ at the end of which the session will be cleaned up automatically.
+
+ Note:
+ Any number of these supplementary sessions may be created.
+
+ Args:
+ name: The name of the session.
+
+ Returns:
+ A new OS-aware remote session.
"""
session_name = f"{self.name} {name}"
connection = create_session(
@@ -157,19 +206,19 @@ def create_interactive_shell(
privileged: bool = False,
app_args: str = "",
) -> InteractiveShellType:
- """Create a handler for an interactive session.
+ """Factory for interactive session handlers.
- Instantiate shell_cls according to the remote OS specifics.
+ Instantiate `shell_cls` according to the remote OS specifics.
Args:
shell_cls: The class of the shell.
- timeout: Timeout for reading output from the SSH channel. If you are
- reading from the buffer and don't receive any data within the timeout
- it will throw an error.
+ timeout: Timeout for reading output from the SSH channel. If you are reading from
+ the buffer and don't receive any data within the timeout it will throw an error.
privileged: Whether to run the shell with administrative privileges.
app_args: The arguments to be passed to the application.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not shell_cls.dpdk_app:
shell_cls.path = self.main_session.join_remote_path(shell_cls.path)
@@ -186,14 +235,22 @@ def filter_lcores(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
) -> list[LogicalCore]:
- """
- Filter the LogicalCores found on the Node according to
- a LogicalCoreCount or a LogicalCoreList.
+ """Filter the node's logical cores that DTS can use.
+
+ Logical cores that DTS can use are the ones that are present on the node, but filtered
+ according to the test run configuration. The `filter_specifier` will filter cores from
+ those logical cores.
+
+ Args:
+ filter_specifier: Two different filters can be used, one that specifies the number
+ of logical cores per core, cores per socket and the number of sockets,
+ and another one that specifies a logical core list.
+ ascending: If :data:`True`, use cores with the lowest numerical id first and continue
+ in ascending order. If :data:`False`, start with the highest id and continue
+ in descending order. This ordering affects which sockets to consider first as well.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+ Returns:
+ The filtered logical cores.
"""
self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
return lcore_filter(
@@ -203,17 +260,14 @@ def filter_lcores(
).filter()
def _get_remote_cpus(self) -> None:
- """
- Scan CPUs in the remote OS and store a list of LogicalCores.
- """
+ """Scan CPUs in the remote OS and store a list of LogicalCores."""
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
def _setup_hugepages(self) -> None:
- """
- Setup hugepages on the Node. Different architectures can supply different
- amounts of memory for hugepages and numa-based hugepage allocation may need
- to be considered.
+ """Setup hugepages on the node.
+
+ Configure the hugepages only if they're specified in the node's test run configuration.
"""
if self.config.hugepages:
self.main_session.setup_hugepages(
@@ -221,8 +275,11 @@ def _setup_hugepages(self) -> None:
)
def configure_port_state(self, port: Port, enable: bool = True) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port`.
+
+ Args:
+ port: The port to enable/disable.
+ enable: :data:`True` to enable, :data:`False` to disable.
"""
self.main_session.configure_port_state(port, enable)
@@ -232,15 +289,17 @@ def configure_port_ip_address(
port: Port,
delete: bool = False,
) -> None:
- """
- Configure the IP address of a port on this node.
+ """Add an IP address to `port` on this node.
+
+ Args:
+ address: The IP address with mask in CIDR format. Can be either IPv4 or IPv6.
+ port: The port to which to add the address.
+ delete: If :data:`True`, will delete the address from the port instead of adding it.
"""
self.main_session.configure_port_ip_address(address, port, delete)
def close(self) -> None:
- """
- Close all connections and free other resources.
- """
+ """Close all connections and free other resources."""
if self.main_session:
self.main_session.close()
for session in self._other_sessions:
@@ -249,6 +308,11 @@ def close(self) -> None:
@staticmethod
def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+ """Skip the decorated function.
+
+ The :option:`--skip-setup` command line argument and the :envvar:`DTS_SKIP_SETUP`
+ environment variable enable the decorator.
+ """
if SETTINGS.skip_setup:
return lambda *args: None
else:
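The hunk above checks the setting at decoration time and swaps the decorated function for a no-op. A self-contained sketch of that mechanism (the `Settings` stand-in replaces the real DTS settings object, and the `return func` branch is implied by the diff context):

```python
class Settings:
    """Stand-in for the real DTS settings object (an assumption for this sketch)."""

    skip_setup = True  # normally driven by --skip-setup / DTS_SKIP_SETUP


SETTINGS = Settings()


def skip_setup(func):
    """Replace the decorated function with a no-op when setup is skipped.

    The check runs once, at decoration time, mirroring the static method above.
    """
    if SETTINGS.skip_setup:
        return lambda *args: None
    return func


@skip_setup
def copy_tarball(node_name):
    return f"copied to {node_name}"
```

Because the branch is taken when the module is imported, flipping the setting afterwards only affects functions decorated from that point on.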
@@ -258,6 +322,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
def create_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> OSSession:
+ """Factory for OS-aware sessions.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
match node_config.os:
case OS.linux:
return LinuxSession(node_config, name, logger)
--
2.34.1
* [PATCH v6 18/23] dts: sut and tg nodes docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (15 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 17/23] dts: node " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 19/23] dts: base traffic generators " Juraj Linkeš
` (4 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
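The Google format referred to here structures a docstring as a one-line summary, an optional longer description, and named sections such as Args and Returns. A small illustrative example (written for this note, not taken from the patch):

```python
def filter_cores(cores: list[int], ascending: bool = True) -> list[int]:
    """Return `cores` sorted for selection.

    The one-line summary above is followed by a blank line and, optionally,
    a longer description, then named sections.

    Args:
        cores: The logical core IDs to sort.
        ascending: If True, lowest IDs first; otherwise highest first.

    Returns:
        The sorted list of core IDs.
    """
    return sorted(cores, reverse=not ascending)
```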
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/sut_node.py | 219 ++++++++++++++++--------
dts/framework/testbed_model/tg_node.py | 42 +++--
2 files changed, 170 insertions(+), 91 deletions(-)
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 4e33cf02ea..b57d48fd31 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -3,6 +3,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""System under test (DPDK + hardware) node.
+
+A system under test (SUT) is the combination of DPDK
+and the hardware we're testing with DPDK (NICs, crypto and other devices).
+An SUT node is where this SUT runs.
+"""
+
+
import os
import tarfile
import time
@@ -26,6 +34,11 @@
class EalParameters(object):
+ """The environment abstraction layer parameters.
+
+ The string representation can be created by converting the instance to a string.
+ """
+
def __init__(
self,
lcore_list: LogicalCoreList,
@@ -35,21 +48,23 @@ def __init__(
vdevs: list[VirtualDevice],
other_eal_param: str,
):
- """
- Generate eal parameters character string;
- :param lcore_list: the list of logical cores to use.
- :param memory_channels: the number of memory channels to use.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
+ """Initialize the parameters according to inputs.
+
+ Process the parameters into the format used on the command line.
+
+ Args:
+ lcore_list: The list of logical cores to use.
+ memory_channels: The number of memory channels to use.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``
"""
self._lcore_list = f"-l {lcore_list}"
self._memory_channels = f"-n {memory_channels}"
@@ -61,6 +76,7 @@ def __init__(
self._other_eal_param = other_eal_param
def __str__(self) -> str:
+ """Create the EAL string."""
return (
f"{self._lcore_list} "
f"{self._memory_channels} "
@@ -72,11 +88,21 @@ def __str__(self) -> str:
class SutNode(Node):
- """
- A class for managing connections to the System under Test, providing
- methods that retrieve the necessary information about the node (such as
- CPU, memory and NIC details) and configuration capabilities.
- Another key capability is building DPDK according to given build target.
+ """The system under test node.
+
+ The SUT node extends :class:`Node` with DPDK specific features:
+
+ * DPDK build,
+ * Gathering of DPDK build info,
+ * The running of DPDK apps, interactively or one-time execution,
+ * DPDK apps cleanup.
+
+ The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
+ environment variable configure the path to the DPDK tarball
+ or the git commit ID, tag ID or tree ID to test.
+
+ Attributes:
+ config: The SUT node configuration.
"""
config: SutNodeConfiguration
@@ -93,6 +119,11 @@ class SutNode(Node):
_compiler_version: str | None
def __init__(self, node_config: SutNodeConfiguration):
+ """Extend the constructor with SUT node specifics.
+
+ Args:
+ node_config: The SUT node's test run configuration.
+ """
super(SutNode, self).__init__(node_config)
self._dpdk_prefix_list = []
self._build_target_config = None
@@ -111,6 +142,12 @@ def __init__(self, node_config: SutNodeConfiguration):
@property
def _remote_dpdk_dir(self) -> PurePath:
+ """The remote DPDK dir.
+
+ This internal property should be set after extracting the DPDK tarball. If it's not set,
+ that implies the DPDK setup step has been skipped, in which case we can guess where
+ a previous build was located.
+ """
if self.__remote_dpdk_dir is None:
self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
return self.__remote_dpdk_dir
@@ -121,6 +158,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
@property
def remote_dpdk_build_dir(self) -> PurePath:
+ """The remote DPDK build directory.
+
+ This is the directory where DPDK was built.
+ We assume it was built in a subdirectory of the extracted tarball.
+ """
if self._build_target_config:
return self.main_session.join_remote_path(
self._remote_dpdk_dir, self._build_target_config.name
@@ -130,6 +172,7 @@ def remote_dpdk_build_dir(self) -> PurePath:
@property
def dpdk_version(self) -> str:
+ """Last built DPDK version."""
if self._dpdk_version is None:
self._dpdk_version = self.main_session.get_dpdk_version(
self._remote_dpdk_dir
@@ -138,12 +181,14 @@ def dpdk_version(self) -> str:
@property
def node_info(self) -> NodeInfo:
+ """Additional node information."""
if self._node_info is None:
self._node_info = self.main_session.get_node_info()
return self._node_info
@property
def compiler_version(self) -> str:
+ """The node's compiler version."""
if self._compiler_version is None:
if self._build_target_config is not None:
self._compiler_version = self.main_session.get_compiler_version(
@@ -158,6 +203,11 @@ def compiler_version(self) -> str:
return self._compiler_version
def get_build_target_info(self) -> BuildTargetInfo:
+ """Get additional build target information.
+
+ Returns:
+ The build target information.
+ """
return BuildTargetInfo(
dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
)
@@ -168,8 +218,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
def _set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Setup DPDK on the SUT node.
+ """Setup DPDK on the SUT node.
+
+ Additional build target setup steps on top of those in :class:`Node`.
"""
# we want to ensure that dpdk_version and compiler_version is reset for new
# build targets
@@ -182,9 +233,7 @@ def _set_up_build_target(
def _configure_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Populate common environment variables and set build target config.
- """
+ """Populate common environment variables and set build target config."""
self._env_vars = {}
self._build_target_config = build_target_config
self._env_vars.update(
@@ -199,9 +248,7 @@ def _configure_build_target(
@Node.skip_setup
def _copy_dpdk_tarball(self) -> None:
- """
- Copy to and extract DPDK tarball on the SUT node.
- """
+ """Copy to and extract DPDK tarball on the SUT node."""
self._logger.info("Copying DPDK tarball to SUT.")
self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
@@ -232,8 +279,9 @@ def _copy_dpdk_tarball(self) -> None:
@Node.skip_setup
def _build_dpdk(self) -> None:
- """
- Build DPDK. Uses the already configured target. Assumes that the tarball has
+ """Build DPDK.
+
+ Uses the already configured target. Assumes that the tarball has
already been copied to and extracted on the SUT node.
"""
self.main_session.build_dpdk(
@@ -244,15 +292,19 @@ def _build_dpdk(self) -> None:
)
def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePath:
- """
- Build one or all DPDK apps. Requires DPDK to be already built on the SUT node.
- When app_name is 'all', build all example apps.
- When app_name is any other string, tries to build that example app.
- Return the directory path of the built app. If building all apps, return
- the path to the examples directory (where all apps reside).
- The meson_dpdk_args are keyword arguments
- found in meson_option.txt in root DPDK directory. Do not use -D with them,
- for example: enable_kmods=True.
+ """Build one or all DPDK apps.
+
+ Requires DPDK to be already built on the SUT node.
+
+ Args:
+ app_name: The name of the DPDK app to build.
+ When `app_name` is ``all``, build all example apps.
+ meson_dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Returns:
+ The directory path of the built app. If building all apps, return
+ the path to the examples directory (where all apps reside).
"""
self.main_session.build_dpdk(
self._env_vars,
@@ -273,9 +325,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
)
def kill_cleanup_dpdk_apps(self) -> None:
- """
- Kill all dpdk applications on the SUT. Cleanup hugepages.
- """
+ """Kill all dpdk applications on the SUT, then clean up hugepages."""
if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
# we can use the session if it exists and responds
self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
@@ -294,33 +344,34 @@ def create_eal_parameters(
vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
- """
- Generate eal parameters character string;
- :param lcore_filter_specifier: a number of lcores/cores/sockets to use
- or a list of lcore ids to use.
- The default will select one lcore for each of two cores
- on one socket, in ascending order of core ids.
- :param ascending_cores: True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the
- highest id and continue in descending order. This ordering
- affects which sockets to consider first as well.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param append_prefix_timestamp: if True, will append a timestamp to
- DPDK file prefix.
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
- :return: eal param string, eg:
- '-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420';
- """
+ """Compose the EAL parameters.
+
+ Process the list of cores and the DPDK prefix and pass that along with
+ the rest of the arguments.
+ Args:
+ lcore_filter_specifier: A number of lcores/cores/sockets to use
+ or a list of lcore ids to use.
+ The default will select one lcore for each of two cores
+ on one socket, in ascending order of core ids.
+ ascending_cores: Sort cores in ascending order (lowest to highest IDs).
+ If :data:`False`, sort in descending order.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ append_prefix_timestamp: If :data:`True`, will append a timestamp to DPDK file prefix.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``.
+
+ Returns:
+ An EAL param string, such as
+ ``-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420``.
+ """
lcore_list = LogicalCoreList(
self.filter_lcores(lcore_filter_specifier, ascending_cores)
)
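The EAL param string described in the Returns section above is ultimately a join of pre-formatted command-line fragments. A simplified sketch of that composition (a reduced `EalParameters` without the prefix, vdev, or timestamp handling):

```python
class EalParameters:
    """Compose an EAL command-line string from pre-formatted fragments."""

    def __init__(
        self,
        lcore_list: str,
        memory_channels: int,
        no_pci: bool,
        other_eal_param: str = "",
    ) -> None:
        self._lcore_list = f"-l {lcore_list}"
        self._memory_channels = f"-n {memory_channels}"
        self._no_pci = "--no-pci" if no_pci else ""
        self._other_eal_param = other_eal_param

    def __str__(self) -> str:
        # Skip empty fragments so the string has no doubled spaces.
        return " ".join(
            part
            for part in (
                self._lcore_list,
                self._memory_channels,
                self._no_pci,
                self._other_eal_param,
            )
            if part
        )
```

Converting the instance to a string, as the patch's `EalParameters` docstring notes, yields the ready-to-use argument string.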
@@ -346,14 +397,29 @@ def create_eal_parameters(
def run_dpdk_app(
self, app_path: PurePath, eal_args: "EalParameters", timeout: float = 30
) -> CommandResult:
- """
- Run DPDK application on the remote node.
+ """Run DPDK application on the remote node.
+
+ The application is not run interactively - the command that starts the application
+ is executed and then the call waits for it to finish execution.
+
+ Args:
+ app_path: The remote path to the DPDK application.
+ eal_args: EAL parameters to run the DPDK application with.
+ timeout: Wait at most this long in seconds to execute the command.
+
+ Returns:
+ The result of the DPDK app execution.
"""
return self.main_session.send_command(
f"{app_path} {eal_args}", timeout, privileged=True, verify=True
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Enable/disable IPv4 forwarding on the node.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
+ """
self.main_session.configure_ipv4_forwarding(enable)
def create_interactive_shell(
@@ -363,9 +429,13 @@ def create_interactive_shell(
privileged: bool = False,
eal_parameters: EalParameters | str | None = None,
) -> InteractiveShellType:
- """Factory method for creating a handler for an interactive session.
+ """Extend the factory for interactive session handlers.
+
+ The extensions are SUT node specific:
- Instantiate shell_cls according to the remote OS specifics.
+ * The default for `eal_parameters`,
+ * The interactive shell path `shell_cls.path` is prepended with path to the remote
+ DPDK build directory for DPDK apps.
Args:
shell_cls: The class of the shell.
@@ -375,9 +445,10 @@ def create_interactive_shell(
privileged: Whether to run the shell with administrative privileges.
eal_parameters: List of EAL parameters to use to launch the app. If this
isn't provided or an empty string is passed, it will default to calling
- create_eal_parameters().
+ :meth:`create_eal_parameters`.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not eal_parameters:
eal_parameters = self.create_eal_parameters()
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 166eb8430e..69eb33ccb1 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -5,13 +5,8 @@
"""Traffic generator node.
-This is the node where the traffic generator resides.
-The distinction between a node and a traffic generator is as follows:
-A node is a host that DTS connects to. It could be a baremetal server,
-a VM or a container.
-A traffic generator is software running on the node.
-A traffic generator node is a node running a traffic generator.
-A node can be a traffic generator node as well as system under test node.
+A traffic generator (TG) generates traffic that's sent towards the SUT node.
+A TG node is where the TG runs.
"""
from scapy.packet import Packet # type: ignore[import]
@@ -24,13 +19,16 @@
class TGNode(Node):
- """Manage connections to a node with a traffic generator.
+ """The traffic generator node.
- Apart from basic node management capabilities, the Traffic Generator node has
- specialized methods for handling the traffic generator running on it.
+ The TG node extends :class:`Node` with TG specific features:
- Arguments:
- node_config: The user configuration of the traffic generator node.
+ * Traffic generator initialization,
+ * The sending of traffic and receiving packets,
+ * The sending of traffic without receiving packets.
+
+ Not all traffic generators are capable of capturing traffic, which is why there
+ must be a way to send traffic without capturing it.
Attributes:
traffic_generator: The traffic generator running on the node.
@@ -39,6 +37,13 @@ class TGNode(Node):
traffic_generator: CapturingTrafficGenerator
def __init__(self, node_config: TGNodeConfiguration):
+ """Extend the constructor with TG node specifics.
+
+ Initialize the traffic generator on the TG node.
+
+ Args:
+ node_config: The TG node's test run configuration.
+ """
super(TGNode, self).__init__(node_config)
self.traffic_generator = create_traffic_generator(
self, node_config.traffic_generator
@@ -52,17 +57,17 @@ def send_packet_and_capture(
receive_port: Port,
duration: float = 1,
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet`, return received traffic.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given duration. Also record the captured traffic
in a pcap file.
Args:
packet: The packet to send.
send_port: The egress port on the TG node.
receive_port: The ingress port in the TG node.
- duration: Capture traffic for this amount of time after sending the packet.
+ duration: Capture traffic for this amount of time after sending `packet`.
Returns:
A list of received packets. May be empty if no packets are captured.
@@ -72,6 +77,9 @@ def send_packet_and_capture(
)
def close(self) -> None:
- """Free all resources used by the node"""
+ """Free all resources used by the node.
+
+ This extends the superclass method with TG cleanup.
+ """
self.traffic_generator.close()
super(TGNode, self).close()
--
2.34.1
* [PATCH v6 19/23] dts: base traffic generators docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (16 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 18/23] dts: sut and tg nodes " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 20/23] dts: scapy tg " Juraj Linkeš
` (3 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../traffic_generator/__init__.py | 22 ++++++++-
.../capturing_traffic_generator.py | 46 +++++++++++--------
.../traffic_generator/traffic_generator.py | 33 +++++++------
3 files changed, 68 insertions(+), 33 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index 11bfa1ee0f..51cca77da4 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -1,6 +1,19 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""DTS traffic generators.
+
+A traffic generator is capable of generating traffic and then monitoring the returning traffic.
+A traffic generator may just count the number of received packets
+and it may additionally capture individual packets.
+
+A traffic generator may be software running on generic hardware or it could be specialized hardware.
+
+The traffic generators that only count the number of received packets are suitable only for
+performance testing. In functional testing, we need to be able to dissect each arrived packet
+and a capturing traffic generator is required.
+"""
+
from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
from framework.exception import ConfigurationError
from framework.testbed_model.node import Node
@@ -12,8 +25,15 @@
def create_traffic_generator(
tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
+ """The factory function for creating traffic generator objects from the test run configuration.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ traffic_generator_config: The traffic generator config.
+ Returns:
+ A traffic generator capable of capturing received packets.
+ """
match traffic_generator_config.traffic_generator_type:
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGenerator(tg_node, traffic_generator_config)
diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index e521211ef0..b0a43ad003 100644
--- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -23,19 +23,22 @@
def _get_default_capture_name() -> str:
- """
- This is the function used for the default implementation of capture names.
- """
return str(uuid.uuid4())
class CapturingTrafficGenerator(TrafficGenerator):
"""Capture packets after sending traffic.
- A mixin interface which enables a packet generator to declare that it can capture
+ The intermediary interface which enables a packet generator to declare that it can capture
packets and return them to the user.
+ Similarly to
+ :class:`~framework.testbed_model.traffic_generator.traffic_generator.TrafficGenerator`,
+ this class exposes the public methods specific to capturing traffic generators and defines
+ a private method that subclasses must override with the traffic generation and capturing logic.
+
The methods of capturing traffic generators obey the following workflow:
+
1. send packets
2. capture packets
3. write the capture to a .pcap file
@@ -44,6 +47,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
@property
def is_capturing(self) -> bool:
+ """This traffic generator can capture traffic."""
return True
def send_packet_and_capture(
@@ -54,11 +58,12 @@ def send_packet_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet` and capture received traffic.
+
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ The captured traffic is recorded in the `capture_name`.pcap file.
Args:
packet: The packet to send.
@@ -68,7 +73,7 @@ def send_packet_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
return self.send_packets_and_capture(
[packet], send_port, receive_port, duration, capture_name
@@ -82,11 +87,14 @@ def send_packets_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send packets, return received traffic.
+ """Send `packets` and capture received traffic.
- Send packets on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ Send `packets` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
+
+ The captured traffic is recorded in the `capture_name`.pcap file. The target directory
+ can be configured with the :option:`--output-dir` command line argument or
+ the :envvar:`DTS_OUTPUT_DIR` environment variable.
Args:
packets: The packets to send.
@@ -96,7 +104,7 @@ def send_packets_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
self._logger.debug(get_packet_summaries(packets))
self._logger.debug(
@@ -124,10 +132,12 @@ def _send_packets_and_capture(
receive_port: Port,
duration: float,
) -> list[Packet]:
- """
- The extended classes must implement this method which
- sends packets on send_port and receives packets on the receive_port
- for the specified duration. It must be able to handle no received packets.
+ """The implementation of :method:`send_packets_and_capture`.
+
+ The subclasses must implement this method which sends `packets` on `send_port`
+ and receives packets on `receive_port` for the specified `duration`.
+
+ It must be able to handle no received packets.
"""
def _write_capture_from_packets(
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index ea7c3963da..ed396c6a2f 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -22,7 +22,8 @@
class TrafficGenerator(ABC):
"""The base traffic generator.
- Defines the few basic methods that each traffic generator must implement.
+ Exposes the common public methods of all traffic generators and defines the private
+ methods that subclasses must implement with the traffic generation logic.
"""
_config: TrafficGeneratorConfig
@@ -30,6 +31,12 @@ class TrafficGenerator(ABC):
_logger: DTSLOG
def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ """Initialize the traffic generator.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ config: The traffic generator's test run configuration.
+ """
self._config = config
self._tg_node = tg_node
self._logger = getLogger(
@@ -37,9 +44,9 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
)
def send_packet(self, packet: Packet, port: Port) -> None:
- """Send a packet and block until it is fully sent.
+ """Send `packet` and block until it is fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packet` on `port`, then wait until `packet` is fully sent.
Args:
packet: The packet to send.
@@ -48,9 +55,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
self.send_packets([packet], port)
def send_packets(self, packets: list[Packet], port: Port) -> None:
- """Send packets and block until they are fully sent.
+ """Send `packets` and block until they are fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packets` on `port`, then wait until `packets` are fully sent.
Args:
packets: The packets to send.
@@ -62,19 +69,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
@abstractmethod
def _send_packets(self, packets: list[Packet], port: Port) -> None:
- """
- The extended classes must implement this method which
- sends packets on send_port. The method should block until all packets
- are fully sent.
+ """The implementation of :meth:`send_packets`.
+
+ The subclasses must implement this method which sends `packets` on `port`.
+ The method should block until all `packets` are fully sent.
+
+ What fully sent means is defined by the traffic generator.
"""
@property
def is_capturing(self) -> bool:
- """Whether this traffic generator can capture traffic.
-
- Returns:
- True if the traffic generator can capture traffic, False otherwise.
- """
+ """This traffic generator can't capture traffic."""
return False
@abstractmethod
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
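The split the patch describes, a public `send_packets` wrapper and an abstract private `_send_packets` that subclasses implement, can be sketched roughly as below. The class and method names mirror the patch; the `FakeTrafficGenerator` test double is purely illustrative and not part of DTS.

```python
from abc import ABC, abstractmethod


class TrafficGenerator(ABC):
    """Minimal sketch of the base-class pattern from the patch."""

    def send_packets(self, packets: list, port: str) -> None:
        # Public entry point: shared logging/validation would live here,
        # then it delegates to the subclass implementation.
        self._send_packets(packets, port)

    @abstractmethod
    def _send_packets(self, packets: list, port: str) -> None:
        """Subclasses implement the actual sending; must handle empty lists."""

    @property
    def is_capturing(self) -> bool:
        # Non-capturing by default; capturing generators override this.
        return False


class FakeTrafficGenerator(TrafficGenerator):
    """Illustrative test double that records packets instead of sending them."""

    def __init__(self) -> None:
        self.sent: list = []

    def _send_packets(self, packets: list, port: str) -> None:
        self.sent.extend(packets)


tg = FakeTrafficGenerator()
tg.send_packets(["pkt1", "pkt2"], "port0")
print(len(tg.sent))  # 2
print(tg.is_capturing)  # False
```

The pattern keeps the public API (with its docstrings) in the base class, which is exactly what Sphinx renders, while the generation logic lives in the private overrides.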
* [PATCH v6 20/23] dts: scapy tg docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (17 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 19/23] dts: base traffic generators " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 21/23] dts: test suites " Juraj Linkeš
` (2 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
1 file changed, 54 insertions(+), 37 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
index 51864b6e6b..d0fe03055a 100644
--- a/dts/framework/testbed_model/traffic_generator/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -2,14 +2,15 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""Scapy traffic generator.
+"""The Scapy traffic generator.
-Traffic generator used for functional testing, implemented using the Scapy library.
+A traffic generator used for functional testing, implemented with
+`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
-The XML-RPC server runs in an interactive remote SSH session running Python console,
-where we start the server. The communication with the server is facilitated with
-a local server proxy.
+The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
+in an interactive remote Python SSH session. The communication with the server is facilitated
+with a local server proxy from the :mod:`xmlrpc.client` module.
"""
import inspect
@@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
recv_iface: str,
duration: float,
) -> list[bytes]:
- """RPC function to send and capture packets.
+ """The RPC function to send and capture packets.
- The function is meant to be executed on the remote TG node.
+ The function is meant to be executed on the remote TG node via the server proxy.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
recv_iface: The logical name of the ingress interface.
duration: Capture for this amount of time, in seconds.
Returns:
A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
+ to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
sniffer = scapy.all.AsyncSniffer(
@@ -98,19 +99,15 @@ def scapy_send_packets_and_capture(
def scapy_send_packets(
xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str
) -> None:
- """RPC function to send packets.
+ """The RPC function to send packets.
- The function is meant to be executed on the remote TG node.
- It doesn't return anything, only sends packets.
+ The function is meant to be executed on the remote TG node via the server proxy.
+ It only sends `xmlrpc_packets`, without capturing them.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
-
- Returns:
- A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
@@ -130,11 +127,19 @@ def scapy_send_packets(
class QuittableXMLRPCServer(SimpleXMLRPCServer):
- """Basic XML-RPC server that may be extended
- by functions serializable by the marshal module.
+ r"""Basic XML-RPC server.
+
+ The server may be augmented by functions serializable by the :mod:`marshal` module.
"""
def __init__(self, *args, **kwargs):
+ """Extend the XML-RPC server initialization.
+
+ Args:
+ args: The positional arguments that will be passed to the superclass's constructor.
+ kwargs: The keyword arguments that will be passed to the superclass's constructor.
+ The `allow_none` argument will be set to ``True``.
+ """
kwargs["allow_none"] = True
super().__init__(*args, **kwargs)
self.register_introspection_functions()
@@ -142,13 +147,12 @@ def __init__(self, *args, **kwargs):
self.register_function(self.add_rpc_function)
def quit(self) -> None:
+ """Quit the server."""
self._BaseServer__shutdown_request = True
return None
def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
- """Add a function to the server.
-
- This is meant to be executed remotely.
+ """Add a function to the server from the local server proxy.
Args:
name: The name of the function.
@@ -159,6 +163,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
self.register_function(function)
def serve_forever(self, poll_interval: float = 0.5) -> None:
+ """Extend the superclass method with an additional print.
+
+ Once executed in the local server proxy, the print gives us a clear string to expect
+ when starting the server. The print means the function was executed on the XML-RPC server.
+ """
print("XMLRPC OK")
super().serve_forever(poll_interval)
@@ -166,19 +175,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
class ScapyTrafficGenerator(CapturingTrafficGenerator):
"""Provides access to scapy functions via an RPC interface.
- The traffic generator first starts an XML-RPC on the remote TG node.
- Then it populates the server with functions which use the Scapy library
- to send/receive traffic.
-
- Any packets sent to the remote server are first converted to bytes.
- They are received as xmlrpc.client.Binary objects on the server side.
- When the server sends the packets back, they are also received as
- xmlrpc.client.Binary object on the client side, are converted back to Scapy
- packets and only then returned from the methods.
+ The class extends the base class with remote execution of Scapy functions.
- Arguments:
- tg_node: The node where the traffic generator resides.
- config: The user configuration of the traffic generator.
+ Any packets sent to the remote server are first converted to bytes. They are received as
+ :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
+ back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
+ converted back to :class:`scapy.packet.Packet` objects and only then returned from the methods.
Attributes:
session: The exclusive interactive remote session created by the Scapy
@@ -192,6 +194,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
_config: ScapyTrafficGeneratorConfig
def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ """Extend the constructor with Scapy TG specifics.
+
+ The traffic generator first starts an XML-RPC server on the remote `tg_node`.
+ Then it populates the server with functions which use the Scapy library
+ to send/receive traffic:
+
+ * :func:`scapy_send_packets_and_capture`
+ * :func:`scapy_send_packets`
+
+ To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
+ command line argument or the :envvar:`DTS_VERBOSE` environment variable.
+
+ Args:
+ tg_node: The node where the traffic generator resides.
+ config: The traffic generator's test run configuration.
+ """
super().__init__(tg_node, config)
assert (
@@ -237,10 +255,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
[line for line in src.splitlines() if not line.isspace() and line != ""]
)
- spacing = "\n" * 4
-
# execute it in the python terminal
- self.session.send_command(spacing + src + spacing)
+ self.session.send_command(src + "\n")
self.session.send_command(
f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));"
f"server.serve_forever()",
@@ -274,6 +290,7 @@ def _send_packets_and_capture(
return scapy_packets
def close(self) -> None:
+ """Close the traffic generator."""
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
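The `add_rpc_function` mechanism above relies on the :mod:`marshal` module being able to round-trip a function's code object as bytes. A minimal standalone sketch of that round trip (the `add` function is illustrative; the patch additionally wraps the bytes in :class:`~xmlrpc.client.Binary` for transport):

```python
import marshal
import types


def add(a: int, b: int) -> int:
    return a + b


# Client side: serialize the function's code object into bytes,
# the same payload the patch sends over XML-RPC.
payload = marshal.dumps(add.__code__)

# Server side: rebuild a callable from the received bytes and register it.
code = marshal.loads(payload)
rebuilt = types.FunctionType(code, globals(), "add")
print(rebuilt(2, 3))  # 5
```

Note that marshal only serializes the code object, not closures or globals, which is why the RPC functions in the patch are self-contained module-level functions.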
* [PATCH v6 21/23] dts: test suites docstring update
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (18 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 20/23] dts: scapy tg " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 22/23] dts: add doc generation dependencies Juraj Linkeš
2023-11-08 12:53 ` [PATCH v6 23/23] dts: add doc generation Juraj Linkeš
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/tests/TestSuite_hello_world.py | 16 +++++----
dts/tests/TestSuite_os_udp.py | 16 +++++----
dts/tests/TestSuite_smoke_tests.py | 53 +++++++++++++++++++++++++++---
3 files changed, 68 insertions(+), 17 deletions(-)
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 7e3d95c0cf..662a8f8726 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
-"""
+"""The DPDK hello world app test suite.
+
Run the helloworld example app and verify it prints a message for each used core.
No other EAL parameters apart from cores are used.
"""
@@ -15,22 +16,25 @@
class TestHelloWorld(TestSuite):
+ """DPDK hello world app test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Build the app we're about to test - helloworld.
"""
self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
def test_hello_world_single_core(self) -> None:
- """
+ """Single core test case.
+
Steps:
Run the helloworld app on the first usable logical core.
Verify:
The app prints a message from the used core:
"hello from core <core_id>"
"""
-
# get the first usable core
lcore_amount = LogicalCoreCount(1, 1, 1)
lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
@@ -44,14 +48,14 @@ def test_hello_world_single_core(self) -> None:
)
def test_hello_world_all_cores(self) -> None:
- """
+ """All cores test case.
+
Steps:
Run the helloworld app on all usable logical cores.
Verify:
The app prints a message from all used cores:
"hello from core <core_id>"
"""
-
# get the maximum logical core number
eal_para = self.sut_node.create_eal_parameters(
lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
index 9b5f39711d..f99c4d76e3 100644
--- a/dts/tests/TestSuite_os_udp.py
+++ b/dts/tests/TestSuite_os_udp.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
+"""Basic IPv4 OS routing test suite.
+
Configure SUT node to route traffic from if1 to if2.
Send a packet to the SUT node, verify it comes back on the second port on the TG node.
"""
@@ -13,22 +14,24 @@
class TestOSUdp(TestSuite):
+ """IPv4 UDP OS routing test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Configure SUT ports and SUT to route traffic from if1 to if2.
"""
-
self.configure_testbed_ipv4()
def test_os_udp(self) -> None:
- """
+ """Basic UDP IPv4 traffic test case.
+
Steps:
Send a UDP packet.
Verify:
The packet with proper addresses arrives at the other TG port.
"""
-
packet = Ether() / IP() / UDP()
received_packets = self.send_packet_and_capture(packet)
@@ -38,7 +41,8 @@ def test_os_udp(self) -> None:
self.verify_packets(expected_packet, received_packets)
def tear_down_suite(self) -> None:
- """
+ """Tear down the test suite.
+
Teardown:
Remove the SUT port configuration configured in setup.
"""
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index 4a269df75b..36ff10a862 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -1,6 +1,17 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Smoke test suite.
+
+Smoke tests are a class of tests which are used for validating a minimal set of important features.
+These are the most important features without which (or when they're faulty) the software wouldn't
+work properly. Thus, if any failure occurs while testing these features,
+there isn't that much of a reason to continue testing, as the software is fundamentally broken.
+
+These tests don't have to include only DPDK tests, as the reason for failures could be
+in the infrastructure (a faulty link between NICs or a misconfiguration).
+"""
+
import re
from framework.config import PortConfig
@@ -11,13 +22,25 @@
class SmokeTests(TestSuite):
+ """DPDK and infrastructure smoke test suite.
+
+ The test cases validate the most basic DPDK functionality needed for all other test suites.
+ The infrastructure also needs to be tested, as that is also used by all other test suites.
+
+ Attributes:
+ is_blocking: This test suite will block the execution of all other test suites
+ in the build target after it.
+ nics_in_node: The NICs present on the SUT node.
+ """
+
is_blocking = True
# dicts in this list are expected to have two keys:
# "pci_address" and "current_driver"
nics_in_node: list[PortConfig] = []
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Set the build directory path and generate a list of NICs in the SUT node.
"""
@@ -25,7 +48,13 @@ def set_up_suite(self) -> None:
self.nics_in_node = self.sut_node.config.ports
def test_unit_tests(self) -> None:
- """
+ """DPDK meson fast-tests unit tests.
+
+ The DPDK unit tests are basic tests that indicate regressions and other critical failures.
+ These need to be addressed before other testing.
+
+ The fast-tests unit tests are a subset with only the most basic tests.
+
Test:
Run the fast-test unit-test suite through meson.
"""
@@ -37,7 +66,14 @@ def test_unit_tests(self) -> None:
)
def test_driver_tests(self) -> None:
- """
+ """DPDK meson driver-tests unit tests.
+
+ The DPDK unit tests are basic tests that indicate regressions and other critical failures.
+ These need to be addressed before other testing.
+
+ The driver-tests unit tests are a subset that test only drivers. These may be run
+ with virtual devices as well.
+
Test:
Run the driver-test unit-test suite through meson.
"""
@@ -63,7 +99,10 @@ def test_driver_tests(self) -> None:
)
def test_devices_listed_in_testpmd(self) -> None:
- """
+ """Testpmd device discovery.
+
+ If the configured devices can't be found in testpmd, they can't be tested.
+
Test:
Uses testpmd driver to verify that devices have been found by testpmd.
"""
@@ -79,7 +118,11 @@ def test_devices_listed_in_testpmd(self) -> None:
)
def test_device_bound_to_driver(self) -> None:
- """
+ """Device driver in OS.
+
+ The devices must be bound to the proper driver, otherwise they can't be used by DPDK
+ or the traffic generators.
+
Test:
Ensure that all drivers listed in the config are bound to the correct
driver.
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
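The Google docstring format these patches converge on looks roughly like the sketch below. The function is a stand-in, not code from the patch; the key point is that the first line becomes the summary Sphinx shows in API listings, with `Args:` and `Returns:` sections rendered by the napoleon extension.

```python
def send_packet_and_capture(packet, duration: float = 1.0) -> list:
    """Send `packet` and capture received traffic.

    Args:
        packet: The packet to send.
        duration: Capture for this amount of time, in seconds.

    Returns:
        The received packets. May be empty if no packets are captured.
    """
    return []


# The one-line summary is what appears in generated API listings.
print(send_packet_and_capture.__doc__.splitlines()[0])
```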
* [PATCH v6 22/23] dts: add doc generation dependencies
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (19 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 21/23] dts: test suites " Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-08 16:00 ` Yoan Picchi
2023-11-08 12:53 ` [PATCH v6 23/23] dts: add doc generation Juraj Linkeš
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Sphinx imports every Python module when generating documentation from
docstrings, meaning all dts dependencies, including Python version,
must be satisfied.
By adding Sphinx to dts dependencies we make sure that the proper
Python version and dependencies are used when Sphinx is executed.
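Sphinx autodoc's behavior boils down to importing each documented module, so any module-level side effect runs during the documentation build. A hedged sketch of that import step (the helper name is illustrative, not Sphinx API):

```python
import importlib


def autodoc_style_import(module_name: str):
    """Import a module the way Sphinx autodoc does.

    Any code at module scope runs on import, which is why argument parsing
    and log-file creation must be guarded in the modules being documented.
    """
    try:
        return importlib.import_module(module_name)
    except Exception as exc:
        # autodoc surfaces failures like this as documentation build errors
        print(f"failed to import {module_name}: {exc}")
        return None


print(autodoc_style_import("json") is not None)  # True
```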
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 499 ++++++++++++++++++++++++++++++++++++++++++++-
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 1 deletion(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index a734fa71f0..dea98f6913 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -404,6 +651,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -515,6 +773,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -632,6 +904,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -753,6 +1046,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -775,6 +1084,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "6.2.1"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
+ {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.2"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+ {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
+ {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -819,6 +1299,23 @@ files = [
{file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -837,4 +1334,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
+content-hash = "5faad2e53833e9b8a353ad3554c58de991801a9ebe8f9712fc9c839b35e7a789"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 3943c87c87..98df431b3b 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<7"
+sphinx-rtd-theme = "^1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v6 22/23] dts: add doc generation dependencies
2023-11-08 12:53 ` [PATCH v6 22/23] dts: add doc generation dependencies Juraj Linkeš
@ 2023-11-08 16:00 ` Yoan Picchi
2023-11-15 10:00 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-08 16:00 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek
Cc: dev
On 11/8/23 12:53, Juraj Linkeš wrote:
> Sphinx imports every Python module when generating documentation from
> docstrings, meaning all dts dependencies, including Python version,
> must be satisfied.
> By adding Sphinx to dts dependencies we make sure that the proper
> Python version and dependencies are used when Sphinx is executed.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/poetry.lock | 499 ++++++++++++++++++++++++++++++++++++++++++++-
> dts/pyproject.toml | 7 +
> 2 files changed, 505 insertions(+), 1 deletion(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index a734fa71f0..dea98f6913 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,5 +1,16 @@
> # This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
>
> +[[package]]
> +name = "alabaster"
> +version = "0.7.13"
> +description = "A configurable sidebar-enabled Sphinx theme"
> +optional = false
> +python-versions = ">=3.6"
> +files = [
> + {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
> + {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
> +]
> +
> [[package]]
> name = "attrs"
> version = "23.1.0"
> @@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
> tests = ["attrs[tests-no-zope]", "zope-interface"]
> tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
>
> +[[package]]
> +name = "babel"
> +version = "2.13.1"
> +description = "Internationalization utilities"
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> + {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
> + {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
> +]
> +
> +[package.dependencies]
> +setuptools = {version = "*", markers = "python_version >= \"3.12\""}
> +
> +[package.extras]
> +dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
> +
> [[package]]
> name = "bcrypt"
> version = "4.0.1"
> @@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
> jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
> uvloop = ["uvloop (>=0.15.2)"]
>
> +[[package]]
> +name = "certifi"
> +version = "2023.7.22"
> +description = "Python package for providing Mozilla's CA Bundle."
> +optional = false
> +python-versions = ">=3.6"
> +files = [
> + {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
> + {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
> +]
> +
> [[package]]
> name = "cffi"
> version = "1.15.1"
> @@ -162,6 +201,105 @@ files = [
> [package.dependencies]
> pycparser = "*"
>
> +[[package]]
> +name = "charset-normalizer"
> +version = "3.3.2"
> +description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
> +optional = false
> +python-versions = ">=3.7.0"
> +files = [
> + {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
> + {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
> + {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
> + {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
> + {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
> + {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
> + {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
> + {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
> +]
> +
> [[package]]
> name = "click"
> version = "8.1.6"
> @@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
> test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
> test-randomorder = ["pytest-randomly"]
>
> +[[package]]
> +name = "docutils"
> +version = "0.18.1"
> +description = "Docutils -- Python Documentation Utilities"
> +optional = false
> +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
> +files = [
> + {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
> + {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
> +]
> +
> [[package]]
> name = "fabric"
> version = "2.7.1"
> @@ -252,6 +401,28 @@ pathlib2 = "*"
> pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
> testing = ["mock (>=2.0.0,<3.0)"]
>
> +[[package]]
> +name = "idna"
> +version = "3.4"
> +description = "Internationalized Domain Names in Applications (IDNA)"
> +optional = false
> +python-versions = ">=3.5"
> +files = [
> + {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
> + {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
> +]
> +
> +[[package]]
> +name = "imagesize"
> +version = "1.4.1"
> +description = "Getting image size from png/jpeg/jpeg2000/gif file"
> +optional = false
> +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
> +files = [
> + {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
> + {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
> +]
> +
> [[package]]
> name = "invoke"
> version = "1.7.3"
> @@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
> plugins = ["setuptools"]
> requirements-deprecated-finder = ["pip-api", "pipreqs"]
>
> +[[package]]
> +name = "jinja2"
> +version = "3.1.2"
> +description = "A very fast and expressive template engine."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> + {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
> + {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
> +]
> +
> +[package.dependencies]
> +MarkupSafe = ">=2.0"
> +
> +[package.extras]
> +i18n = ["Babel (>=2.7)"]
> +
> [[package]]
> name = "jsonpatch"
> version = "1.33"
> @@ -340,6 +528,65 @@ files = [
> [package.dependencies]
> referencing = ">=0.28.0"
>
> +[[package]]
> +name = "markupsafe"
> +version = "2.1.3"
> +description = "Safely add untrusted strings to HTML/XML markup."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> + {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
> + {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
> + {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
> + {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
> + {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
> + {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
> + {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
> +]
> +
> [[package]]
> name = "mccabe"
> version = "0.7.0"
> @@ -404,6 +651,17 @@ files = [
> {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
> ]
>
> +[[package]]
> +name = "packaging"
> +version = "23.2"
> +description = "Core utilities for Python packages"
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> + {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
> + {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
> +]
> +
> [[package]]
> name = "paramiko"
> version = "3.2.0"
> @@ -515,6 +773,20 @@ files = [
> {file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
> ]
>
> +[[package]]
> +name = "pygments"
> +version = "2.16.1"
> +description = "Pygments is a syntax highlighting package written in Python."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> + {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
> + {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
> +]
> +
> +[package.extras]
> +plugins = ["importlib-metadata"]
> +
> [[package]]
> name = "pylama"
> version = "8.4.1"
> @@ -632,6 +904,27 @@ files = [
> attrs = ">=22.2.0"
> rpds-py = ">=0.7.0"
>
> +[[package]]
> +name = "requests"
> +version = "2.31.0"
> +description = "Python HTTP for Humans."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> + {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
> + {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
> +]
> +
> +[package.dependencies]
> +certifi = ">=2017.4.17"
> +charset-normalizer = ">=2,<4"
> +idna = ">=2.5,<4"
> +urllib3 = ">=1.21.1,<3"
> +
> +[package.extras]
> +socks = ["PySocks (>=1.5.6,!=1.5.7)"]
> +use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
> +
> [[package]]
> name = "rpds-py"
> version = "0.9.2"
> @@ -753,6 +1046,22 @@ basic = ["ipython"]
> complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
> docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
>
> +[[package]]
> +name = "setuptools"
> +version = "68.2.2"
> +description = "Easily download, build, install, upgrade, and uninstall Python packages"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> + {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
> + {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
> +]
> +
> +[package.extras]
> +docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
> +testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
> +testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
> +
> [[package]]
> name = "six"
> version = "1.16.0"
> @@ -775,6 +1084,177 @@ files = [
> {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
> ]
>
> +[[package]]
> +name = "sphinx"
> +version = "6.2.1"
> +description = "Python documentation generator"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> + {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
> + {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
> +]
> +
> +[package.dependencies]
> +alabaster = ">=0.7,<0.8"
> +babel = ">=2.9"
> +colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
> +docutils = ">=0.18.1,<0.20"
> +imagesize = ">=1.3"
> +Jinja2 = ">=3.0"
> +packaging = ">=21.0"
> +Pygments = ">=2.13"
> +requests = ">=2.25.0"
> +snowballstemmer = ">=2.0"
> +sphinxcontrib-applehelp = "*"
> +sphinxcontrib-devhelp = "*"
> +sphinxcontrib-htmlhelp = ">=2.0.0"
> +sphinxcontrib-jsmath = "*"
> +sphinxcontrib-qthelp = "*"
> +sphinxcontrib-serializinghtml = ">=1.1.5"
> +
> +[package.extras]
> +docs = ["sphinxcontrib-websupport"]
> +lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
> +test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
> +
> +[[package]]
> +name = "sphinx-rtd-theme"
> +version = "1.2.2"
> +description = "Read the Docs theme for Sphinx"
> +optional = false
> +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
> +files = [
> + {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
> + {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
> +]
> +
> +[package.dependencies]
> +docutils = "<0.19"
> +sphinx = ">=1.6,<7"
> +sphinxcontrib-jquery = ">=4,<5"
> +
> +[package.extras]
> +dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
> +
> +[[package]]
> +name = "sphinxcontrib-applehelp"
> +version = "1.0.7"
> +description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
> +optional = false
> +python-versions = ">=3.9"
> +files = [
> + {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
> + {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
> +]
> +
> +[package.dependencies]
> +Sphinx = ">=5"
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-devhelp"
> +version = "1.0.5"
> +description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
> +optional = false
> +python-versions = ">=3.9"
> +files = [
> + {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
> + {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
> +]
> +
> +[package.dependencies]
> +Sphinx = ">=5"
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-htmlhelp"
> +version = "2.0.4"
> +description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
> +optional = false
> +python-versions = ">=3.9"
> +files = [
> + {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
> + {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
> +]
> +
> +[package.dependencies]
> +Sphinx = ">=5"
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["html5lib", "pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-jquery"
> +version = "4.1"
> +description = "Extension to include jQuery on newer Sphinx releases"
> +optional = false
> +python-versions = ">=2.7"
> +files = [
> + {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
> + {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
> +]
> +
> +[package.dependencies]
> +Sphinx = ">=1.8"
> +
> +[[package]]
> +name = "sphinxcontrib-jsmath"
> +version = "1.0.1"
> +description = "A sphinx extension which renders display math in HTML via JavaScript"
> +optional = false
> +python-versions = ">=3.5"
> +files = [
> + {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
> + {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
> +]
> +
> +[package.extras]
> +test = ["flake8", "mypy", "pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-qthelp"
> +version = "1.0.6"
> +description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
> +optional = false
> +python-versions = ">=3.9"
> +files = [
> + {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
> + {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
> +]
> +
> +[package.dependencies]
> +Sphinx = ">=5"
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-serializinghtml"
> +version = "1.1.9"
> +description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
> +optional = false
> +python-versions = ">=3.9"
> +files = [
> + {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
> + {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
> +]
> +
> +[package.dependencies]
> +Sphinx = ">=5"
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
> [[package]]
> name = "toml"
> version = "0.10.2"
> @@ -819,6 +1299,23 @@ files = [
> {file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
> ]
>
> +[[package]]
> +name = "urllib3"
> +version = "2.0.7"
> +description = "HTTP library with thread-safe connection pooling, file post, and more."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> + {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
> + {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
> +]
> +
> +[package.extras]
> +brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
> +secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
> +socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
> +zstd = ["zstandard (>=0.18.0)"]
> +
> [[package]]
> name = "warlock"
> version = "2.0.1"
> @@ -837,4 +1334,4 @@ jsonschema = ">=4,<5"
> [metadata]
> lock-version = "2.0"
> python-versions = "^3.10"
> -content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
> +content-hash = "5faad2e53833e9b8a353ad3554c58de991801a9ebe8f9712fc9c839b35e7a789"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 3943c87c87..98df431b3b 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -35,6 +35,13 @@ pylama = "^8.4.1"
> pyflakes = "^2.5.0"
> toml = "^0.10.2"
>
> +[tool.poetry.group.docs]
> +optional = true
> +
> +[tool.poetry.group.docs.dependencies]
> +sphinx = "<7"
> +sphinx-rtd-theme = "^1.2.2"
> +
> [build-system]
> requires = ["poetry-core>=1.0.0"]
> build-backend = "poetry.core.masonry.api"
I do get some warnings while I build the docs:
$ poetry install --with docs
[...]
Installing dependencies from lock file
Warning: poetry.lock is not consistent with pyproject.toml. You may be
getting improper dependencies. Run `poetry lock [--no-update]` to fix it.
The doc seems to build fine though
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v6 22/23] dts: add doc generation dependencies
2023-11-08 16:00 ` Yoan Picchi
@ 2023-11-15 10:00 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 10:00 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, dev
> I do get some warnings while I build the docs:
>
> $ poetry install --with docs
>
> [...]
>
> Installing dependencies from lock file
> Warning: poetry.lock is not consistent with pyproject.toml. You may be
> getting improper dependencies. Run `poetry lock [--no-update]` to fix it.
Looks like my version had an improper content-hash. I'll fix it in the
next version.
>
> The doc seems to build fine though
* [PATCH v6 23/23] dts: add doc generation
2023-11-08 12:53 ` [PATCH v6 01/23] dts: code adjustments for " Juraj Linkeš
` (20 preceding siblings ...)
2023-11-08 12:53 ` [PATCH v6 22/23] dts: add doc generation dependencies Juraj Linkeš
@ 2023-11-08 12:53 ` Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
21 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-08 12:53 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
The tool used to generate developer docs is Sphinx, which is already
used in DPDK. The same configuration is used to preserve style, but it's
been augmented with doc-generating configuration and a change to how the
sidebar displays the content hierarchy.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* Also the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 29 ++++++++++------
doc/api/meson.build | 1 +
doc/guides/conf.py | 34 ++++++++++++++++---
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 32 +++++++++++++++++-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/index.rst | 17 ++++++++++
dts/doc/meson.build | 60 +++++++++++++++++++++++++++++++++
dts/meson.build | 16 +++++++++
meson.build | 1 +
10 files changed, 176 insertions(+), 16 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
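The call-sphinx-build.py change below replaces positional unpacking of sys.argv with argparse's parse_known_args, so the wrapper consumes its own options and forwards everything else to sphinx-build. A minimal sketch of that splitting behavior (the argument values here are made up, not taken from a real build):

```python
import argparse

# Known options are consumed into args; unrecognized flags are returned
# separately, mirroring how the wrapper forwards extra args to sphinx-build.
parser = argparse.ArgumentParser()
parser.add_argument('sphinx')
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
parser.add_argument('--dts-root', default='.')

args, extra_args = parser.parse_known_args(
    ['sphinx-build', '23.11', 'doc/src', 'doc/dst', '-W', '-E']
)
# args.sphinx, args.version, args.src, args.dst hold the positionals,
# args.dts_root falls back to its default, and extra_args keeps the
# flags this wrapper doesn't know about.
```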
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..c2f3acfb1d 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,46 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run, PIPE, STDOUT
from packaging.version import Version
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default='.')
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+os.environ['DTS_ROOT'] = args.dts_root
# for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
if Version(ver) >= Version('1.7'):
sphinx_cmd += ['-j', 'auto']
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..92fe10d9e7 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..169b1d24bc 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,31 @@
file=stderr)
pass
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = False
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+# Sidebar config
+html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+}
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +59,8 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+path.append(environ.get('DTS_ROOT'))
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index cd771a428c..77d9434c1c 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -283,7 +283,10 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
warnings when some of the basics are not met.
-The code must be properly documented with docstrings. The style must conform to
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the `API docs build steps <building_api_docs>`_.
+
+Speaking of which, the code must be properly documented with docstrings. The style must conform to
the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style
`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
@@ -408,3 +411,30 @@ There are three tools used in DTS to help with code checking, style and formatti
These three tools are all used in ``devtools/dts-check-format.sh``,
the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..f5dcd553f2
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,17 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's documentation!
+===========================================
+
+.. toctree::
+ :titlesonly:
+ :caption: Contents:
+
+ framework
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..e11ab83843
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,60 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_api_framework_dir = join_paths(dts_dir, 'framework')
+dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+if meson.version().version_compare('>=0.57.0')
+ dts_api_src = custom_target('dts_api_src',
+ output: 'modules.rst',
+ env: {'SPHINX_APIDOC_OPTIONS': 'members,show-inheritance'},
+ command: [sphinx_apidoc, '--append-syspath', '--force',
+ '--module-first', '--separate', '-V', meson.project_version(),
+ '--output-dir', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
+ dts_api_framework_dir],
+ build_by_default: false)
+else
+ dts_api_src = custom_target('dts_api_src',
+ output: 'modules.rst',
+ command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
+ sphinx_apidoc, '--append-syspath', '--force',
+ '--module-first', '--separate', '-V', meson.project_version(),
+ '--output-dir', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
+ dts_api_framework_dir],
+ build_by_default: false)
+endif
+doc_targets += dts_api_src
+doc_target_names += 'DTS_API_sphinx_sources'
+
+cp = find_program('cp')
+cp_index = custom_target('cp_index',
+ input: ['index.rst', 'conf_yaml_schema.json'],
+ output: 'index.rst',
+ depends: dts_api_src,
+ command: [cp, '--dereference', '@INPUT@', dts_api_build_dir],
+ build_by_default: false)
+doc_targets += cp_index
+doc_target_names += 'DTS_API_sphinx_index'
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ depends: cp_index,
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ dts_api_build_dir, dts_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: false,
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 2e6e546d20..c391bf8c71 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v7 00/21] dts: docstrings update
2023-11-08 12:53 ` [PATCH v6 23/23] dts: add doc generation Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 01/21] dts: code adjustments for doc generation Juraj Linkeš
` (21 more replies)
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
1 sibling, 22 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
The first commit makes changes to the code. These code changes mainly
change the structure of the code so that the actual API docs generation
works. There are also some code changes which get reflected in the
documentation, such as making functions/methods/attributes private or
public.
The rest of the commits deal with the actual docstring documentation
(from which the API docs are generated). The format of the docstrings
is the Google format [0] with PEP257 [1] and some guidelines captured
in the last commit of this group covering what the Google format
doesn't.
The docstring updates are split into many commits to make review
possible. When accepted, they may be squashed.
The docstrings have been composed in anticipation of [2], adhering to
maximum line length of 100. We don't have a tool for automatic docstring
formatting, hence the usage of 100 right away to save time.
NOTE: The logger.py module is not fully documented, as it's being
refactored and the refactor will be submitted in the near future.
Documenting it now seems unnecessary.
[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
[1] https://peps.python.org/pep-0257/
[2] https://patches.dpdk.org/project/dpdk/list/?series=29844
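The docstring conventions described above — Google format with PEP257 — can be sketched with a short made-up function (this is an illustration of the style, not actual DTS code):

```python
def scale(values: list[float], factor: float) -> list[float]:
    """Scale each value by a constant factor.

    This docstring follows the Google format referenced above: a one-line
    summary, a blank line, then labeled sections.

    Args:
        values: The numbers to scale.
        factor: The multiplier applied to each element.

    Returns:
        A new list with each element multiplied by factor.
    """
    return [v * factor for v in values]
```

Sphinx's napoleon extension parses the Args/Returns sections into the same structured output it produces for reStructuredText field lists.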
v7:
Split the series into docstrings and api docs generation and addressed
comments.
Juraj Linkeš (21):
dts: code adjustments for doc generation
dts: add docstring checker
dts: add basic developer docs
dts: exceptions docstring update
dts: settings docstring update
dts: logger and utils docstring update
dts: dts runner and main docstring update
dts: test suite docstring update
dts: test result docstring update
dts: config docstring update
dts: remote session docstring update
dts: interactive remote session docstring update
dts: port and virtual device docstring update
dts: cpu docstring update
dts: os session docstring update
dts: posix and linux sessions docstring update
dts: node docstring update
dts: sut and tg nodes docstring update
dts: base traffic generators docstring update
dts: scapy tg docstring update
dts: test suites docstring update
doc/guides/tools/dts.rst | 73 +++
dts/framework/__init__.py | 12 +-
dts/framework/config/__init__.py | 379 +++++++++++++---
dts/framework/config/types.py | 132 ++++++
dts/framework/dts.py | 161 +++++--
dts/framework/exception.py | 156 ++++---
dts/framework/logger.py | 72 ++-
dts/framework/remote_session/__init__.py | 80 ++--
.../interactive_remote_session.py | 36 +-
.../remote_session/interactive_shell.py | 152 +++++++
dts/framework/remote_session/os_session.py | 284 ------------
dts/framework/remote_session/python_shell.py | 32 ++
.../remote_session/remote/__init__.py | 27 --
.../remote/interactive_shell.py | 133 ------
.../remote_session/remote/python_shell.py | 12 -
.../remote_session/remote/remote_session.py | 172 -------
.../remote_session/remote/testpmd_shell.py | 49 --
.../remote_session/remote_session.py | 232 ++++++++++
.../{remote => }/ssh_session.py | 28 +-
dts/framework/remote_session/testpmd_shell.py | 86 ++++
dts/framework/settings.py | 190 ++++++--
dts/framework/test_result.py | 296 +++++++++---
dts/framework/test_suite.py | 230 +++++++---
dts/framework/testbed_model/__init__.py | 28 +-
dts/framework/testbed_model/{hw => }/cpu.py | 209 ++++++---
dts/framework/testbed_model/hw/__init__.py | 27 --
dts/framework/testbed_model/hw/port.py | 60 ---
.../testbed_model/hw/virtual_device.py | 16 -
.../linux_session.py | 69 ++-
dts/framework/testbed_model/node.py | 216 ++++++---
dts/framework/testbed_model/os_session.py | 425 ++++++++++++++++++
dts/framework/testbed_model/port.py | 93 ++++
.../posix_session.py | 85 +++-
dts/framework/testbed_model/sut_node.py | 232 ++++++----
dts/framework/testbed_model/tg_node.py | 70 ++-
.../testbed_model/traffic_generator.py | 72 ---
.../traffic_generator/__init__.py | 44 ++
.../capturing_traffic_generator.py | 52 ++-
.../{ => traffic_generator}/scapy.py | 114 ++---
.../traffic_generator/traffic_generator.py | 87 ++++
dts/framework/testbed_model/virtual_device.py | 29 ++
dts/framework/utils.py | 128 +++---
dts/main.py | 17 +-
dts/poetry.lock | 12 +-
dts/pyproject.toml | 6 +-
dts/tests/TestSuite_hello_world.py | 16 +-
dts/tests/TestSuite_os_udp.py | 19 +-
dts/tests/TestSuite_smoke_tests.py | 53 ++-
48 files changed, 3511 insertions(+), 1692 deletions(-)
create mode 100644 dts/framework/config/types.py
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (76%)
create mode 100644 dts/framework/remote_session/interactive_shell.py
delete mode 100644 dts/framework/remote_session/os_session.py
create mode 100644 dts/framework/remote_session/python_shell.py
delete mode 100644 dts/framework/remote_session/remote/__init__.py
delete mode 100644 dts/framework/remote_session/remote/interactive_shell.py
delete mode 100644 dts/framework/remote_session/remote/python_shell.py
delete mode 100644 dts/framework/remote_session/remote/remote_session.py
delete mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
create mode 100644 dts/framework/remote_session/remote_session.py
rename dts/framework/remote_session/{remote => }/ssh_session.py (83%)
create mode 100644 dts/framework/remote_session/testpmd_shell.py
rename dts/framework/testbed_model/{hw => }/cpu.py (50%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
delete mode 100644 dts/framework/testbed_model/hw/port.py
delete mode 100644 dts/framework/testbed_model/hw/virtual_device.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (79%)
create mode 100644 dts/framework/testbed_model/os_session.py
create mode 100644 dts/framework/testbed_model/port.py
rename dts/framework/{remote_session => testbed_model}/posix_session.py (74%)
delete mode 100644 dts/framework/testbed_model/traffic_generator.py
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (66%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (71%)
create mode 100644 dts/framework/testbed_model/traffic_generator/traffic_generator.py
create mode 100644 dts/framework/testbed_model/virtual_device.py
--
2.34.1
* [PATCH v7 01/21] dts: code adjustments for doc generation
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-16 21:04 ` Jeremy Spewock
2023-11-20 16:02 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 02/21] dts: add docstring checker Juraj Linkeš
` (20 subsequent siblings)
21 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
The standard Python tool for generating API documentation, Sphinx,
imports modules one-by-one when generating the documentation. This
requires code changes:
* properly guarding argument parsing in the if __name__ == '__main__'
block,
* the logger used by DTS runner underwent the same treatment so that it
doesn't create log files outside of a DTS run,
* however, DTS uses the arguments to construct an object holding global
variables. The defaults for the global variables needed to be moved
from argument parsing elsewhere,
* importing the remote_session module from framework resulted in
circular imports because of one module trying to import another
module. This is fixed by reorganizing the code,
* some code reorganization was done because the resulting structure
makes more sense, improving documentation clarity.
There are some other changes which are documentation related:
* added missing type annotations so they appear in the generated docs,
* reordered arguments in some methods,
* removed superfluous arguments and attributes,
* changed functions/methods/attributes from public to private and vice versa.
All of the above appear in the generated documentation and, with these
changes, the documentation is improved.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 10 ++-
dts/framework/dts.py | 33 +++++--
dts/framework/exception.py | 54 +++++-------
dts/framework/remote_session/__init__.py | 41 ++++-----
.../interactive_remote_session.py | 0
.../{remote => }/interactive_shell.py | 0
.../{remote => }/python_shell.py | 0
.../remote_session/remote/__init__.py | 27 ------
.../{remote => }/remote_session.py | 0
.../{remote => }/ssh_session.py | 12 +--
.../{remote => }/testpmd_shell.py | 0
dts/framework/settings.py | 87 +++++++++++--------
dts/framework/test_result.py | 4 +-
dts/framework/test_suite.py | 7 +-
dts/framework/testbed_model/__init__.py | 12 +--
dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
dts/framework/testbed_model/hw/__init__.py | 27 ------
.../linux_session.py | 6 +-
dts/framework/testbed_model/node.py | 25 ++++--
.../os_session.py | 22 ++---
dts/framework/testbed_model/{hw => }/port.py | 0
.../posix_session.py | 4 +-
dts/framework/testbed_model/sut_node.py | 8 +-
dts/framework/testbed_model/tg_node.py | 30 +------
.../traffic_generator/__init__.py | 24 +++++
.../capturing_traffic_generator.py | 6 +-
.../{ => traffic_generator}/scapy.py | 23 ++---
.../traffic_generator.py | 16 +++-
.../testbed_model/{hw => }/virtual_device.py | 0
dts/framework/utils.py | 46 +++-------
dts/main.py | 9 +-
31 files changed, 258 insertions(+), 288 deletions(-)
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
delete mode 100644 dts/framework/remote_session/remote/__init__.py
rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
rename dts/framework/testbed_model/{hw => }/port.py (100%)
rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index cb7e00ba34..2044c82611 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -17,6 +17,7 @@
import warlock # type: ignore[import]
import yaml
+from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict):
+ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
# This looks useless now, but is designed to allow expansion to traffic
# generators that require more configuration later.
match TrafficGeneratorType(d["type"]):
@@ -97,6 +98,10 @@ def from_dict(d: dict):
return ScapyTrafficGeneratorConfig(
traffic_generator_type=TrafficGeneratorType.SCAPY
)
+ case _:
+ raise ConfigurationError(
+ f'Unknown traffic generator type "{d["type"]}".'
+ )
@dataclass(slots=True, frozen=True)
@@ -324,6 +329,3 @@ def load_config() -> Configuration:
config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
config_obj: Configuration = Configuration.from_dict(dict(config))
return config_obj
-
-
-CONFIGURATION = load_config()
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index f773f0c38d..4c7fb0c40a 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -6,19 +6,19 @@
import sys
from .config import (
- CONFIGURATION,
BuildTargetConfiguration,
ExecutionConfiguration,
TestSuiteConfig,
+ load_config,
)
from .exception import BlockingTestSuiteError
from .logger import DTSLOG, getLogger
from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
from .test_suite import get_test_suites
from .testbed_model import SutNode, TGNode
-from .utils import check_dts_python_version
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG = None # type: ignore[assignment]
result: DTSResult = DTSResult(dts_logger)
@@ -30,14 +30,18 @@ def run_all() -> None:
global dts_logger
global result
+ # create a regular DTS logger and create a new result with it
+ dts_logger = getLogger("DTSRunner")
+ result = DTSResult(dts_logger)
+
# check the python version of the server that run dts
- check_dts_python_version()
+ _check_dts_python_version()
sut_nodes: dict[str, SutNode] = {}
tg_nodes: dict[str, TGNode] = {}
try:
# for all Execution sections
- for execution in CONFIGURATION.executions:
+ for execution in load_config().executions:
sut_node = sut_nodes.get(execution.system_under_test_node.name)
tg_node = tg_nodes.get(execution.traffic_generator_node.name)
@@ -82,6 +86,25 @@ def run_all() -> None:
_exit_dts()
+def _check_dts_python_version() -> None:
+ def RED(text: str) -> str:
+ return f"\u001B[31;1m{str(text)}\u001B[0m"
+
+ if sys.version_info.major < 3 or (
+ sys.version_info.major == 3 and sys.version_info.minor < 10
+ ):
+ print(
+ RED(
+ (
+ "WARNING: DTS execution node's python version is lower than"
+ "python 3.10, is deprecated and will not work in future releases."
+ )
+ ),
+ file=sys.stderr,
+ )
+ print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+
+
def _run_execution(
sut_node: SutNode,
tg_node: TGNode,
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 001a5a5496..7489c03570 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
Command execution timeout.
"""
- command: str
- output: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _command: str
- def __init__(self, command: str, output: str):
- self.command = command
- self.output = output
+ def __init__(self, command: str):
+ self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self.command}"
-
- def get_output(self) -> str:
- return self.output
+ return f"TIMEOUT on {self._command}"
class SSHConnectionError(DTSError):
@@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
SSH connection error.
"""
- host: str
- errors: list[str]
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
+ _errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
- self.host = host
- self.errors = [] if errors is None else errors
+ self._host = host
+ self._errors = [] if errors is None else errors
def __str__(self) -> str:
- message = f"Error trying to connect with {self.host}."
- if self.errors:
- message += f" Errors encountered while retrying: {', '.join(self.errors)}"
+ message = f"Error trying to connect with {self._host}."
+ if self._errors:
+ message += f" Errors encountered while retrying: {', '.join(self._errors)}"
return message
@@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
It can no longer be used.
"""
- host: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
def __init__(self, host: str):
- self.host = host
+ self._host = host
def __str__(self) -> str:
- return f"SSH session with {self.host} has died"
+ return f"SSH session with {self._host} has died"
class ConfigurationError(DTSError):
@@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
Raised when a command executed on a Node returns a non-zero exit status.
"""
- command: str
- command_return_code: int
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ command: str
+ _command_return_code: int
def __init__(self, command: str, command_return_code: int):
self.command = command
- self.command_return_code = command_return_code
+ self._command_return_code = command_return_code
def __str__(self) -> str:
return (
f"Command {self.command} returned a non-zero exit code: "
- f"{self.command_return_code}"
+ f"{self._command_return_code}"
)
@@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
Used in test cases to verify the expected behavior.
"""
- value: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
- def __init__(self, value: str):
- self.value = value
-
- def __str__(self) -> str:
- return repr(self.value)
-
class BlockingTestSuiteError(DTSError):
- suite_name: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
+ _suite_name: str
def __init__(self, suite_name: str) -> None:
- self.suite_name = suite_name
+ self._suite_name = suite_name
def __str__(self) -> str:
- return f"Blocking suite {self.suite_name} failed."
+ return f"Blocking suite {self._suite_name} failed."
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 00b6d1f03a..5e7ddb2b05 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -12,29 +12,24 @@
# pylama:ignore=W0611
-from framework.config import OS, NodeConfiguration
-from framework.exception import ConfigurationError
+from framework.config import NodeConfiguration
from framework.logger import DTSLOG
-from .linux_session import LinuxSession
-from .os_session import InteractiveShellType, OSSession
-from .remote import (
- CommandResult,
- InteractiveRemoteSession,
- InteractiveShell,
- PythonShell,
- RemoteSession,
- SSHSession,
- TestPmdDevice,
- TestPmdShell,
-)
-
-
-def create_session(
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
+from .python_shell import PythonShell
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
+
+
+def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> OSSession:
- match node_config.os:
- case OS.linux:
- return LinuxSession(node_config, name, logger)
- case _:
- raise ConfigurationError(f"Unsupported OS {node_config.os}")
+) -> RemoteSession:
+ return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+ node_config: NodeConfiguration, logger: DTSLOG
+) -> InteractiveRemoteSession:
+ return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_remote_session.py
rename to dts/framework/remote_session/interactive_remote_session.py
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_shell.py
rename to dts/framework/remote_session/interactive_shell.py
diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/python_shell.py
rename to dts/framework/remote_session/python_shell.py
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 06403691a5..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-# Copyright(c) 2023 University of New Hampshire
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .interactive_remote_session import InteractiveRemoteSession
-from .interactive_shell import InteractiveShell
-from .python_shell import PythonShell
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-from .testpmd_shell import TestPmdDevice, TestPmdShell
-
-
-def create_remote_session(
- node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
- return SSHSession(node_config, name, logger)
-
-
-def create_interactive_session(
- node_config: NodeConfiguration, logger: DTSLOG
-) -> InteractiveRemoteSession:
- return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 91%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
index 8d127f1601..cee11d14d6 100644
--- a/dts/framework/remote_session/remote/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -18,9 +18,7 @@
SSHException,
)
-from framework.config import NodeConfiguration
from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
-from framework.logger import DTSLOG
from .remote_session import CommandResult, RemoteSession
@@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
session: Connection
- def __init__(
- self,
- node_config: NodeConfiguration,
- session_name: str,
- logger: DTSLOG,
- ):
- super(SSHSession, self).__init__(node_config, session_name, logger)
-
def _connect(self) -> None:
errors = []
retry_attempts = 10
@@ -117,7 +107,7 @@ def _send_command(
except CommandTimedOut as e:
self._logger.exception(e)
- raise SSHTimeoutError(command, e.result.stderr) from e
+ raise SSHTimeoutError(command) from e
return CommandResult(
self.name, command, output.stdout, output.stderr, output.return_code
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/testpmd_shell.py
rename to dts/framework/remote_session/testpmd_shell.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index cfa39d011b..7f5841d073 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, TypeVar
@@ -22,8 +22,8 @@ def __init__(
option_strings: Sequence[str],
dest: str,
nargs: str | int | None = None,
- const: str | None = None,
- default: str = None,
+ const: bool | None = None,
+ default: Any = None,
type: Callable[[str], _T | argparse.FileType | None] = None,
choices: Iterable[_T] | None = None,
required: bool = False,
@@ -32,6 +32,12 @@ def __init__(
) -> None:
env_var_value = os.environ.get(env_var)
default = env_var_value or default
+ if const is not None:
+ nargs = 0
+ default = const if env_var_value else default
+ type = None
+ choices = None
+ metavar = None
super(_EnvironmentArgument, self).__init__(
option_strings,
dest,
@@ -52,22 +58,28 @@ def __call__(
values: Any,
option_string: str = None,
) -> None:
- setattr(namespace, self.dest, values)
+ if self.const is not None:
+ setattr(namespace, self.dest, self.const)
+ else:
+ setattr(namespace, self.dest, values)
return _EnvironmentArgument
-@dataclass(slots=True, frozen=True)
-class _Settings:
- config_file_path: str
- output_dir: str
- timeout: float
- verbose: bool
- skip_setup: bool
- dpdk_tarball_path: Path
- compile_timeout: float
- test_cases: list
- re_run: int
+@dataclass(slots=True)
+class Settings:
+ config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ output_dir: str = "output"
+ timeout: float = 15
+ verbose: bool = False
+ skip_setup: bool = False
+ dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ compile_timeout: float = 1200
+ test_cases: list[str] = field(default_factory=list)
+ re_run: int = 0
+
+
+SETTINGS: Settings = Settings()
def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--config-file",
action=_env_arg("DTS_CFG_FILE"),
- default="conf.yaml",
+ default=SETTINGS.config_file_path,
+ type=Path,
help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
"and targets.",
)
@@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--output-dir",
"--output",
action=_env_arg("DTS_OUTPUT_DIR"),
- default="output",
+ default=SETTINGS.output_dir,
help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
)
@@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
"-t",
"--timeout",
action=_env_arg("DTS_TIMEOUT"),
- default=15,
+ default=SETTINGS.timeout,
type=float,
help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
"compiling DPDK.",
@@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
"-v",
"--verbose",
action=_env_arg("DTS_VERBOSE"),
- default="N",
- help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
+ default=SETTINGS.verbose,
+ const=True,
+ help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
"to the console.",
)
@@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
"-s",
"--skip-setup",
action=_env_arg("DTS_SKIP_SETUP"),
- default="N",
- help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
+ const=True,
+ help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
)
parser.add_argument(
@@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--snapshot",
"--git-ref",
action=_env_arg("DTS_DPDK_TARBALL"),
- default="dpdk.tar.xz",
+ default=SETTINGS.dpdk_tarball_path,
type=Path,
help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
"tag ID or tree ID to test. To test local changes, first commit them, "
@@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--compile-timeout",
action=_env_arg("DTS_COMPILE_TIMEOUT"),
- default=1200,
+ default=SETTINGS.compile_timeout,
type=float,
help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
)
@@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--re-run",
"--re_run",
action=_env_arg("DTS_RERUN"),
- default=0,
+ default=SETTINGS.re_run,
type=int,
help="[DTS_RERUN] Re-run each test case the specified amount of times "
"if a test failure occurs",
@@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
return parser
-def _get_settings() -> _Settings:
+def get_settings() -> Settings:
parsed_args = _get_parser().parse_args()
- return _Settings(
+ return Settings(
config_file_path=parsed_args.config_file,
output_dir=parsed_args.output_dir,
timeout=parsed_args.timeout,
- verbose=(parsed_args.verbose == "Y"),
- skip_setup=(parsed_args.skip_setup == "Y"),
+ verbose=parsed_args.verbose,
+ skip_setup=parsed_args.skip_setup,
dpdk_tarball_path=Path(
- DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
- )
- if not os.path.exists(parsed_args.tarball)
- else Path(parsed_args.tarball),
+ Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
+ if not os.path.exists(parsed_args.tarball)
+ else Path(parsed_args.tarball)
+ ),
compile_timeout=parsed_args.compile_timeout,
- test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+ test_cases=(
+ parsed_args.test_cases.split(",") if parsed_args.test_cases else []
+ ),
re_run=parsed_args.re_run,
)
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index f0fbe80f6f..603e18872c 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -254,7 +254,7 @@ def add_build_target(
self._inner_results.append(build_target_result)
return build_target_result
- def add_sut_info(self, sut_info: NodeInfo):
+ def add_sut_info(self, sut_info: NodeInfo) -> None:
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
@@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
self._inner_results.append(execution_result)
return execution_result
- def add_error(self, error) -> None:
+ def add_error(self, error: Exception) -> None:
self._errors.append(error)
def process(self) -> None:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 3b890c0451..d53553bf34 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -11,7 +11,7 @@
import re
from ipaddress import IPv4Interface, IPv6Interface, ip_interface
from types import MethodType
-from typing import Union
+from typing import Any, Union
from scapy.layers.inet import IP # type: ignore[import]
from scapy.layers.l2 import Ether # type: ignore[import]
@@ -26,8 +26,7 @@
from .logger import DTSLOG, getLogger
from .settings import SETTINGS
from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
-from .testbed_model import SutNode, TGNode
-from .testbed_model.hw.port import Port, PortLink
+from .testbed_model import Port, PortLink, SutNode, TGNode
from .utils import get_packet_summaries
@@ -453,7 +452,7 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
- def is_test_suite(object) -> bool:
+ def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
return True
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 5cbb859e47..8ced05653b 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,15 +9,9 @@
# pylama:ignore=W0611
-from .hw import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
- VirtualDevice,
- lcore_filter,
-)
+from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
from .node import Node
+from .port import Port, PortLink
from .sut_node import SutNode
from .tg_node import TGNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/testbed_model/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
)
return filtered_lcores
+
+
+def lcore_filter(
+ core_list: list[LogicalCore],
+ filter_specifier: LogicalCoreCount | LogicalCoreList,
+ ascending: bool,
+) -> LogicalCoreFilter:
+ if isinstance(filter_specifier, LogicalCoreList):
+ return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+ elif isinstance(filter_specifier, LogicalCoreCount):
+ return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+ else:
+ raise ValueError(f"Unsupported filter {filter_specifier!r}")
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
- core_list: list[LogicalCore],
- filter_specifier: LogicalCoreCount | LogicalCoreList,
- ascending: bool,
-) -> LogicalCoreFilter:
- if isinstance(filter_specifier, LogicalCoreList):
- return LogicalCoreListFilter(core_list, filter_specifier, ascending)
- elif isinstance(filter_specifier, LogicalCoreCount):
- return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
- else:
- raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
similarity index 97%
rename from dts/framework/remote_session/linux_session.py
rename to dts/framework/testbed_model/linux_session.py
index a3f1a6bf3b..f472bb8f0f 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -9,10 +9,10 @@
from typing_extensions import NotRequired
from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
from framework.utils import expand_range
+from .cpu import LogicalCore
+from .port import Port
from .posix_session import PosixSession
@@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
lcores.append(LogicalCore(lcore, core, socket, node))
return lcores
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return dpdk_prefix
def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index fc01e0bf8e..fa5b143cdd 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,23 +12,26 @@
from typing import Any, Callable, Type, Union
from framework.config import (
+ OS,
BuildTargetConfiguration,
ExecutionConfiguration,
NodeConfiguration,
)
+from framework.exception import ConfigurationError
from framework.logger import DTSLOG, getLogger
-from framework.remote_session import InteractiveShellType, OSSession, create_session
from framework.settings import SETTINGS
-from .hw import (
+from .cpu import (
LogicalCore,
LogicalCoreCount,
LogicalCoreList,
LogicalCoreListFilter,
- VirtualDevice,
lcore_filter,
)
-from .hw.port import Port
+from .linux_session import LinuxSession
+from .os_session import InteractiveShellType, OSSession
+from .port import Port
+from .virtual_device import VirtualDevice
class Node(ABC):
@@ -172,9 +175,9 @@ def create_interactive_shell(
return self.main_session.create_interactive_shell(
shell_cls,
- app_args,
timeout,
privileged,
+ app_args,
)
def filter_lcores(
@@ -205,7 +208,7 @@ def _get_remote_cpus(self) -> None:
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
- def _setup_hugepages(self):
+ def _setup_hugepages(self) -> None:
"""
Setup hugepages on the Node. Different architectures can supply different
amounts of memory for hugepages and numa-based hugepage allocation may need
@@ -249,3 +252,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
return lambda *args: None
else:
return func
+
+
+def create_session(
+ node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> OSSession:
+ match node_config.os:
+ case OS.linux:
+ return LinuxSession(node_config, name, logger)
+ case _:
+ raise ConfigurationError(f"Unsupported OS {node_config.os}")
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
similarity index 95%
rename from dts/framework/remote_session/os_session.py
rename to dts/framework/testbed_model/os_session.py
index 8a709eac1c..76e595a518 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -10,19 +10,19 @@
from framework.config import Architecture, NodeConfiguration, NodeInfo
from framework.logger import DTSLOG
-from framework.remote_session.remote import InteractiveShell
-from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
-from framework.utils import MesonArgs
-
-from .remote import (
+from framework.remote_session import (
CommandResult,
InteractiveRemoteSession,
+ InteractiveShell,
RemoteSession,
create_interactive_session,
create_remote_session,
)
+from framework.settings import SETTINGS
+from framework.utils import MesonArgs
+
+from .cpu import LogicalCore
+from .port import Port
InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
@@ -85,9 +85,9 @@ def send_command(
def create_interactive_shell(
self,
shell_cls: Type[InteractiveShellType],
- eal_parameters: str,
timeout: float,
privileged: bool,
+ app_args: str,
) -> InteractiveShellType:
"""
See "create_interactive_shell" in SutNode
@@ -96,7 +96,7 @@ def create_interactive_shell(
self.interactive_session.session,
self._logger,
self._get_privileged_command if privileged else None,
- eal_parameters,
+ app_args,
timeout,
)
@@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
"""
@abstractmethod
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
"""
Try to find DPDK remote dir in remote_dir.
"""
@@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
"""
@abstractmethod
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
"""
Get the DPDK file prefix that will be used when running DPDK apps.
"""
diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
similarity index 100%
rename from dts/framework/testbed_model/hw/port.py
rename to dts/framework/testbed_model/port.py
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
similarity index 98%
rename from dts/framework/remote_session/posix_session.py
rename to dts/framework/testbed_model/posix_session.py
index 5da0516e05..1d1d5b1b26 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
@@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
for dpdk_runtime_dir in dpdk_runtime_dirs:
self.remove_remote_dir(dpdk_runtime_dir)
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return ""
def get_compiler_version(self, compiler_name: str) -> str:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 4161d3a4d5..17deea06e2 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,12 +15,14 @@
NodeInfo,
SutNodeConfiguration,
)
-from framework.remote_session import CommandResult, InteractiveShellType, OSSession
+from framework.remote_session import CommandResult
from framework.settings import SETTINGS
from framework.utils import MesonArgs
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
+from .cpu import LogicalCoreCount, LogicalCoreList
from .node import Node
+from .os_session import InteractiveShellType, OSSession
+from .virtual_device import VirtualDevice
class EalParameters(object):
@@ -307,7 +309,7 @@ def create_eal_parameters(
prefix: str = "dpdk",
append_prefix_timestamp: bool = True,
no_pci: bool = False,
- vdevs: list[VirtualDevice] = None,
+ vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
"""
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 27025cfa31..166eb8430e 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -16,16 +16,11 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.config import (
- ScapyTrafficGeneratorConfig,
- TGNodeConfiguration,
- TrafficGeneratorType,
-)
-from framework.exception import ConfigurationError
-
-from .capturing_traffic_generator import CapturingTrafficGenerator
-from .hw.port import Port
+from framework.config import TGNodeConfiguration
+
from .node import Node
+from .port import Port
+from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
class TGNode(Node):
@@ -80,20 +75,3 @@ def close(self) -> None:
"""Free all resources used by the node"""
self.traffic_generator.close()
super(TGNode, self).close()
-
-
-def create_traffic_generator(
- tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
-) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
-
- from .scapy import ScapyTrafficGenerator
-
- match traffic_generator_config.traffic_generator_type:
- case TrafficGeneratorType.SCAPY:
- return ScapyTrafficGenerator(tg_node, traffic_generator_config)
- case _:
- raise ConfigurationError(
- "Unknown traffic generator: "
- f"{traffic_generator_config.traffic_generator_type}"
- )
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
new file mode 100644
index 0000000000..11bfa1ee0f
--- /dev/null
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -0,0 +1,24 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
+from framework.exception import ConfigurationError
+from framework.testbed_model.node import Node
+
+from .capturing_traffic_generator import CapturingTrafficGenerator
+from .scapy import ScapyTrafficGenerator
+
+
+def create_traffic_generator(
+ tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
+) -> CapturingTrafficGenerator:
+ """A factory function for creating traffic generator object from user config."""
+
+ match traffic_generator_config.traffic_generator_type:
+ case TrafficGeneratorType.SCAPY:
+ return ScapyTrafficGenerator(tg_node, traffic_generator_config)
+ case _:
+ raise ConfigurationError(
+ "Unknown traffic generator: "
+ f"{traffic_generator_config.traffic_generator_type}"
+ )
diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
similarity index 96%
rename from dts/framework/testbed_model/capturing_traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index ab98987f8e..e521211ef0 100644
--- a/dts/framework/testbed_model/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -16,9 +16,9 @@
from scapy.packet import Packet # type: ignore[import]
from framework.settings import SETTINGS
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
from .traffic_generator import TrafficGenerator
@@ -130,7 +130,9 @@ def _send_packets_and_capture(
for the specified duration. It must be able to handle no received packets.
"""
- def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
+ def _write_capture_from_packets(
+ self, capture_name: str, packets: list[Packet]
+ ) -> None:
file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
self._logger.debug(f"Writing packets to {file_name}.")
scapy.utils.wrpcap(file_name, packets)
diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
similarity index 95%
rename from dts/framework/testbed_model/scapy.py
rename to dts/framework/testbed_model/traffic_generator/scapy.py
index af0d4dbb25..51864b6e6b 100644
--- a/dts/framework/testbed_model/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -24,16 +24,15 @@
from scapy.packet import Packet # type: ignore[import]
from framework.config import OS, ScapyTrafficGeneratorConfig
-from framework.logger import DTSLOG, getLogger
from framework.remote_session import PythonShell
from framework.settings import SETTINGS
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from .capturing_traffic_generator import (
CapturingTrafficGenerator,
_get_default_capture_name,
)
-from .hw.port import Port
-from .tg_node import TGNode
"""
========= BEGIN RPC FUNCTIONS =========
@@ -146,7 +145,7 @@ def quit(self) -> None:
self._BaseServer__shutdown_request = True
return None
- def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
+ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
"""Add a function to the server.
This is meant to be executed remotely.
@@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
session: PythonShell
rpc_server_proxy: xmlrpc.client.ServerProxy
_config: ScapyTrafficGeneratorConfig
- _tg_node: TGNode
- _logger: DTSLOG
-
- def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
- self._config = config
- self._tg_node = tg_node
- self._logger = getLogger(
- f"{self._tg_node.name} {self._config.traffic_generator_type}"
- )
+
+ def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ super().__init__(tg_node, config)
assert (
self._tg_node.config.os == OS.linux
@@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
function_bytes = marshal.dumps(function.__code__)
self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
- def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
+ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
# load the source of the function
src = inspect.getsource(QuittableXMLRPCServer)
# Lines with only whitespace break the repl if in the middle of a function
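The `add_rpc_function`/`marshal.dumps(function.__code__)` pair above ships a function's code object to the remote Python shell. A minimal single-process sketch of that serialization round trip (the `greet` function and names here are illustrative, not the DTS API):

```python
import marshal
import types

def greet(name):
    # An ordinary function whose code object we want to ship elsewhere.
    return f"hello {name}"

# marshal can serialize code objects (but not function objects themselves).
payload = marshal.dumps(greet.__code__)

# The receiving side rebuilds a callable from the raw code object.
code = marshal.loads(payload)
rebuilt = types.FunctionType(code, globals(), "greet")
print(rebuilt("dts"))  # hello dts
```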
@@ -280,7 +273,7 @@ def _send_packets_and_capture(
scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
return scapy_packets
- def close(self):
+ def close(self) -> None:
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
similarity index 80%
rename from dts/framework/testbed_model/traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 28c35d3ce4..ea7c3963da 100644
--- a/dts/framework/testbed_model/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -12,11 +12,12 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.logger import DTSLOG
+from framework.config import TrafficGeneratorConfig
+from framework.logger import DTSLOG, getLogger
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
-
class TrafficGenerator(ABC):
"""The base traffic generator.
@@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
Defines the few basic methods that each traffic generator must implement.
"""
+ _config: TrafficGeneratorConfig
+ _tg_node: Node
_logger: DTSLOG
+ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ self._config = config
+ self._tg_node = tg_node
+ self._logger = getLogger(
+ f"{self._tg_node.name} {self._config.traffic_generator_type}"
+ )
+
def send_packet(self, packet: Packet, port: Port) -> None:
"""Send a packet and block until it is fully sent.
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index d27c2c5b5f..f0c916471c 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -7,7 +7,6 @@
import json
import os
import subprocess
-import sys
from enum import Enum
from pathlib import Path
from subprocess import SubprocessError
@@ -16,35 +15,7 @@
from .exception import ConfigurationError
-
-class StrEnum(Enum):
- @staticmethod
- def _generate_next_value_(
- name: str, start: int, count: int, last_values: object
- ) -> str:
- return name
-
- def __str__(self) -> str:
- return self.name
-
-
-REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
-
-
-def check_dts_python_version() -> None:
- if sys.version_info.major < 3 or (
- sys.version_info.major == 3 and sys.version_info.minor < 10
- ):
- print(
- RED(
- (
- "WARNING: DTS execution node's python version is lower than"
- "python 3.10, is deprecated and will not work in future releases."
- )
- ),
- file=sys.stderr,
- )
- print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
def expand_range(range_str: str) -> list[int]:
@@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
return expanded_range
-def get_packet_summaries(packets: list[Packet]):
+def get_packet_summaries(packets: list[Packet]) -> str:
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
return f"Packet contents: \n{packet_summaries}"
-def RED(text: str) -> str:
- return f"\u001B[31;1m{str(text)}\u001B[0m"
+class StrEnum(Enum):
+ @staticmethod
+ def _generate_next_value_(
+ name: str, start: int, count: int, last_values: object
+ ) -> str:
+ return name
+
+ def __str__(self) -> str:
+ return self.name
class MesonArgs(object):
@@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
if self._tarball_path and os.path.exists(self._tarball_path):
os.remove(self._tarball_path)
- def __fspath__(self):
+ def __fspath__(self) -> str:
return str(self._tarball_path)
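The relocated `StrEnum` above relies on `enum.auto()` calling `_generate_next_value_`, so each member's value is its own name. A small sketch of how such an enum behaves (the `Compiler` members are made up for illustration):

```python
from enum import Enum, auto

class StrEnum(Enum):
    @staticmethod
    def _generate_next_value_(name, start, count, last_values):
        # auto() normally yields 1, 2, 3, ...; here it yields the member name.
        return name

    def __str__(self) -> str:
        return self.name

class Compiler(StrEnum):
    gcc = auto()
    clang = auto()

print(Compiler.gcc.value)   # gcc
print(str(Compiler.clang))  # clang
```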
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..5d4714b0c3 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,17 @@
import logging
-from framework import dts
+from framework import settings
def main() -> None:
+ """Set DTS settings, then run DTS.
+
+ The DTS settings are taken from the command line arguments and the environment variables.
+ """
+ settings.SETTINGS = settings.get_settings()
+ from framework import dts
+
dts.run_all()
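The ordering in `main()` above matters: settings must be populated before any module that reads `SETTINGS` at import time is imported, which is why `from framework import dts` is deferred into the function body. A single-file sketch of the pattern (simplified stand-ins, not the actual DTS modules):

```python
import argparse

# Import-safe defaults: importing this module (e.g. from Sphinx) parses nothing.
SETTINGS = {"verbose": False}

def get_settings(argv=None):
    # Argument parsing happens only when explicitly requested.
    parser = argparse.ArgumentParser()
    parser.add_argument("-v", "--verbose", action="store_true")
    return vars(parser.parse_args(argv))

def main(argv=None):
    global SETTINGS
    SETTINGS = get_settings(argv)
    # Only now would modules that read SETTINGS at import time be imported.
    return SETTINGS

print(main(["-v"]))  # {'verbose': True}
```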
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 01/21] dts: code adjustments for doc generation
2023-11-15 13:09 ` [PATCH v7 01/21] dts: code adjustments for doc generation Juraj Linkeš
@ 2023-11-16 21:04 ` Jeremy Spewock
2023-11-20 16:10 ` Juraj Linkeš
2023-11-20 16:02 ` Yoan Picchi
1 sibling, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-11-16 21:04 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
[-- Attachment #1: Type: text/plain, Size: 53719 bytes --]
On Wed, Nov 15, 2023 at 8:11 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> The standard Python tool for generating API documentation, Sphinx,
> imports modules one-by-one when generating the documentation. This
> requires code changes:
> * properly guarding argument parsing in the if __name__ == '__main__'
> block,
> * the logger used by DTS runner underwent the same treatment so that it
> doesn't create log files outside of a DTS run,
> * however, DTS uses the arguments to construct an object holding global
> variables. The defaults for the global variables needed to be moved
> from argument parsing elsewhere,
> * importing the remote_session module from framework resulted in
> circular imports because of one module trying to import another
> module. This is fixed by reorganizing the code,
> * some code reorganization was done because the resulting structure
> makes more sense, improving documentation clarity.
>
> There are some other changes which are documentation-related:
> * added missing type annotations so they appear in the generated docs,
> * reordered arguments in some methods,
> * removed superfluous arguments and attributes,
> * changed public functions/methods/attributes to private and vice-versa.
>
> All of the above appear in the generated documentation, and with them,
> the documentation is improved.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/config/__init__.py | 10 ++-
> dts/framework/dts.py | 33 +++++--
> dts/framework/exception.py | 54 +++++-------
> dts/framework/remote_session/__init__.py | 41 ++++-----
> .../interactive_remote_session.py | 0
> .../{remote => }/interactive_shell.py | 0
> .../{remote => }/python_shell.py | 0
> .../remote_session/remote/__init__.py | 27 ------
> .../{remote => }/remote_session.py | 0
> .../{remote => }/ssh_session.py | 12 +--
> .../{remote => }/testpmd_shell.py | 0
> dts/framework/settings.py | 87 +++++++++++--------
> dts/framework/test_result.py | 4 +-
> dts/framework/test_suite.py | 7 +-
> dts/framework/testbed_model/__init__.py | 12 +--
> dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
> dts/framework/testbed_model/hw/__init__.py | 27 ------
> .../linux_session.py | 6 +-
> dts/framework/testbed_model/node.py | 25 ++++--
> .../os_session.py | 22 ++---
> dts/framework/testbed_model/{hw => }/port.py | 0
> .../posix_session.py | 4 +-
> dts/framework/testbed_model/sut_node.py | 8 +-
> dts/framework/testbed_model/tg_node.py | 30 +------
> .../traffic_generator/__init__.py | 24 +++++
> .../capturing_traffic_generator.py | 6 +-
> .../{ => traffic_generator}/scapy.py | 23 ++---
> .../traffic_generator.py | 16 +++-
> .../testbed_model/{hw => }/virtual_device.py | 0
> dts/framework/utils.py | 46 +++-------
> dts/main.py | 9 +-
> 31 files changed, 258 insertions(+), 288 deletions(-)
> rename dts/framework/remote_session/{remote =>
> }/interactive_remote_session.py (100%)
> rename dts/framework/remote_session/{remote => }/interactive_shell.py
> (100%)
> rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
> delete mode 100644 dts/framework/remote_session/remote/__init__.py
> rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
> rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
> rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
> rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
> delete mode 100644 dts/framework/testbed_model/hw/__init__.py
> rename dts/framework/{remote_session => testbed_model}/linux_session.py
> (97%)
> rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
> rename dts/framework/testbed_model/{hw => }/port.py (100%)
> rename dts/framework/{remote_session => testbed_model}/posix_session.py
> (98%)
> create mode 100644
> dts/framework/testbed_model/traffic_generator/__init__.py
> rename dts/framework/testbed_model/{ =>
> traffic_generator}/capturing_traffic_generator.py (96%)
> rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
> rename dts/framework/testbed_model/{ =>
> traffic_generator}/traffic_generator.py (80%)
> rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
>
> diff --git a/dts/framework/config/__init__.py
> b/dts/framework/config/__init__.py
> index cb7e00ba34..2044c82611 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -17,6 +17,7 @@
> import warlock # type: ignore[import]
> import yaml
>
> +from framework.exception import ConfigurationError
> from framework.settings import SETTINGS
> from framework.utils import StrEnum
>
> @@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
> traffic_generator_type: TrafficGeneratorType
>
> @staticmethod
> - def from_dict(d: dict):
> + def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
> # This looks useless now, but is designed to allow expansion to
> traffic
> # generators that require more configuration later.
> match TrafficGeneratorType(d["type"]):
> @@ -97,6 +98,10 @@ def from_dict(d: dict):
> return ScapyTrafficGeneratorConfig(
> traffic_generator_type=TrafficGeneratorType.SCAPY
> )
> + case _:
> + raise ConfigurationError(
> + f'Unknown traffic generator type "{d["type"]}".'
> + )
>
>
> @dataclass(slots=True, frozen=True)
> @@ -324,6 +329,3 @@ def load_config() -> Configuration:
> config: dict[str, Any] = warlock.model_factory(schema,
> name="_Config")(config_data)
> config_obj: Configuration = Configuration.from_dict(dict(config))
> return config_obj
> -
> -
> -CONFIGURATION = load_config()
> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> index f773f0c38d..4c7fb0c40a 100644
> --- a/dts/framework/dts.py
> +++ b/dts/framework/dts.py
> @@ -6,19 +6,19 @@
> import sys
>
> from .config import (
> - CONFIGURATION,
> BuildTargetConfiguration,
> ExecutionConfiguration,
> TestSuiteConfig,
> + load_config,
> )
> from .exception import BlockingTestSuiteError
> from .logger import DTSLOG, getLogger
> from .test_result import BuildTargetResult, DTSResult, ExecutionResult,
> Result
> from .test_suite import get_test_suites
> from .testbed_model import SutNode, TGNode
> -from .utils import check_dts_python_version
>
> -dts_logger: DTSLOG = getLogger("DTSRunner")
> +# dummy defaults to satisfy linters
> +dts_logger: DTSLOG = None # type: ignore[assignment]
> result: DTSResult = DTSResult(dts_logger)
>
>
> @@ -30,14 +30,18 @@ def run_all() -> None:
> global dts_logger
> global result
>
> + # create a regular DTS logger and create a new result with it
> + dts_logger = getLogger("DTSRunner")
> + result = DTSResult(dts_logger)
> +
> # check the python version of the server that run dts
> - check_dts_python_version()
> + _check_dts_python_version()
>
> sut_nodes: dict[str, SutNode] = {}
> tg_nodes: dict[str, TGNode] = {}
> try:
> # for all Execution sections
> - for execution in CONFIGURATION.executions:
> + for execution in load_config().executions:
> sut_node = sut_nodes.get(
> execution.system_under_test_node.name)
> tg_node = tg_nodes.get(execution.traffic_generator_node.name)
>
> @@ -82,6 +86,25 @@ def run_all() -> None:
> _exit_dts()
>
>
> +def _check_dts_python_version() -> None:
> + def RED(text: str) -> str:
> + return f"\u001B[31;1m{str(text)}\u001B[0m"
> +
> + if sys.version_info.major < 3 or (
> + sys.version_info.major == 3 and sys.version_info.minor < 10
> + ):
> + print(
> + RED(
> + (
> + "WARNING: DTS execution node's python version is
> lower than"
> + "python 3.10, is deprecated and will not work in
> future releases."
> + )
> + ),
> + file=sys.stderr,
> + )
> + print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> +
> +
> def _run_execution(
> sut_node: SutNode,
> tg_node: TGNode,
> diff --git a/dts/framework/exception.py b/dts/framework/exception.py
> index 001a5a5496..7489c03570 100644
> --- a/dts/framework/exception.py
> +++ b/dts/framework/exception.py
> @@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
> Command execution timeout.
> """
>
> - command: str
> - output: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _command: str
>
> - def __init__(self, command: str, output: str):
> - self.command = command
> - self.output = output
> + def __init__(self, command: str):
> + self._command = command
>
> def __str__(self) -> str:
> - return f"TIMEOUT on {self.command}"
> -
> - def get_output(self) -> str:
> - return self.output
> + return f"TIMEOUT on {self._command}"
>
>
> class SSHConnectionError(DTSError):
> @@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
> SSH connection error.
> """
>
> - host: str
> - errors: list[str]
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _host: str
> + _errors: list[str]
>
> def __init__(self, host: str, errors: list[str] | None = None):
> - self.host = host
> - self.errors = [] if errors is None else errors
> + self._host = host
> + self._errors = [] if errors is None else errors
>
> def __str__(self) -> str:
> - message = f"Error trying to connect with {self.host}."
> - if self.errors:
> - message += f" Errors encountered while retrying: {',
> '.join(self.errors)}"
> + message = f"Error trying to connect with {self._host}."
> + if self._errors:
> + message += f" Errors encountered while retrying: {',
> '.join(self._errors)}"
>
> return message
>
> @@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
> It can no longer be used.
> """
>
> - host: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _host: str
>
> def __init__(self, host: str):
> - self.host = host
> + self._host = host
>
> def __str__(self) -> str:
> - return f"SSH session with {self.host} has died"
> + return f"SSH session with {self._host} has died"
>
>
> class ConfigurationError(DTSError):
> @@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
> Raised when a command executed on a Node returns a non-zero exit
> status.
> """
>
> - command: str
> - command_return_code: int
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
> + command: str
> + _command_return_code: int
>
> def __init__(self, command: str, command_return_code: int):
> self.command = command
> - self.command_return_code = command_return_code
> + self._command_return_code = command_return_code
>
> def __str__(self) -> str:
> return (
> f"Command {self.command} returned a non-zero exit code: "
> - f"{self.command_return_code}"
> + f"{self._command_return_code}"
> )
>
>
> @@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
> Used in test cases to verify the expected behavior.
> """
>
> - value: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
>
> - def __init__(self, value: str):
> - self.value = value
> -
> - def __str__(self) -> str:
> - return repr(self.value)
> -
>
Does this change mean we are no longer providing descriptions for what
failing the verification means? I guess there isn't really harm in removing
that functionality, but I'm not sure I see the value in removing the extra
information either.
>
> class BlockingTestSuiteError(DTSError):
> - suite_name: str
> severity: ClassVar[ErrorSeverity] =
> ErrorSeverity.BLOCKING_TESTSUITE_ERR
> + _suite_name: str
>
> def __init__(self, suite_name: str) -> None:
> - self.suite_name = suite_name
> + self._suite_name = suite_name
>
> def __str__(self) -> str:
> - return f"Blocking suite {self.suite_name} failed."
> + return f"Blocking suite {self._suite_name} failed."
> diff --git a/dts/framework/remote_session/__init__.py
> b/dts/framework/remote_session/__init__.py
> index 00b6d1f03a..5e7ddb2b05 100644
> --- a/dts/framework/remote_session/__init__.py
> +++ b/dts/framework/remote_session/__init__.py
> @@ -12,29 +12,24 @@
>
> # pylama:ignore=W0611
>
> -from framework.config import OS, NodeConfiguration
> -from framework.exception import ConfigurationError
> +from framework.config import NodeConfiguration
> from framework.logger import DTSLOG
>
> -from .linux_session import LinuxSession
> -from .os_session import InteractiveShellType, OSSession
> -from .remote import (
> - CommandResult,
> - InteractiveRemoteSession,
> - InteractiveShell,
> - PythonShell,
> - RemoteSession,
> - SSHSession,
> - TestPmdDevice,
> - TestPmdShell,
> -)
> -
> -
> -def create_session(
> +from .interactive_remote_session import InteractiveRemoteSession
> +from .interactive_shell import InteractiveShell
> +from .python_shell import PythonShell
> +from .remote_session import CommandResult, RemoteSession
> +from .ssh_session import SSHSession
> +from .testpmd_shell import TestPmdShell
> +
> +
> +def create_remote_session(
> node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> OSSession:
> - match node_config.os:
> - case OS.linux:
> - return LinuxSession(node_config, name, logger)
> - case _:
> - raise ConfigurationError(f"Unsupported OS {node_config.os}")
> +) -> RemoteSession:
> + return SSHSession(node_config, name, logger)
> +
> +
> +def create_interactive_session(
> + node_config: NodeConfiguration, logger: DTSLOG
> +) -> InteractiveRemoteSession:
> + return InteractiveRemoteSession(node_config, logger)
> diff --git
> a/dts/framework/remote_session/remote/interactive_remote_session.py
> b/dts/framework/remote_session/interactive_remote_session.py
> similarity index 100%
> rename from
> dts/framework/remote_session/remote/interactive_remote_session.py
> rename to dts/framework/remote_session/interactive_remote_session.py
> diff --git a/dts/framework/remote_session/remote/interactive_shell.py
> b/dts/framework/remote_session/interactive_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/interactive_shell.py
> rename to dts/framework/remote_session/interactive_shell.py
> diff --git a/dts/framework/remote_session/remote/python_shell.py
> b/dts/framework/remote_session/python_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/python_shell.py
> rename to dts/framework/remote_session/python_shell.py
> diff --git a/dts/framework/remote_session/remote/__init__.py
> b/dts/framework/remote_session/remote/__init__.py
> deleted file mode 100644
> index 06403691a5..0000000000
> --- a/dts/framework/remote_session/remote/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -# Copyright(c) 2023 University of New Hampshire
> -
> -# pylama:ignore=W0611
> -
> -from framework.config import NodeConfiguration
> -from framework.logger import DTSLOG
> -
> -from .interactive_remote_session import InteractiveRemoteSession
> -from .interactive_shell import InteractiveShell
> -from .python_shell import PythonShell
> -from .remote_session import CommandResult, RemoteSession
> -from .ssh_session import SSHSession
> -from .testpmd_shell import TestPmdDevice, TestPmdShell
> -
> -
> -def create_remote_session(
> - node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> RemoteSession:
> - return SSHSession(node_config, name, logger)
> -
> -
> -def create_interactive_session(
> - node_config: NodeConfiguration, logger: DTSLOG
> -) -> InteractiveRemoteSession:
> - return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/remote_session.py
> b/dts/framework/remote_session/remote_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/remote_session.py
> rename to dts/framework/remote_session/remote_session.py
> diff --git a/dts/framework/remote_session/remote/ssh_session.py
> b/dts/framework/remote_session/ssh_session.py
> similarity index 91%
> rename from dts/framework/remote_session/remote/ssh_session.py
> rename to dts/framework/remote_session/ssh_session.py
> index 8d127f1601..cee11d14d6 100644
> --- a/dts/framework/remote_session/remote/ssh_session.py
> +++ b/dts/framework/remote_session/ssh_session.py
> @@ -18,9 +18,7 @@
> SSHException,
> )
>
> -from framework.config import NodeConfiguration
> from framework.exception import SSHConnectionError, SSHSessionDeadError,
> SSHTimeoutError
> -from framework.logger import DTSLOG
>
> from .remote_session import CommandResult, RemoteSession
>
> @@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
>
> session: Connection
>
> - def __init__(
> - self,
> - node_config: NodeConfiguration,
> - session_name: str,
> - logger: DTSLOG,
> - ):
> - super(SSHSession, self).__init__(node_config, session_name,
> logger)
> -
> def _connect(self) -> None:
> errors = []
> retry_attempts = 10
> @@ -117,7 +107,7 @@ def _send_command(
>
> except CommandTimedOut as e:
> self._logger.exception(e)
> - raise SSHTimeoutError(command, e.result.stderr) from e
> + raise SSHTimeoutError(command) from e
>
> return CommandResult(
> self.name, command, output.stdout, output.stderr,
> output.return_code
> diff --git a/dts/framework/remote_session/remote/testpmd_shell.py
> b/dts/framework/remote_session/testpmd_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/testpmd_shell.py
> rename to dts/framework/remote_session/testpmd_shell.py
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index cfa39d011b..7f5841d073 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -6,7 +6,7 @@
> import argparse
> import os
> from collections.abc import Callable, Iterable, Sequence
> -from dataclasses import dataclass
> +from dataclasses import dataclass, field
> from pathlib import Path
> from typing import Any, TypeVar
>
> @@ -22,8 +22,8 @@ def __init__(
> option_strings: Sequence[str],
> dest: str,
> nargs: str | int | None = None,
> - const: str | None = None,
> - default: str = None,
> + const: bool | None = None,
> + default: Any = None,
> type: Callable[[str], _T | argparse.FileType | None] = None,
> choices: Iterable[_T] | None = None,
> required: bool = False,
> @@ -32,6 +32,12 @@ def __init__(
> ) -> None:
> env_var_value = os.environ.get(env_var)
> default = env_var_value or default
> + if const is not None:
> + nargs = 0
> + default = const if env_var_value else default
> + type = None
> + choices = None
> + metavar = None
> super(_EnvironmentArgument, self).__init__(
> option_strings,
> dest,
> @@ -52,22 +58,28 @@ def __call__(
> values: Any,
> option_string: str = None,
> ) -> None:
> - setattr(namespace, self.dest, values)
> + if self.const is not None:
> + setattr(namespace, self.dest, self.const)
> + else:
> + setattr(namespace, self.dest, values)
>
> return _EnvironmentArgument
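The `_EnvironmentArgument` changes above turn `const` into a flag marker and let an environment variable override the hardcoded default. A reduced sketch of the env-var fallback half (note that argparse runs string defaults through `type`, so the env value is converted like a command-line value):

```python
import argparse
import os

def env_arg(env_var):
    """Build an argparse Action whose default can come from an env var."""
    class _EnvArg(argparse.Action):
        def __init__(self, option_strings, dest, default=None, **kwargs):
            # The environment variable, when set, overrides the default.
            default = os.environ.get(env_var, default)
            super().__init__(option_strings, dest, default=default, **kwargs)

        def __call__(self, parser, namespace, values, option_string=None):
            setattr(namespace, self.dest, values)

    return _EnvArg

os.environ["DTS_TIMEOUT"] = "30"
parser = argparse.ArgumentParser()
parser.add_argument("--timeout", action=env_arg("DTS_TIMEOUT"),
                    default=15, type=float)
args = parser.parse_args([])
print(args.timeout)  # 30.0 -- the string default is run through `type`
```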
>
>
> -@dataclass(slots=True, frozen=True)
> -class _Settings:
> - config_file_path: str
> - output_dir: str
> - timeout: float
> - verbose: bool
> - skip_setup: bool
> - dpdk_tarball_path: Path
> - compile_timeout: float
> - test_cases: list
> - re_run: int
> +@dataclass(slots=True)
> +class Settings:
> + config_file_path: Path =
> Path(__file__).parent.parent.joinpath("conf.yaml")
> + output_dir: str = "output"
> + timeout: float = 15
> + verbose: bool = False
> + skip_setup: bool = False
> + dpdk_tarball_path: Path | str = "dpdk.tar.xz"
> + compile_timeout: float = 1200
> + test_cases: list[str] = field(default_factory=list)
> + re_run: int = 0
> +
> +
> +SETTINGS: Settings = Settings()
>
>
> def _get_parser() -> argparse.ArgumentParser:
> @@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
> parser.add_argument(
> "--config-file",
> action=_env_arg("DTS_CFG_FILE"),
> - default="conf.yaml",
> + default=SETTINGS.config_file_path,
> + type=Path,
> help="[DTS_CFG_FILE] configuration file that describes the test
> cases, SUTs "
> "and targets.",
> )
> @@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--output-dir",
> "--output",
> action=_env_arg("DTS_OUTPUT_DIR"),
> - default="output",
> + default=SETTINGS.output_dir,
> help="[DTS_OUTPUT_DIR] Output directory where dts logs and
> results are saved.",
> )
>
> @@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "-t",
> "--timeout",
> action=_env_arg("DTS_TIMEOUT"),
> - default=15,
> + default=SETTINGS.timeout,
> type=float,
> help="[DTS_TIMEOUT] The default timeout for all DTS operations
> except for "
> "compiling DPDK.",
> @@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
> "-v",
> "--verbose",
> action=_env_arg("DTS_VERBOSE"),
> - default="N",
> - help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging
> all messages "
> + default=SETTINGS.verbose,
> + const=True,
> + help="[DTS_VERBOSE] Specify to enable verbose output, logging all
> messages "
> "to the console.",
> )
>
> @@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
> "-s",
> "--skip-setup",
> action=_env_arg("DTS_SKIP_SETUP"),
> - default="N",
> - help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT
> and TG nodes.",
> + const=True,
> + help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and
> TG nodes.",
> )
>
> parser.add_argument(
> @@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--snapshot",
> "--git-ref",
> action=_env_arg("DTS_DPDK_TARBALL"),
> - default="dpdk.tar.xz",
> + default=SETTINGS.dpdk_tarball_path,
> type=Path,
> help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a
> git commit ID, "
> "tag ID or tree ID to test. To test local changes, first commit
> them, "
> @@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
> parser.add_argument(
> "--compile-timeout",
> action=_env_arg("DTS_COMPILE_TIMEOUT"),
> - default=1200,
> + default=SETTINGS.compile_timeout,
> type=float,
> help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
> )
> @@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--re-run",
> "--re_run",
> action=_env_arg("DTS_RERUN"),
> - default=0,
> + default=SETTINGS.re_run,
> type=int,
> help="[DTS_RERUN] Re-run each test case the specified amount of
> times "
> "if a test failure occurs",
> @@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
> return parser
>
>
> -def _get_settings() -> _Settings:
> +def get_settings() -> Settings:
> parsed_args = _get_parser().parse_args()
> - return _Settings(
> + return Settings(
> config_file_path=parsed_args.config_file,
> output_dir=parsed_args.output_dir,
> timeout=parsed_args.timeout,
> - verbose=(parsed_args.verbose == "Y"),
> - skip_setup=(parsed_args.skip_setup == "Y"),
> + verbose=parsed_args.verbose,
> + skip_setup=parsed_args.skip_setup,
> dpdk_tarball_path=Path(
> - DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
> - )
> - if not os.path.exists(parsed_args.tarball)
> - else Path(parsed_args.tarball),
> + Path(DPDKGitTarball(parsed_args.tarball,
> parsed_args.output_dir))
> + if not os.path.exists(parsed_args.tarball)
> + else Path(parsed_args.tarball)
> + ),
> compile_timeout=parsed_args.compile_timeout,
> - test_cases=parsed_args.test_cases.split(",") if
> parsed_args.test_cases else [],
> + test_cases=(
> + parsed_args.test_cases.split(",") if parsed_args.test_cases
> else []
> + ),
> re_run=parsed_args.re_run,
> )
> -
> -
> -SETTINGS: _Settings = _get_settings()
> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> index f0fbe80f6f..603e18872c 100644
> --- a/dts/framework/test_result.py
> +++ b/dts/framework/test_result.py
> @@ -254,7 +254,7 @@ def add_build_target(
> self._inner_results.append(build_target_result)
> return build_target_result
>
> - def add_sut_info(self, sut_info: NodeInfo):
> + def add_sut_info(self, sut_info: NodeInfo) -> None:
> self.sut_os_name = sut_info.os_name
> self.sut_os_version = sut_info.os_version
> self.sut_kernel_version = sut_info.kernel_version
> @@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration)
> -> ExecutionResult:
> self._inner_results.append(execution_result)
> return execution_result
>
> - def add_error(self, error) -> None:
> + def add_error(self, error: Exception) -> None:
> self._errors.append(error)
>
> def process(self) -> None:
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index 3b890c0451..d53553bf34 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -11,7 +11,7 @@
> import re
> from ipaddress import IPv4Interface, IPv6Interface, ip_interface
> from types import MethodType
> -from typing import Union
> +from typing import Any, Union
>
> from scapy.layers.inet import IP # type: ignore[import]
> from scapy.layers.l2 import Ether # type: ignore[import]
> @@ -26,8 +26,7 @@
> from .logger import DTSLOG, getLogger
> from .settings import SETTINGS
> from .test_result import BuildTargetResult, Result, TestCaseResult,
> TestSuiteResult
> -from .testbed_model import SutNode, TGNode
> -from .testbed_model.hw.port import Port, PortLink
> +from .testbed_model import Port, PortLink, SutNode, TGNode
> from .utils import get_packet_summaries
>
>
> @@ -453,7 +452,7 @@ def _execute_test_case(
>
>
> def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
> - def is_test_suite(object) -> bool:
> + def is_test_suite(object: Any) -> bool:
> try:
> if issubclass(object, TestSuite) and object is not TestSuite:
> return True
> diff --git a/dts/framework/testbed_model/__init__.py
> b/dts/framework/testbed_model/__init__.py
> index 5cbb859e47..8ced05653b 100644
> --- a/dts/framework/testbed_model/__init__.py
> +++ b/dts/framework/testbed_model/__init__.py
> @@ -9,15 +9,9 @@
>
> # pylama:ignore=W0611
>
> -from .hw import (
> - LogicalCore,
> - LogicalCoreCount,
> - LogicalCoreCountFilter,
> - LogicalCoreList,
> - LogicalCoreListFilter,
> - VirtualDevice,
> - lcore_filter,
> -)
> +from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
> from .node import Node
> +from .port import Port, PortLink
> from .sut_node import SutNode
> from .tg_node import TGNode
> +from .virtual_device import VirtualDevice
> diff --git a/dts/framework/testbed_model/hw/cpu.py
> b/dts/framework/testbed_model/cpu.py
> similarity index 95%
> rename from dts/framework/testbed_model/hw/cpu.py
> rename to dts/framework/testbed_model/cpu.py
> index d1918a12dc..8fe785dfe4 100644
> --- a/dts/framework/testbed_model/hw/cpu.py
> +++ b/dts/framework/testbed_model/cpu.py
> @@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
> )
>
> return filtered_lcores
> +
> +
> +def lcore_filter(
> + core_list: list[LogicalCore],
> + filter_specifier: LogicalCoreCount | LogicalCoreList,
> + ascending: bool,
> +) -> LogicalCoreFilter:
> + if isinstance(filter_specifier, LogicalCoreList):
> + return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> + elif isinstance(filter_specifier, LogicalCoreCount):
> + return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> + else:
> + raise ValueError(f"Unsupported filter r{filter_specifier}")
> diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
> deleted file mode 100644
> index 88ccac0b0e..0000000000
> --- a/dts/framework/testbed_model/hw/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -
> -# pylama:ignore=W0611
> -
> -from .cpu import (
> - LogicalCore,
> - LogicalCoreCount,
> - LogicalCoreCountFilter,
> - LogicalCoreFilter,
> - LogicalCoreList,
> - LogicalCoreListFilter,
> -)
> -from .virtual_device import VirtualDevice
> -
> -
> -def lcore_filter(
> - core_list: list[LogicalCore],
> - filter_specifier: LogicalCoreCount | LogicalCoreList,
> - ascending: bool,
> -) -> LogicalCoreFilter:
> - if isinstance(filter_specifier, LogicalCoreList):
> - return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> - elif isinstance(filter_specifier, LogicalCoreCount):
> - return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> - else:
> - raise ValueError(f"Unsupported filter r{filter_specifier}")
> diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
> similarity index 97%
> rename from dts/framework/remote_session/linux_session.py
> rename to dts/framework/testbed_model/linux_session.py
> index a3f1a6bf3b..f472bb8f0f 100644
> --- a/dts/framework/remote_session/linux_session.py
> +++ b/dts/framework/testbed_model/linux_session.py
> @@ -9,10 +9,10 @@
> from typing_extensions import NotRequired
>
> from framework.exception import RemoteCommandExecutionError
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
> from framework.utils import expand_range
>
> +from .cpu import LogicalCore
> +from .port import Port
> from .posix_session import PosixSession
>
>
> @@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> lcores.append(LogicalCore(lcore, core, socket, node))
> return lcores
>
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> return dpdk_prefix
>
> def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index fc01e0bf8e..fa5b143cdd 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -12,23 +12,26 @@
> from typing import Any, Callable, Type, Union
>
> from framework.config import (
> + OS,
> BuildTargetConfiguration,
> ExecutionConfiguration,
> NodeConfiguration,
> )
> +from framework.exception import ConfigurationError
> from framework.logger import DTSLOG, getLogger
> -from framework.remote_session import InteractiveShellType, OSSession, create_session
> from framework.settings import SETTINGS
>
> -from .hw import (
> +from .cpu import (
> LogicalCore,
> LogicalCoreCount,
> LogicalCoreList,
> LogicalCoreListFilter,
> - VirtualDevice,
> lcore_filter,
> )
> -from .hw.port import Port
> +from .linux_session import LinuxSession
> +from .os_session import InteractiveShellType, OSSession
> +from .port import Port
> +from .virtual_device import VirtualDevice
>
>
> class Node(ABC):
> @@ -172,9 +175,9 @@ def create_interactive_shell(
>
> return self.main_session.create_interactive_shell(
> shell_cls,
> - app_args,
> timeout,
> privileged,
> + app_args,
> )
>
> def filter_lcores(
> @@ -205,7 +208,7 @@ def _get_remote_cpus(self) -> None:
> self._logger.info("Getting CPU information.")
> self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
>
> - def _setup_hugepages(self):
> + def _setup_hugepages(self) -> None:
> """
> Setup hugepages on the Node. Different architectures can supply different
> amounts of memory for hugepages and numa-based hugepage allocation may need
> @@ -249,3 +252,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
> return lambda *args: None
> else:
> return func
> +
> +
> +def create_session(
> + node_config: NodeConfiguration, name: str, logger: DTSLOG
> +) -> OSSession:
> + match node_config.os:
> + case OS.linux:
> + return LinuxSession(node_config, name, logger)
> + case _:
> + raise ConfigurationError(f"Unsupported OS {node_config.os}")
> diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
> similarity index 95%
> rename from dts/framework/remote_session/os_session.py
> rename to dts/framework/testbed_model/os_session.py
> index 8a709eac1c..76e595a518 100644
> --- a/dts/framework/remote_session/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -10,19 +10,19 @@
>
> from framework.config import Architecture, NodeConfiguration, NodeInfo
> from framework.logger import DTSLOG
> -from framework.remote_session.remote import InteractiveShell
> -from framework.settings import SETTINGS
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
> -from framework.utils import MesonArgs
> -
> -from .remote import (
> +from framework.remote_session import (
> CommandResult,
> InteractiveRemoteSession,
> + InteractiveShell,
> RemoteSession,
> create_interactive_session,
> create_remote_session,
> )
> +from framework.settings import SETTINGS
> +from framework.utils import MesonArgs
> +
> +from .cpu import LogicalCore
> +from .port import Port
>
> InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
>
> @@ -85,9 +85,9 @@ def send_command(
> def create_interactive_shell(
> self,
> shell_cls: Type[InteractiveShellType],
> - eal_parameters: str,
> timeout: float,
> privileged: bool,
> + app_args: str,
> ) -> InteractiveShellType:
> """
> See "create_interactive_shell" in SutNode
> @@ -96,7 +96,7 @@ def create_interactive_shell(
> self.interactive_session.session,
> self._logger,
> self._get_privileged_command if privileged else None,
> - eal_parameters,
> + app_args,
> timeout,
> )
>
> @@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
> """
>
> @abstractmethod
> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
> """
> Try to find DPDK remote dir in remote_dir.
> """
> @@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> """
>
> @abstractmethod
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> """
> Get the DPDK file prefix that will be used when running DPDK apps.
> """
> diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/port.py
> rename to dts/framework/testbed_model/port.py
> diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
> similarity index 98%
> rename from dts/framework/remote_session/posix_session.py
> rename to dts/framework/testbed_model/posix_session.py
> index 5da0516e05..1d1d5b1b26 100644
> --- a/dts/framework/remote_session/posix_session.py
> +++ b/dts/framework/testbed_model/posix_session.py
> @@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
>
> return ret_opts
>
> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
> remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
> result = self.send_command(f"ls -d {remote_guess} | tail -1")
> return PurePosixPath(result.stdout)
> @@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
> for dpdk_runtime_dir in dpdk_runtime_dirs:
> self.remove_remote_dir(dpdk_runtime_dir)
>
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> return ""
>
> def get_compiler_version(self, compiler_name: str) -> str:
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index 4161d3a4d5..17deea06e2 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -15,12 +15,14 @@
> NodeInfo,
> SutNodeConfiguration,
> )
> -from framework.remote_session import CommandResult, InteractiveShellType, OSSession
> +from framework.remote_session import CommandResult
> from framework.settings import SETTINGS
> from framework.utils import MesonArgs
>
> -from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
> +from .cpu import LogicalCoreCount, LogicalCoreList
> from .node import Node
> +from .os_session import InteractiveShellType, OSSession
> +from .virtual_device import VirtualDevice
>
>
> class EalParameters(object):
> @@ -307,7 +309,7 @@ def create_eal_parameters(
> prefix: str = "dpdk",
> append_prefix_timestamp: bool = True,
> no_pci: bool = False,
> - vdevs: list[VirtualDevice] = None,
> + vdevs: list[VirtualDevice] | None = None,
> other_eal_param: str = "",
> ) -> "EalParameters":
> """
> diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
> index 27025cfa31..166eb8430e 100644
> --- a/dts/framework/testbed_model/tg_node.py
> +++ b/dts/framework/testbed_model/tg_node.py
> @@ -16,16 +16,11 @@
>
> from scapy.packet import Packet # type: ignore[import]
>
> -from framework.config import (
> - ScapyTrafficGeneratorConfig,
> - TGNodeConfiguration,
> - TrafficGeneratorType,
> -)
> -from framework.exception import ConfigurationError
> -
> -from .capturing_traffic_generator import CapturingTrafficGenerator
> -from .hw.port import Port
> +from framework.config import TGNodeConfiguration
> +
> from .node import Node
> +from .port import Port
> +from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
>
>
> class TGNode(Node):
> @@ -80,20 +75,3 @@ def close(self) -> None:
> """Free all resources used by the node"""
> self.traffic_generator.close()
> super(TGNode, self).close()
> -
> -
> -def create_traffic_generator(
> - tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
> -) -> CapturingTrafficGenerator:
> - """A factory function for creating traffic generator object from user config."""
> -
> - from .scapy import ScapyTrafficGenerator
> -
> - match traffic_generator_config.traffic_generator_type:
> - case TrafficGeneratorType.SCAPY:
> - return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> - case _:
> - raise ConfigurationError(
> - "Unknown traffic generator: "
> - f"{traffic_generator_config.traffic_generator_type}"
> - )
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> new file mode 100644
> index 0000000000..11bfa1ee0f
> --- /dev/null
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -0,0 +1,24 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
> +from framework.exception import ConfigurationError
> +from framework.testbed_model.node import Node
> +
> +from .capturing_traffic_generator import CapturingTrafficGenerator
> +from .scapy import ScapyTrafficGenerator
> +
> +
> +def create_traffic_generator(
> + tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
> +) -> CapturingTrafficGenerator:
> + """A factory function for creating traffic generator object from user config."""
> +
> + match traffic_generator_config.traffic_generator_type:
> + case TrafficGeneratorType.SCAPY:
> + return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> + case _:
> + raise ConfigurationError(
> + "Unknown traffic generator: "
> + f"{traffic_generator_config.traffic_generator_type}"
> + )
> diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> similarity index 96%
> rename from dts/framework/testbed_model/capturing_traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> index ab98987f8e..e521211ef0 100644
> --- a/dts/framework/testbed_model/capturing_traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> @@ -16,9 +16,9 @@
> from scapy.packet import Packet # type: ignore[import]
>
> from framework.settings import SETTINGS
> +from framework.testbed_model.port import Port
> from framework.utils import get_packet_summaries
>
> -from .hw.port import Port
> from .traffic_generator import TrafficGenerator
>
>
> @@ -130,7 +130,9 @@ def _send_packets_and_capture(
> for the specified duration. It must be able to handle no received packets.
> """
>
> - def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
> + def _write_capture_from_packets(
> + self, capture_name: str, packets: list[Packet]
> + ) -> None:
> file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
> self._logger.debug(f"Writing packets to {file_name}.")
> scapy.utils.wrpcap(file_name, packets)
> diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
> similarity index 95%
> rename from dts/framework/testbed_model/scapy.py
> rename to dts/framework/testbed_model/traffic_generator/scapy.py
> index af0d4dbb25..51864b6e6b 100644
> --- a/dts/framework/testbed_model/scapy.py
> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> @@ -24,16 +24,15 @@
> from scapy.packet import Packet # type: ignore[import]
>
> from framework.config import OS, ScapyTrafficGeneratorConfig
> -from framework.logger import DTSLOG, getLogger
> from framework.remote_session import PythonShell
> from framework.settings import SETTINGS
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
>
> from .capturing_traffic_generator import (
> CapturingTrafficGenerator,
> _get_default_capture_name,
> )
> -from .hw.port import Port
> -from .tg_node import TGNode
>
> """
> ========= BEGIN RPC FUNCTIONS =========
> @@ -146,7 +145,7 @@ def quit(self) -> None:
> self._BaseServer__shutdown_request = True
> return None
>
> - def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
> + def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
> """Add a function to the server.
>
> This is meant to be executed remotely.
> @@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
> session: PythonShell
> rpc_server_proxy: xmlrpc.client.ServerProxy
> _config: ScapyTrafficGeneratorConfig
> - _tg_node: TGNode
> - _logger: DTSLOG
> -
> - def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
> - self._config = config
> - self._tg_node = tg_node
> - self._logger = getLogger(
> - f"{self._tg_node.name} {self._config.traffic_generator_type}"
> - )
> +
> + def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
> + super().__init__(tg_node, config)
>
> assert (
> self._tg_node.config.os == OS.linux
> @@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
> function_bytes = marshal.dumps(function.__code__)
> self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
>
> - def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
> + def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
> # load the source of the function
> src = inspect.getsource(QuittableXMLRPCServer)
> # Lines with only whitespace break the repl if in the middle of a function
> @@ -280,7 +273,7 @@ def _send_packets_and_capture(
> scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
> return scapy_packets
>
> - def close(self):
> + def close(self) -> None:
> try:
> self.rpc_server_proxy.quit()
> except ConnectionRefusedError:
> diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> similarity index 80%
> rename from dts/framework/testbed_model/traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index 28c35d3ce4..ea7c3963da 100644
> --- a/dts/framework/testbed_model/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -12,11 +12,12 @@
>
> from scapy.packet import Packet # type: ignore[import]
>
> -from framework.logger import DTSLOG
> +from framework.config import TrafficGeneratorConfig
> +from framework.logger import DTSLOG, getLogger
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
> from framework.utils import get_packet_summaries
>
> -from .hw.port import Port
> -
>
> class TrafficGenerator(ABC):
> """The base traffic generator.
> @@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
> Defines the few basic methods that each traffic generator must implement.
> """
>
> + _config: TrafficGeneratorConfig
> + _tg_node: Node
>
Is there a benefit to changing this to be a Node instead of a TGNode?
Wouldn't we want the capabilities of the TGNode to be accessible in the
TrafficGenerator class?
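To make the trade-off concrete, here is a minimal sketch (hypothetical classes, not the actual DTS code) of what typing the attribute as the base `Node` means for a type checker:

```python
class Node:
    def __init__(self, name: str) -> None:
        self.name = name


class TGNode(Node):
    def capture_port(self) -> str:  # hypothetical TGNode-only capability
        return f"capturing on {self.name}"


class TrafficGenerator:
    def __init__(self, tg_node: Node) -> None:
        # Typed as the base Node: TGNode-only attributes are not visible
        # to a type checker through this attribute.
        self._tg_node = tg_node

    def describe(self) -> str:
        # Calling self._tg_node.capture_port() directly would fail type
        # checking here; runtime narrowing (or a cast) is needed to reach
        # TGNode-specific capabilities.
        if isinstance(self._tg_node, TGNode):
            return self._tg_node.capture_port()
        return self._tg_node.name
```

With `_tg_node: Node`, TGNode-specific capabilities require a cast or an isinstance check; with `_tg_node: TGNode`, they are directly available, at the cost of coupling the base class to the subclass.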
> _logger: DTSLOG
>
> + def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> + self._config = config
> + self._tg_node = tg_node
> + self._logger = getLogger(
> + f"{self._tg_node.name} {self._config.traffic_generator_type}"
> + )
> +
> def send_packet(self, packet: Packet, port: Port) -> None:
> """Send a packet and block until it is fully sent.
>
> diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/virtual_device.py
> rename to dts/framework/testbed_model/virtual_device.py
> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> index d27c2c5b5f..f0c916471c 100644
> --- a/dts/framework/utils.py
> +++ b/dts/framework/utils.py
> @@ -7,7 +7,6 @@
> import json
> import os
> import subprocess
> -import sys
> from enum import Enum
> from pathlib import Path
> from subprocess import SubprocessError
> @@ -16,35 +15,7 @@
>
> from .exception import ConfigurationError
>
> -
> -class StrEnum(Enum):
> - @staticmethod
> - def _generate_next_value_(
> - name: str, start: int, count: int, last_values: object
> - ) -> str:
> - return name
> -
> - def __str__(self) -> str:
> - return self.name
> -
> -
> -REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> -
> -
> -def check_dts_python_version() -> None:
> - if sys.version_info.major < 3 or (
> - sys.version_info.major == 3 and sys.version_info.minor < 10
> - ):
> - print(
> - RED(
> - (
> "WARNING: DTS execution node's python version is lower than"
> "python 3.10, is deprecated and will not work in future releases."
> - )
> - ),
> - file=sys.stderr,
> - )
> - print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> +REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
>
>
> def expand_range(range_str: str) -> list[int]:
> @@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
> return expanded_range
>
>
> -def get_packet_summaries(packets: list[Packet]):
> +def get_packet_summaries(packets: list[Packet]) -> str:
> if len(packets) == 1:
> packet_summaries = packets[0].summary()
> else:
> @@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
> return f"Packet contents: \n{packet_summaries}"
>
>
> -def RED(text: str) -> str:
> - return f"\u001B[31;1m{str(text)}\u001B[0m"
> +class StrEnum(Enum):
> + @staticmethod
> + def _generate_next_value_(
> + name: str, start: int, count: int, last_values: object
> + ) -> str:
> + return name
> +
> + def __str__(self) -> str:
> + return self.name
>
>
> class MesonArgs(object):
> @@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
> if self._tarball_path and os.path.exists(self._tarball_path):
> os.remove(self._tarball_path)
>
> - def __fspath__(self):
> + def __fspath__(self) -> str:
> return str(self._tarball_path)
> diff --git a/dts/main.py b/dts/main.py
> index 43311fa847..5d4714b0c3 100755
> --- a/dts/main.py
> +++ b/dts/main.py
> @@ -10,10 +10,17 @@
>
> import logging
>
> -from framework import dts
> +from framework import settings
>
>
> def main() -> None:
> + """Set DTS settings, then run DTS.
> +
> + The DTS settings are taken from the command line arguments and the environment variables.
> + """
> + settings.SETTINGS = settings.get_settings()
> + from framework import dts
> +
> dts.run_all()
>
>
> --
> 2.34.1
>
>
* Re: [PATCH v7 01/21] dts: code adjustments for doc generation
2023-11-16 21:04 ` Jeremy Spewock
@ 2023-11-20 16:10 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-20 16:10 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
On Thu, Nov 16, 2023 at 10:05 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
>
>
> On Wed, Nov 15, 2023 at 8:11 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> The standard Python tool for generating API documentation, Sphinx,
>> imports modules one-by-one when generating the documentation. This
>> requires code changes:
>> * properly guarding argument parsing in the if __name__ == '__main__'
>> block,
>> * the logger used by DTS runner underwent the same treatment so that it
>> doesn't create log files outside of a DTS run,
>> * however, DTS uses the arguments to construct an object holding global
>> variables. The defaults for the global variables needed to be moved
>> from argument parsing elsewhere,
>> * importing the remote_session module from framework resulted in
>> circular imports because of one module trying to import another
>> module. This is fixed by reorganizing the code,
>> * some code reorganization was done because the resulting structure
>> makes more sense, improving documentation clarity.
>>
>> There are some other changes which are documentation-related:
>> * added missing type annotation so they appear in the generated docs,
>> * reordered arguments in some methods,
>> * removed superfluous arguments and attributes,
>> * changed public functions/methods/attributes to private and vice-versa.
>>
>> All of the above appear in the generated documentation and, with them,
>> the documentation is improved.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> dts/framework/config/__init__.py | 10 ++-
>> dts/framework/dts.py | 33 +++++--
>> dts/framework/exception.py | 54 +++++-------
>> dts/framework/remote_session/__init__.py | 41 ++++-----
>> .../interactive_remote_session.py | 0
>> .../{remote => }/interactive_shell.py | 0
>> .../{remote => }/python_shell.py | 0
>> .../remote_session/remote/__init__.py | 27 ------
>> .../{remote => }/remote_session.py | 0
>> .../{remote => }/ssh_session.py | 12 +--
>> .../{remote => }/testpmd_shell.py | 0
>> dts/framework/settings.py | 87 +++++++++++--------
>> dts/framework/test_result.py | 4 +-
>> dts/framework/test_suite.py | 7 +-
>> dts/framework/testbed_model/__init__.py | 12 +--
>> dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
>> dts/framework/testbed_model/hw/__init__.py | 27 ------
>> .../linux_session.py | 6 +-
>> dts/framework/testbed_model/node.py | 25 ++++--
>> .../os_session.py | 22 ++---
>> dts/framework/testbed_model/{hw => }/port.py | 0
>> .../posix_session.py | 4 +-
>> dts/framework/testbed_model/sut_node.py | 8 +-
>> dts/framework/testbed_model/tg_node.py | 30 +------
>> .../traffic_generator/__init__.py | 24 +++++
>> .../capturing_traffic_generator.py | 6 +-
>> .../{ => traffic_generator}/scapy.py | 23 ++---
>> .../traffic_generator.py | 16 +++-
>> .../testbed_model/{hw => }/virtual_device.py | 0
>> dts/framework/utils.py | 46 +++-------
>> dts/main.py | 9 +-
>> 31 files changed, 258 insertions(+), 288 deletions(-)
>> rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
>> rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
>> rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
>> delete mode 100644 dts/framework/remote_session/remote/__init__.py
>> rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
>> rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
>> rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
>> rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
>> delete mode 100644 dts/framework/testbed_model/hw/__init__.py
>> rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
>> rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
>> rename dts/framework/testbed_model/{hw => }/port.py (100%)
>> rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
>> create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
>> rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
>> rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
>> rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
>> rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
>>
>> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
>> index cb7e00ba34..2044c82611 100644
>> --- a/dts/framework/config/__init__.py
>> +++ b/dts/framework/config/__init__.py
>> @@ -17,6 +17,7 @@
>> import warlock # type: ignore[import]
>> import yaml
>>
>> +from framework.exception import ConfigurationError
>> from framework.settings import SETTINGS
>> from framework.utils import StrEnum
>>
>> @@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
>> traffic_generator_type: TrafficGeneratorType
>>
>> @staticmethod
>> - def from_dict(d: dict):
>> + def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
>> # This looks useless now, but is designed to allow expansion to traffic
>> # generators that require more configuration later.
>> match TrafficGeneratorType(d["type"]):
>> @@ -97,6 +98,10 @@ def from_dict(d: dict):
>> return ScapyTrafficGeneratorConfig(
>> traffic_generator_type=TrafficGeneratorType.SCAPY
>> )
>> + case _:
>> + raise ConfigurationError(
>> + f'Unknown traffic generator type "{d["type"]}".'
>> + )
>>
>>
>> @dataclass(slots=True, frozen=True)
>> @@ -324,6 +329,3 @@ def load_config() -> Configuration:
>> config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
>> config_obj: Configuration = Configuration.from_dict(dict(config))
>> return config_obj
>> -
>> -
>> -CONFIGURATION = load_config()
>> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
>> index f773f0c38d..4c7fb0c40a 100644
>> --- a/dts/framework/dts.py
>> +++ b/dts/framework/dts.py
>> @@ -6,19 +6,19 @@
>> import sys
>>
>> from .config import (
>> - CONFIGURATION,
>> BuildTargetConfiguration,
>> ExecutionConfiguration,
>> TestSuiteConfig,
>> + load_config,
>> )
>> from .exception import BlockingTestSuiteError
>> from .logger import DTSLOG, getLogger
>> from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
>> from .test_suite import get_test_suites
>> from .testbed_model import SutNode, TGNode
>> -from .utils import check_dts_python_version
>>
>> -dts_logger: DTSLOG = getLogger("DTSRunner")
>> +# dummy defaults to satisfy linters
>> +dts_logger: DTSLOG = None # type: ignore[assignment]
>> result: DTSResult = DTSResult(dts_logger)
>>
>>
>> @@ -30,14 +30,18 @@ def run_all() -> None:
>> global dts_logger
>> global result
>>
>> + # create a regular DTS logger and create a new result with it
>> + dts_logger = getLogger("DTSRunner")
>> + result = DTSResult(dts_logger)
>> +
>> # check the python version of the server that run dts
>> - check_dts_python_version()
>> + _check_dts_python_version()
>>
>> sut_nodes: dict[str, SutNode] = {}
>> tg_nodes: dict[str, TGNode] = {}
>> try:
>> # for all Execution sections
>> - for execution in CONFIGURATION.executions:
>> + for execution in load_config().executions:
>> sut_node = sut_nodes.get(execution.system_under_test_node.name)
>> tg_node = tg_nodes.get(execution.traffic_generator_node.name)
>>
>> @@ -82,6 +86,25 @@ def run_all() -> None:
>> _exit_dts()
>>
>>
>> +def _check_dts_python_version() -> None:
>> + def RED(text: str) -> str:
>> + return f"\u001B[31;1m{str(text)}\u001B[0m"
>> +
>> + if sys.version_info.major < 3 or (
>> + sys.version_info.major == 3 and sys.version_info.minor < 10
>> + ):
>> + print(
>> + RED(
>> + (
>> + "WARNING: DTS execution node's python version is lower than"
>> + "python 3.10, is deprecated and will not work in future releases."
>> + )
>> + ),
>> + file=sys.stderr,
>> + )
>> + print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
>> +
>> +
>> def _run_execution(
>> sut_node: SutNode,
>> tg_node: TGNode,
>> diff --git a/dts/framework/exception.py b/dts/framework/exception.py
>> index 001a5a5496..7489c03570 100644
>> --- a/dts/framework/exception.py
>> +++ b/dts/framework/exception.py
>> @@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
>> Command execution timeout.
>> """
>>
>> - command: str
>> - output: str
>> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
>> + _command: str
>>
>> - def __init__(self, command: str, output: str):
>> - self.command = command
>> - self.output = output
>> + def __init__(self, command: str):
>> + self._command = command
>>
>> def __str__(self) -> str:
>> - return f"TIMEOUT on {self.command}"
>> -
>> - def get_output(self) -> str:
>> - return self.output
>> + return f"TIMEOUT on {self._command}"
>>
>>
>> class SSHConnectionError(DTSError):
>> @@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
>> SSH connection error.
>> """
>>
>> - host: str
>> - errors: list[str]
>> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
>> + _host: str
>> + _errors: list[str]
>>
>> def __init__(self, host: str, errors: list[str] | None = None):
>> - self.host = host
>> - self.errors = [] if errors is None else errors
>> + self._host = host
>> + self._errors = [] if errors is None else errors
>>
>> def __str__(self) -> str:
>> - message = f"Error trying to connect with {self.host}."
>> - if self.errors:
>> - message += f" Errors encountered while retrying: {', '.join(self.errors)}"
>> + message = f"Error trying to connect with {self._host}."
>> + if self._errors:
>> + message += f" Errors encountered while retrying: {', '.join(self._errors)}"
>>
>> return message
>>
>> @@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
>> It can no longer be used.
>> """
>>
>> - host: str
>> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
>> + _host: str
>>
>> def __init__(self, host: str):
>> - self.host = host
>> + self._host = host
>>
>> def __str__(self) -> str:
>> - return f"SSH session with {self.host} has died"
>> + return f"SSH session with {self._host} has died"
>>
>>
>> class ConfigurationError(DTSError):
>> @@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
>> Raised when a command executed on a Node returns a non-zero exit status.
>> """
>>
>> - command: str
>> - command_return_code: int
>> severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
>> + command: str
>> + _command_return_code: int
>>
>> def __init__(self, command: str, command_return_code: int):
>> self.command = command
>> - self.command_return_code = command_return_code
>> + self._command_return_code = command_return_code
>>
>> def __str__(self) -> str:
>> return (
>> f"Command {self.command} returned a non-zero exit code: "
>> - f"{self.command_return_code}"
>> + f"{self._command_return_code}"
>> )
>>
>>
>> @@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
>> Used in test cases to verify the expected behavior.
>> """
>>
>> - value: str
>> severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
>>
>> - def __init__(self, value: str):
>> - self.value = value
>> -
>> - def __str__(self) -> str:
>> - return repr(self.value)
>> -
>
>
> Does this change mean we are no longer providing descriptions for what failing the verification means? I guess there isn't really harm in removing that functionality, but I'm not sure I see the value in removing the extra information either.
>
This shouldn't have any impact on the existing functionality. The
error message is still stored even without the variable (that's the
default behavior of exceptions), and the string representation isn't
used anywhere in the code; even if it were, the only difference would
be that the self.value string is wrapped in quotes. This just removes
unnecessary code, which I didn't want to document, as that would only
be confusing.
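
The default behavior referred to above can be sketched in a few lines.
This uses a hypothetical stand-in class, not the actual DTS code: the
message passed to an exception is kept in the standard .args tuple and
rendered by str() automatically, so the custom attribute and __str__
override were redundant.

```python
class TestCaseVerifyError(Exception):
    """Stand-in for the DTS exception; no custom __init__ or __str__."""


err = TestCaseVerifyError("packet count mismatch")

# The message is preserved in the standard .args tuple.
assert err.args == ("packet count mismatch",)

# str() already yields the message; only repr() adds quotes, which is
# the sole difference the removed __str__ (returning repr(self.value))
# produced.
assert str(err) == "packet count mismatch"
assert repr(err) == "TestCaseVerifyError('packet count mismatch')"
```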
>>
>>
>> class BlockingTestSuiteError(DTSError):
>> - suite_name: str
>> severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
>> + _suite_name: str
>>
>> def __init__(self, suite_name: str) -> None:
>> - self.suite_name = suite_name
>> + self._suite_name = suite_name
>>
>> def __str__(self) -> str:
>> - return f"Blocking suite {self.suite_name} failed."
>> + return f"Blocking suite {self._suite_name} failed."
>> diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
>> index 00b6d1f03a..5e7ddb2b05 100644
>> --- a/dts/framework/remote_session/__init__.py
>> +++ b/dts/framework/remote_session/__init__.py
>> @@ -12,29 +12,24 @@
>>
>> # pylama:ignore=W0611
>>
>> -from framework.config import OS, NodeConfiguration
>> -from framework.exception import ConfigurationError
>> +from framework.config import NodeConfiguration
>> from framework.logger import DTSLOG
>>
>> -from .linux_session import LinuxSession
>> -from .os_session import InteractiveShellType, OSSession
>> -from .remote import (
>> - CommandResult,
>> - InteractiveRemoteSession,
>> - InteractiveShell,
>> - PythonShell,
>> - RemoteSession,
>> - SSHSession,
>> - TestPmdDevice,
>> - TestPmdShell,
>> -)
>> -
>> -
>> -def create_session(
>> +from .interactive_remote_session import InteractiveRemoteSession
>> +from .interactive_shell import InteractiveShell
>> +from .python_shell import PythonShell
>> +from .remote_session import CommandResult, RemoteSession
>> +from .ssh_session import SSHSession
>> +from .testpmd_shell import TestPmdShell
>> +
>> +
>> +def create_remote_session(
>> node_config: NodeConfiguration, name: str, logger: DTSLOG
>> -) -> OSSession:
>> - match node_config.os:
>> - case OS.linux:
>> - return LinuxSession(node_config, name, logger)
>> - case _:
>> - raise ConfigurationError(f"Unsupported OS {node_config.os}")
>> +) -> RemoteSession:
>> + return SSHSession(node_config, name, logger)
>> +
>> +
>> +def create_interactive_session(
>> + node_config: NodeConfiguration, logger: DTSLOG
>> +) -> InteractiveRemoteSession:
>> + return InteractiveRemoteSession(node_config, logger)
>> diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
>> similarity index 100%
>> rename from dts/framework/remote_session/remote/interactive_remote_session.py
>> rename to dts/framework/remote_session/interactive_remote_session.py
>> diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
>> similarity index 100%
>> rename from dts/framework/remote_session/remote/interactive_shell.py
>> rename to dts/framework/remote_session/interactive_shell.py
>> diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
>> similarity index 100%
>> rename from dts/framework/remote_session/remote/python_shell.py
>> rename to dts/framework/remote_session/python_shell.py
>> diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
>> deleted file mode 100644
>> index 06403691a5..0000000000
>> --- a/dts/framework/remote_session/remote/__init__.py
>> +++ /dev/null
>> @@ -1,27 +0,0 @@
>> -# SPDX-License-Identifier: BSD-3-Clause
>> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
>> -# Copyright(c) 2023 University of New Hampshire
>> -
>> -# pylama:ignore=W0611
>> -
>> -from framework.config import NodeConfiguration
>> -from framework.logger import DTSLOG
>> -
>> -from .interactive_remote_session import InteractiveRemoteSession
>> -from .interactive_shell import InteractiveShell
>> -from .python_shell import PythonShell
>> -from .remote_session import CommandResult, RemoteSession
>> -from .ssh_session import SSHSession
>> -from .testpmd_shell import TestPmdDevice, TestPmdShell
>> -
>> -
>> -def create_remote_session(
>> - node_config: NodeConfiguration, name: str, logger: DTSLOG
>> -) -> RemoteSession:
>> - return SSHSession(node_config, name, logger)
>> -
>> -
>> -def create_interactive_session(
>> - node_config: NodeConfiguration, logger: DTSLOG
>> -) -> InteractiveRemoteSession:
>> - return InteractiveRemoteSession(node_config, logger)
>> diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
>> similarity index 100%
>> rename from dts/framework/remote_session/remote/remote_session.py
>> rename to dts/framework/remote_session/remote_session.py
>> diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
>> similarity index 91%
>> rename from dts/framework/remote_session/remote/ssh_session.py
>> rename to dts/framework/remote_session/ssh_session.py
>> index 8d127f1601..cee11d14d6 100644
>> --- a/dts/framework/remote_session/remote/ssh_session.py
>> +++ b/dts/framework/remote_session/ssh_session.py
>> @@ -18,9 +18,7 @@
>> SSHException,
>> )
>>
>> -from framework.config import NodeConfiguration
>> from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
>> -from framework.logger import DTSLOG
>>
>> from .remote_session import CommandResult, RemoteSession
>>
>> @@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
>>
>> session: Connection
>>
>> - def __init__(
>> - self,
>> - node_config: NodeConfiguration,
>> - session_name: str,
>> - logger: DTSLOG,
>> - ):
>> - super(SSHSession, self).__init__(node_config, session_name, logger)
>> -
>> def _connect(self) -> None:
>> errors = []
>> retry_attempts = 10
>> @@ -117,7 +107,7 @@ def _send_command(
>>
>> except CommandTimedOut as e:
>> self._logger.exception(e)
>> - raise SSHTimeoutError(command, e.result.stderr) from e
>> + raise SSHTimeoutError(command) from e
>>
>> return CommandResult(
>> self.name, command, output.stdout, output.stderr, output.return_code
>> diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
>> similarity index 100%
>> rename from dts/framework/remote_session/remote/testpmd_shell.py
>> rename to dts/framework/remote_session/testpmd_shell.py
>> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
>> index cfa39d011b..7f5841d073 100644
>> --- a/dts/framework/settings.py
>> +++ b/dts/framework/settings.py
>> @@ -6,7 +6,7 @@
>> import argparse
>> import os
>> from collections.abc import Callable, Iterable, Sequence
>> -from dataclasses import dataclass
>> +from dataclasses import dataclass, field
>> from pathlib import Path
>> from typing import Any, TypeVar
>>
>> @@ -22,8 +22,8 @@ def __init__(
>> option_strings: Sequence[str],
>> dest: str,
>> nargs: str | int | None = None,
>> - const: str | None = None,
>> - default: str = None,
>> + const: bool | None = None,
>> + default: Any = None,
>> type: Callable[[str], _T | argparse.FileType | None] = None,
>> choices: Iterable[_T] | None = None,
>> required: bool = False,
>> @@ -32,6 +32,12 @@ def __init__(
>> ) -> None:
>> env_var_value = os.environ.get(env_var)
>> default = env_var_value or default
>> + if const is not None:
>> + nargs = 0
>> + default = const if env_var_value else default
>> + type = None
>> + choices = None
>> + metavar = None
>> super(_EnvironmentArgument, self).__init__(
>> option_strings,
>> dest,
>> @@ -52,22 +58,28 @@ def __call__(
>> values: Any,
>> option_string: str = None,
>> ) -> None:
>> - setattr(namespace, self.dest, values)
>> + if self.const is not None:
>> + setattr(namespace, self.dest, self.const)
>> + else:
>> + setattr(namespace, self.dest, values)
>>
>> return _EnvironmentArgument
>>
>>
>> -@dataclass(slots=True, frozen=True)
>> -class _Settings:
>> - config_file_path: str
>> - output_dir: str
>> - timeout: float
>> - verbose: bool
>> - skip_setup: bool
>> - dpdk_tarball_path: Path
>> - compile_timeout: float
>> - test_cases: list
>> - re_run: int
>> +@dataclass(slots=True)
>> +class Settings:
>> + config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
>> + output_dir: str = "output"
>> + timeout: float = 15
>> + verbose: bool = False
>> + skip_setup: bool = False
>> + dpdk_tarball_path: Path | str = "dpdk.tar.xz"
>> + compile_timeout: float = 1200
>> + test_cases: list[str] = field(default_factory=list)
>> + re_run: int = 0
>> +
>> +
>> +SETTINGS: Settings = Settings()
>>
>>
>> def _get_parser() -> argparse.ArgumentParser:
>> @@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
>> parser.add_argument(
>> "--config-file",
>> action=_env_arg("DTS_CFG_FILE"),
>> - default="conf.yaml",
>> + default=SETTINGS.config_file_path,
>> + type=Path,
>> help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
>> "and targets.",
>> )
>> @@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
>> "--output-dir",
>> "--output",
>> action=_env_arg("DTS_OUTPUT_DIR"),
>> - default="output",
>> + default=SETTINGS.output_dir,
>> help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
>> )
>>
>> @@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
>> "-t",
>> "--timeout",
>> action=_env_arg("DTS_TIMEOUT"),
>> - default=15,
>> + default=SETTINGS.timeout,
>> type=float,
>> help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
>> "compiling DPDK.",
>> @@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
>> "-v",
>> "--verbose",
>> action=_env_arg("DTS_VERBOSE"),
>> - default="N",
>> - help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
>> + default=SETTINGS.verbose,
>> + const=True,
>> + help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
>> "to the console.",
>> )
>>
>> @@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
>> "-s",
>> "--skip-setup",
>> action=_env_arg("DTS_SKIP_SETUP"),
>> - default="N",
>> - help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
>> + const=True,
>> + help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
>> )
>>
>> parser.add_argument(
>> @@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
>> "--snapshot",
>> "--git-ref",
>> action=_env_arg("DTS_DPDK_TARBALL"),
>> - default="dpdk.tar.xz",
>> + default=SETTINGS.dpdk_tarball_path,
>> type=Path,
>> help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
>> "tag ID or tree ID to test. To test local changes, first commit them, "
>> @@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
>> parser.add_argument(
>> "--compile-timeout",
>> action=_env_arg("DTS_COMPILE_TIMEOUT"),
>> - default=1200,
>> + default=SETTINGS.compile_timeout,
>> type=float,
>> help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
>> )
>> @@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
>> "--re-run",
>> "--re_run",
>> action=_env_arg("DTS_RERUN"),
>> - default=0,
>> + default=SETTINGS.re_run,
>> type=int,
>> help="[DTS_RERUN] Re-run each test case the specified amount of times "
>> "if a test failure occurs",
>> @@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
>> return parser
>>
>>
>> -def _get_settings() -> _Settings:
>> +def get_settings() -> Settings:
>> parsed_args = _get_parser().parse_args()
>> - return _Settings(
>> + return Settings(
>> config_file_path=parsed_args.config_file,
>> output_dir=parsed_args.output_dir,
>> timeout=parsed_args.timeout,
>> - verbose=(parsed_args.verbose == "Y"),
>> - skip_setup=(parsed_args.skip_setup == "Y"),
>> + verbose=parsed_args.verbose,
>> + skip_setup=parsed_args.skip_setup,
>> dpdk_tarball_path=Path(
>> - DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
>> - )
>> - if not os.path.exists(parsed_args.tarball)
>> - else Path(parsed_args.tarball),
>> + Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
>> + if not os.path.exists(parsed_args.tarball)
>> + else Path(parsed_args.tarball)
>> + ),
>> compile_timeout=parsed_args.compile_timeout,
>> - test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
>> + test_cases=(
>> + parsed_args.test_cases.split(",") if parsed_args.test_cases else []
>> + ),
>> re_run=parsed_args.re_run,
>> )
>> -
>> -
>> -SETTINGS: _Settings = _get_settings()
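
For readers following the settings.py changes above: the pattern is an
argparse.Action factory that pulls a default from an environment
variable and, when `const` is set, turns the option into a flag.
A simplified, hedged sketch (not the actual DTS code; names are
illustrative):

```python
import argparse
import os


def env_arg(env_var: str):
    """Build an argparse.Action that reads its default from env_var."""

    class EnvironmentArgument(argparse.Action):
        def __init__(self, option_strings, dest, const=None, default=None, **kwargs):
            env_value = os.environ.get(env_var)
            if const is not None:
                # Flag-style option: takes no value; setting the env var
                # (to anything) enables it, as does passing the option.
                kwargs["nargs"] = 0
                default = const if env_value else default
            else:
                default = env_value or default
            super().__init__(option_strings, dest, const=const, default=default, **kwargs)

        def __call__(self, parser, namespace, values, option_string=None):
            # For flags, store const; otherwise store the parsed value.
            setattr(namespace, self.dest, self.const if self.const is not None else values)

    return EnvironmentArgument


os.environ.pop("DTS_VERBOSE", None)  # ensure a clean default for the demo
parser = argparse.ArgumentParser()
parser.add_argument("--verbose", action=env_arg("DTS_VERBOSE"), default=False, const=True)
args_off = parser.parse_args([])
args_on = parser.parse_args(["--verbose"])
```

With this shape, `args_off.verbose` is False and `args_on.verbose` is
True, without the old "set DTS_VERBOSE to 'Y'" string comparison.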
>> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
>> index f0fbe80f6f..603e18872c 100644
>> --- a/dts/framework/test_result.py
>> +++ b/dts/framework/test_result.py
>> @@ -254,7 +254,7 @@ def add_build_target(
>> self._inner_results.append(build_target_result)
>> return build_target_result
>>
>> - def add_sut_info(self, sut_info: NodeInfo):
>> + def add_sut_info(self, sut_info: NodeInfo) -> None:
>> self.sut_os_name = sut_info.os_name
>> self.sut_os_version = sut_info.os_version
>> self.sut_kernel_version = sut_info.kernel_version
>> @@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
>> self._inner_results.append(execution_result)
>> return execution_result
>>
>> - def add_error(self, error) -> None:
>> + def add_error(self, error: Exception) -> None:
>> self._errors.append(error)
>>
>> def process(self) -> None:
>> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
>> index 3b890c0451..d53553bf34 100644
>> --- a/dts/framework/test_suite.py
>> +++ b/dts/framework/test_suite.py
>> @@ -11,7 +11,7 @@
>> import re
>> from ipaddress import IPv4Interface, IPv6Interface, ip_interface
>> from types import MethodType
>> -from typing import Union
>> +from typing import Any, Union
>>
>> from scapy.layers.inet import IP # type: ignore[import]
>> from scapy.layers.l2 import Ether # type: ignore[import]
>> @@ -26,8 +26,7 @@
>> from .logger import DTSLOG, getLogger
>> from .settings import SETTINGS
>> from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
>> -from .testbed_model import SutNode, TGNode
>> -from .testbed_model.hw.port import Port, PortLink
>> +from .testbed_model import Port, PortLink, SutNode, TGNode
>> from .utils import get_packet_summaries
>>
>>
>> @@ -453,7 +452,7 @@ def _execute_test_case(
>>
>>
>> def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
>> - def is_test_suite(object) -> bool:
>> + def is_test_suite(object: Any) -> bool:
>> try:
>> if issubclass(object, TestSuite) and object is not TestSuite:
>> return True
>> diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
>> index 5cbb859e47..8ced05653b 100644
>> --- a/dts/framework/testbed_model/__init__.py
>> +++ b/dts/framework/testbed_model/__init__.py
>> @@ -9,15 +9,9 @@
>>
>> # pylama:ignore=W0611
>>
>> -from .hw import (
>> - LogicalCore,
>> - LogicalCoreCount,
>> - LogicalCoreCountFilter,
>> - LogicalCoreList,
>> - LogicalCoreListFilter,
>> - VirtualDevice,
>> - lcore_filter,
>> -)
>> +from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
>> from .node import Node
>> +from .port import Port, PortLink
>> from .sut_node import SutNode
>> from .tg_node import TGNode
>> +from .virtual_device import VirtualDevice
>> diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
>> similarity index 95%
>> rename from dts/framework/testbed_model/hw/cpu.py
>> rename to dts/framework/testbed_model/cpu.py
>> index d1918a12dc..8fe785dfe4 100644
>> --- a/dts/framework/testbed_model/hw/cpu.py
>> +++ b/dts/framework/testbed_model/cpu.py
>> @@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
>> )
>>
>> return filtered_lcores
>> +
>> +
>> +def lcore_filter(
>> + core_list: list[LogicalCore],
>> + filter_specifier: LogicalCoreCount | LogicalCoreList,
>> + ascending: bool,
>> +) -> LogicalCoreFilter:
>> + if isinstance(filter_specifier, LogicalCoreList):
>> + return LogicalCoreListFilter(core_list, filter_specifier, ascending)
>> + elif isinstance(filter_specifier, LogicalCoreCount):
>> + return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
>> + else:
>> + raise ValueError(f"Unsupported filter r{filter_specifier}")
>> diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
>> deleted file mode 100644
>> index 88ccac0b0e..0000000000
>> --- a/dts/framework/testbed_model/hw/__init__.py
>> +++ /dev/null
>> @@ -1,27 +0,0 @@
>> -# SPDX-License-Identifier: BSD-3-Clause
>> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
>> -
>> -# pylama:ignore=W0611
>> -
>> -from .cpu import (
>> - LogicalCore,
>> - LogicalCoreCount,
>> - LogicalCoreCountFilter,
>> - LogicalCoreFilter,
>> - LogicalCoreList,
>> - LogicalCoreListFilter,
>> -)
>> -from .virtual_device import VirtualDevice
>> -
>> -
>> -def lcore_filter(
>> - core_list: list[LogicalCore],
>> - filter_specifier: LogicalCoreCount | LogicalCoreList,
>> - ascending: bool,
>> -) -> LogicalCoreFilter:
>> - if isinstance(filter_specifier, LogicalCoreList):
>> - return LogicalCoreListFilter(core_list, filter_specifier, ascending)
>> - elif isinstance(filter_specifier, LogicalCoreCount):
>> - return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
>> - else:
>> - raise ValueError(f"Unsupported filter r{filter_specifier}")
>> diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
>> similarity index 97%
>> rename from dts/framework/remote_session/linux_session.py
>> rename to dts/framework/testbed_model/linux_session.py
>> index a3f1a6bf3b..f472bb8f0f 100644
>> --- a/dts/framework/remote_session/linux_session.py
>> +++ b/dts/framework/testbed_model/linux_session.py
>> @@ -9,10 +9,10 @@
>> from typing_extensions import NotRequired
>>
>> from framework.exception import RemoteCommandExecutionError
>> -from framework.testbed_model import LogicalCore
>> -from framework.testbed_model.hw.port import Port
>> from framework.utils import expand_range
>>
>> +from .cpu import LogicalCore
>> +from .port import Port
>> from .posix_session import PosixSession
>>
>>
>> @@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
>> lcores.append(LogicalCore(lcore, core, socket, node))
>> return lcores
>>
>> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
>> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
>> return dpdk_prefix
>>
>> def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
>> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
>> index fc01e0bf8e..fa5b143cdd 100644
>> --- a/dts/framework/testbed_model/node.py
>> +++ b/dts/framework/testbed_model/node.py
>> @@ -12,23 +12,26 @@
>> from typing import Any, Callable, Type, Union
>>
>> from framework.config import (
>> + OS,
>> BuildTargetConfiguration,
>> ExecutionConfiguration,
>> NodeConfiguration,
>> )
>> +from framework.exception import ConfigurationError
>> from framework.logger import DTSLOG, getLogger
>> -from framework.remote_session import InteractiveShellType, OSSession, create_session
>> from framework.settings import SETTINGS
>>
>> -from .hw import (
>> +from .cpu import (
>> LogicalCore,
>> LogicalCoreCount,
>> LogicalCoreList,
>> LogicalCoreListFilter,
>> - VirtualDevice,
>> lcore_filter,
>> )
>> -from .hw.port import Port
>> +from .linux_session import LinuxSession
>> +from .os_session import InteractiveShellType, OSSession
>> +from .port import Port
>> +from .virtual_device import VirtualDevice
>>
>>
>> class Node(ABC):
>> @@ -172,9 +175,9 @@ def create_interactive_shell(
>>
>> return self.main_session.create_interactive_shell(
>> shell_cls,
>> - app_args,
>> timeout,
>> privileged,
>> + app_args,
>> )
>>
>> def filter_lcores(
>> @@ -205,7 +208,7 @@ def _get_remote_cpus(self) -> None:
>> self._logger.info("Getting CPU information.")
>> self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
>>
>> - def _setup_hugepages(self):
>> + def _setup_hugepages(self) -> None:
>> """
>> Setup hugepages on the Node. Different architectures can supply different
>> amounts of memory for hugepages and numa-based hugepage allocation may need
>> @@ -249,3 +252,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
>> return lambda *args: None
>> else:
>> return func
>> +
>> +
>> +def create_session(
>> + node_config: NodeConfiguration, name: str, logger: DTSLOG
>> +) -> OSSession:
>> + match node_config.os:
>> + case OS.linux:
>> + return LinuxSession(node_config, name, logger)
>> + case _:
>> + raise ConfigurationError(f"Unsupported OS {node_config.os}")
>> diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
>> similarity index 95%
>> rename from dts/framework/remote_session/os_session.py
>> rename to dts/framework/testbed_model/os_session.py
>> index 8a709eac1c..76e595a518 100644
>> --- a/dts/framework/remote_session/os_session.py
>> +++ b/dts/framework/testbed_model/os_session.py
>> @@ -10,19 +10,19 @@
>>
>> from framework.config import Architecture, NodeConfiguration, NodeInfo
>> from framework.logger import DTSLOG
>> -from framework.remote_session.remote import InteractiveShell
>> -from framework.settings import SETTINGS
>> -from framework.testbed_model import LogicalCore
>> -from framework.testbed_model.hw.port import Port
>> -from framework.utils import MesonArgs
>> -
>> -from .remote import (
>> +from framework.remote_session import (
>> CommandResult,
>> InteractiveRemoteSession,
>> + InteractiveShell,
>> RemoteSession,
>> create_interactive_session,
>> create_remote_session,
>> )
>> +from framework.settings import SETTINGS
>> +from framework.utils import MesonArgs
>> +
>> +from .cpu import LogicalCore
>> +from .port import Port
>>
>> InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
>>
>> @@ -85,9 +85,9 @@ def send_command(
>> def create_interactive_shell(
>> self,
>> shell_cls: Type[InteractiveShellType],
>> - eal_parameters: str,
>> timeout: float,
>> privileged: bool,
>> + app_args: str,
>> ) -> InteractiveShellType:
>> """
>> See "create_interactive_shell" in SutNode
>> @@ -96,7 +96,7 @@ def create_interactive_shell(
>> self.interactive_session.session,
>> self._logger,
>> self._get_privileged_command if privileged else None,
>> - eal_parameters,
>> + app_args,
>> timeout,
>> )
>>
>> @@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
>> """
>>
>> @abstractmethod
>> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
>> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
>> """
>> Try to find DPDK remote dir in remote_dir.
>> """
>> @@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
>> """
>>
>> @abstractmethod
>> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
>> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
>> """
>> Get the DPDK file prefix that will be used when running DPDK apps.
>> """
>> diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
>> similarity index 100%
>> rename from dts/framework/testbed_model/hw/port.py
>> rename to dts/framework/testbed_model/port.py
>> diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
>> similarity index 98%
>> rename from dts/framework/remote_session/posix_session.py
>> rename to dts/framework/testbed_model/posix_session.py
>> index 5da0516e05..1d1d5b1b26 100644
>> --- a/dts/framework/remote_session/posix_session.py
>> +++ b/dts/framework/testbed_model/posix_session.py
>> @@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
>>
>> return ret_opts
>>
>> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
>> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
>> remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
>> result = self.send_command(f"ls -d {remote_guess} | tail -1")
>> return PurePosixPath(result.stdout)
>> @@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
>> for dpdk_runtime_dir in dpdk_runtime_dirs:
>> self.remove_remote_dir(dpdk_runtime_dir)
>>
>> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
>> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
>> return ""
>>
>> def get_compiler_version(self, compiler_name: str) -> str:
>> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
>> index 4161d3a4d5..17deea06e2 100644
>> --- a/dts/framework/testbed_model/sut_node.py
>> +++ b/dts/framework/testbed_model/sut_node.py
>> @@ -15,12 +15,14 @@
>> NodeInfo,
>> SutNodeConfiguration,
>> )
>> -from framework.remote_session import CommandResult, InteractiveShellType, OSSession
>> +from framework.remote_session import CommandResult
>> from framework.settings import SETTINGS
>> from framework.utils import MesonArgs
>>
>> -from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
>> +from .cpu import LogicalCoreCount, LogicalCoreList
>> from .node import Node
>> +from .os_session import InteractiveShellType, OSSession
>> +from .virtual_device import VirtualDevice
>>
>>
>> class EalParameters(object):
>> @@ -307,7 +309,7 @@ def create_eal_parameters(
>> prefix: str = "dpdk",
>> append_prefix_timestamp: bool = True,
>> no_pci: bool = False,
>> - vdevs: list[VirtualDevice] = None,
>> + vdevs: list[VirtualDevice] | None = None,
>> other_eal_param: str = "",
>> ) -> "EalParameters":
>> """
>> diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
>> index 27025cfa31..166eb8430e 100644
>> --- a/dts/framework/testbed_model/tg_node.py
>> +++ b/dts/framework/testbed_model/tg_node.py
>> @@ -16,16 +16,11 @@
>>
>> from scapy.packet import Packet # type: ignore[import]
>>
>> -from framework.config import (
>> - ScapyTrafficGeneratorConfig,
>> - TGNodeConfiguration,
>> - TrafficGeneratorType,
>> -)
>> -from framework.exception import ConfigurationError
>> -
>> -from .capturing_traffic_generator import CapturingTrafficGenerator
>> -from .hw.port import Port
>> +from framework.config import TGNodeConfiguration
>> +
>> from .node import Node
>> +from .port import Port
>> +from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
>>
>>
>> class TGNode(Node):
>> @@ -80,20 +75,3 @@ def close(self) -> None:
>> """Free all resources used by the node"""
>> self.traffic_generator.close()
>> super(TGNode, self).close()
>> -
>> -
>> -def create_traffic_generator(
>> - tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
>> -) -> CapturingTrafficGenerator:
>> - """A factory function for creating traffic generator object from user config."""
>> -
>> - from .scapy import ScapyTrafficGenerator
>> -
>> - match traffic_generator_config.traffic_generator_type:
>> - case TrafficGeneratorType.SCAPY:
>> - return ScapyTrafficGenerator(tg_node, traffic_generator_config)
>> - case _:
>> - raise ConfigurationError(
>> - "Unknown traffic generator: "
>> - f"{traffic_generator_config.traffic_generator_type}"
>> - )
>> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
>> new file mode 100644
>> index 0000000000..11bfa1ee0f
>> --- /dev/null
>> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
>> @@ -0,0 +1,24 @@
>> +# SPDX-License-Identifier: BSD-3-Clause
>> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
>> +
>> +from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
>> +from framework.exception import ConfigurationError
>> +from framework.testbed_model.node import Node
>> +
>> +from .capturing_traffic_generator import CapturingTrafficGenerator
>> +from .scapy import ScapyTrafficGenerator
>> +
>> +
>> +def create_traffic_generator(
>> + tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
>> +) -> CapturingTrafficGenerator:
>> + """A factory function for creating traffic generator object from user config."""
>> +
>> + match traffic_generator_config.traffic_generator_type:
>> + case TrafficGeneratorType.SCAPY:
>> + return ScapyTrafficGenerator(tg_node, traffic_generator_config)
>> + case _:
>> + raise ConfigurationError(
>> + "Unknown traffic generator: "
>> + f"{traffic_generator_config.traffic_generator_type}"
>> + )
>> diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
>> similarity index 96%
>> rename from dts/framework/testbed_model/capturing_traffic_generator.py
>> rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
>> index ab98987f8e..e521211ef0 100644
>> --- a/dts/framework/testbed_model/capturing_traffic_generator.py
>> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
>> @@ -16,9 +16,9 @@
>> from scapy.packet import Packet # type: ignore[import]
>>
>> from framework.settings import SETTINGS
>> +from framework.testbed_model.port import Port
>> from framework.utils import get_packet_summaries
>>
>> -from .hw.port import Port
>> from .traffic_generator import TrafficGenerator
>>
>>
>> @@ -130,7 +130,9 @@ def _send_packets_and_capture(
>> for the specified duration. It must be able to handle no received packets.
>> """
>>
>> - def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
>> + def _write_capture_from_packets(
>> + self, capture_name: str, packets: list[Packet]
>> + ) -> None:
>> file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
>> self._logger.debug(f"Writing packets to {file_name}.")
>> scapy.utils.wrpcap(file_name, packets)
>> diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
>> similarity index 95%
>> rename from dts/framework/testbed_model/scapy.py
>> rename to dts/framework/testbed_model/traffic_generator/scapy.py
>> index af0d4dbb25..51864b6e6b 100644
>> --- a/dts/framework/testbed_model/scapy.py
>> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
>> @@ -24,16 +24,15 @@
>> from scapy.packet import Packet # type: ignore[import]
>>
>> from framework.config import OS, ScapyTrafficGeneratorConfig
>> -from framework.logger import DTSLOG, getLogger
>> from framework.remote_session import PythonShell
>> from framework.settings import SETTINGS
>> +from framework.testbed_model.node import Node
>> +from framework.testbed_model.port import Port
>>
>> from .capturing_traffic_generator import (
>> CapturingTrafficGenerator,
>> _get_default_capture_name,
>> )
>> -from .hw.port import Port
>> -from .tg_node import TGNode
>>
>> """
>> ========= BEGIN RPC FUNCTIONS =========
>> @@ -146,7 +145,7 @@ def quit(self) -> None:
>> self._BaseServer__shutdown_request = True
>> return None
>>
>> - def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
>> + def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
>> """Add a function to the server.
>>
>> This is meant to be executed remotely.
>> @@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
>> session: PythonShell
>> rpc_server_proxy: xmlrpc.client.ServerProxy
>> _config: ScapyTrafficGeneratorConfig
>> - _tg_node: TGNode
>> - _logger: DTSLOG
>> -
>> - def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
>> - self._config = config
>> - self._tg_node = tg_node
>> - self._logger = getLogger(
>> - f"{self._tg_node.name} {self._config.traffic_generator_type}"
>> - )
>> +
>> + def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
>> + super().__init__(tg_node, config)
>>
>> assert (
>> self._tg_node.config.os == OS.linux
>> @@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
>> function_bytes = marshal.dumps(function.__code__)
>> self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
>>
>> - def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
>> + def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
>> # load the source of the function
>> src = inspect.getsource(QuittableXMLRPCServer)
>> # Lines with only whitespace break the repl if in the middle of a function
>> @@ -280,7 +273,7 @@ def _send_packets_and_capture(
>> scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
>> return scapy_packets
>>
>> - def close(self):
>> + def close(self) -> None:
>> try:
>> self.rpc_server_proxy.quit()
>> except ConnectionRefusedError:
>> diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
>> similarity index 80%
>> rename from dts/framework/testbed_model/traffic_generator.py
>> rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
>> index 28c35d3ce4..ea7c3963da 100644
>> --- a/dts/framework/testbed_model/traffic_generator.py
>> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
>> @@ -12,11 +12,12 @@
>>
>> from scapy.packet import Packet # type: ignore[import]
>>
>> -from framework.logger import DTSLOG
>> +from framework.config import TrafficGeneratorConfig
>> +from framework.logger import DTSLOG, getLogger
>> +from framework.testbed_model.node import Node
>> +from framework.testbed_model.port import Port
>> from framework.utils import get_packet_summaries
>>
>> -from .hw.port import Port
>> -
>>
>> class TrafficGenerator(ABC):
>> """The base traffic generator.
>> @@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
>> Defines the few basic methods that each traffic generator must implement.
>> """
>>
>> + _config: TrafficGeneratorConfig
>> + _tg_node: Node
>
>
> Is there a benefit to changing this to be a Node instead of a TGNode? Wouldn't we want the capabilities of the TGNode to be accessible in the TrafficGenerator class?
>
The benefit is that it works :-). If this were TGNode, there would be
circular imports: the tg_node module needs the traffic generator classes
to construct its generator, so the traffic generator module can't import
TGNode back. It's possible this should be done differently, but I
wanted to do as little as possible to make the doc generation work.
Anything more would be out of scope of this patch, I feel.
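To illustrate the layering (module names are from the patch, but this
single-file sketch is only an approximation of the real split): the
traffic generator depends only on the base Node, so imports flow one way,
tg_node -> traffic_generator -> node, and no cycle can form:

```python
class Node:
    """Base node; all the traffic generator actually needs (e.g. a name)."""

    def __init__(self, name: str) -> None:
        self.name = name


class TrafficGenerator:
    """Depends only on Node, so its module never has to import tg_node."""

    def __init__(self, tg_node: Node) -> None:
        self._tg_node = tg_node


class TGNode(Node):
    """Subclass that owns a TrafficGenerator; the dependency stays one-way."""

    def __init__(self, name: str) -> None:
        super().__init__(name)
        self.traffic_generator = TrafficGenerator(self)


tg = TGNode("tg0")
print(tg.traffic_generator._tg_node.name)  # tg0
```

At runtime the generator still receives a TGNode instance; only the
declared type (and hence the import) is widened to Node.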
>>
>> _logger: DTSLOG
>>
>> + def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
>> + self._config = config
>> + self._tg_node = tg_node
>> + self._logger = getLogger(
>> + f"{self._tg_node.name} {self._config.traffic_generator_type}"
>> + )
>> +
>> def send_packet(self, packet: Packet, port: Port) -> None:
>> """Send a packet and block until it is fully sent.
>>
>> diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
>> similarity index 100%
>> rename from dts/framework/testbed_model/hw/virtual_device.py
>> rename to dts/framework/testbed_model/virtual_device.py
>> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
>> index d27c2c5b5f..f0c916471c 100644
>> --- a/dts/framework/utils.py
>> +++ b/dts/framework/utils.py
>> @@ -7,7 +7,6 @@
>> import json
>> import os
>> import subprocess
>> -import sys
>> from enum import Enum
>> from pathlib import Path
>> from subprocess import SubprocessError
>> @@ -16,35 +15,7 @@
>>
>> from .exception import ConfigurationError
>>
>> -
>> -class StrEnum(Enum):
>> - @staticmethod
>> - def _generate_next_value_(
>> - name: str, start: int, count: int, last_values: object
>> - ) -> str:
>> - return name
>> -
>> - def __str__(self) -> str:
>> - return self.name
>> -
>> -
>> -REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
>> -
>> -
>> -def check_dts_python_version() -> None:
>> - if sys.version_info.major < 3 or (
>> - sys.version_info.major == 3 and sys.version_info.minor < 10
>> - ):
>> - print(
>> - RED(
>> - (
>> - "WARNING: DTS execution node's python version is lower than"
>> - "python 3.10, is deprecated and will not work in future releases."
>> - )
>> - ),
>> - file=sys.stderr,
>> - )
>> - print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
>> +REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
>>
>>
>> def expand_range(range_str: str) -> list[int]:
>> @@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
>> return expanded_range
>>
>>
>> -def get_packet_summaries(packets: list[Packet]):
>> +def get_packet_summaries(packets: list[Packet]) -> str:
>> if len(packets) == 1:
>> packet_summaries = packets[0].summary()
>> else:
>> @@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
>> return f"Packet contents: \n{packet_summaries}"
>>
>>
>> -def RED(text: str) -> str:
>> - return f"\u001B[31;1m{str(text)}\u001B[0m"
>> +class StrEnum(Enum):
>> + @staticmethod
>> + def _generate_next_value_(
>> + name: str, start: int, count: int, last_values: object
>> + ) -> str:
>> + return name
>> +
>> + def __str__(self) -> str:
>> + return self.name
>>
>>
>> class MesonArgs(object):
>> @@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
>> if self._tarball_path and os.path.exists(self._tarball_path):
>> os.remove(self._tarball_path)
>>
>> - def __fspath__(self):
>> + def __fspath__(self) -> str:
>> return str(self._tarball_path)
>> diff --git a/dts/main.py b/dts/main.py
>> index 43311fa847..5d4714b0c3 100755
>> --- a/dts/main.py
>> +++ b/dts/main.py
>> @@ -10,10 +10,17 @@
>>
>> import logging
>>
>> -from framework import dts
>> +from framework import settings
>>
>>
>> def main() -> None:
>> + """Set DTS settings, then run DTS.
>> +
>> + The DTS settings are taken from the command line arguments and the environment variables.
>> + """
>> + settings.SETTINGS = settings.get_settings()
>> + from framework import dts
>> +
>> dts.run_all()
>>
>>
>> --
>> 2.34.1
>>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 01/21] dts: code adjustments for doc generation
2023-11-15 13:09 ` [PATCH v7 01/21] dts: code adjustments for doc generation Juraj Linkeš
2023-11-16 21:04 ` Jeremy Spewock
@ 2023-11-20 16:02 ` Yoan Picchi
1 sibling, 0 replies; 393+ messages in thread
From: Yoan Picchi @ 2023-11-20 16:02 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> The standard Python tool for generating API documentation, Sphinx,
> imports modules one-by-one when generating the documentation. This
> requires code changes:
> * properly guarding argument parsing in the if __name__ == '__main__'
> block,
> * the logger used by DTS runner underwent the same treatment so that it
> doesn't create log files outside of a DTS run,
> * however, DTS uses the arguments to construct an object holding global
> variables. The defaults for the global variables needed to be moved
> from argument parsing elsewhere,
> * importing the remote_session module from framework resulted in
> circular imports because of one module trying to import another
> module. This is fixed by reorganizing the code,
> * some code reorganization was done because the resulting structure
> makes more sense, improving documentation clarity.
>
> There are some other documentation-related changes:
> * added missing type annotations so they appear in the generated docs,
> * reordered arguments in some methods,
> * removed superfluous arguments and attributes,
> * changed some functions/methods/attributes from public to private and vice-versa.
>
> All the above appear in the generated documentation and, with them,
> the documentation is improved.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/config/__init__.py | 10 ++-
> dts/framework/dts.py | 33 +++++--
> dts/framework/exception.py | 54 +++++-------
> dts/framework/remote_session/__init__.py | 41 ++++-----
> .../interactive_remote_session.py | 0
> .../{remote => }/interactive_shell.py | 0
> .../{remote => }/python_shell.py | 0
> .../remote_session/remote/__init__.py | 27 ------
> .../{remote => }/remote_session.py | 0
> .../{remote => }/ssh_session.py | 12 +--
> .../{remote => }/testpmd_shell.py | 0
> dts/framework/settings.py | 87 +++++++++++--------
> dts/framework/test_result.py | 4 +-
> dts/framework/test_suite.py | 7 +-
> dts/framework/testbed_model/__init__.py | 12 +--
> dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
> dts/framework/testbed_model/hw/__init__.py | 27 ------
> .../linux_session.py | 6 +-
> dts/framework/testbed_model/node.py | 25 ++++--
> .../os_session.py | 22 ++---
> dts/framework/testbed_model/{hw => }/port.py | 0
> .../posix_session.py | 4 +-
> dts/framework/testbed_model/sut_node.py | 8 +-
> dts/framework/testbed_model/tg_node.py | 30 +------
> .../traffic_generator/__init__.py | 24 +++++
> .../capturing_traffic_generator.py | 6 +-
> .../{ => traffic_generator}/scapy.py | 23 ++---
> .../traffic_generator.py | 16 +++-
> .../testbed_model/{hw => }/virtual_device.py | 0
> dts/framework/utils.py | 46 +++-------
> dts/main.py | 9 +-
> 31 files changed, 258 insertions(+), 288 deletions(-)
> rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
> rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
> rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
> delete mode 100644 dts/framework/remote_session/remote/__init__.py
> rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
> rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
> rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
> rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
> delete mode 100644 dts/framework/testbed_model/hw/__init__.py
> rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
> rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
> rename dts/framework/testbed_model/{hw => }/port.py (100%)
> rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
> create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
> rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
> rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
> rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
> rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
>
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index cb7e00ba34..2044c82611 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -17,6 +17,7 @@
> import warlock # type: ignore[import]
> import yaml
>
> +from framework.exception import ConfigurationError
> from framework.settings import SETTINGS
> from framework.utils import StrEnum
>
> @@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
> traffic_generator_type: TrafficGeneratorType
>
> @staticmethod
> - def from_dict(d: dict):
> + def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
> # This looks useless now, but is designed to allow expansion to traffic
> # generators that require more configuration later.
> match TrafficGeneratorType(d["type"]):
> @@ -97,6 +98,10 @@ def from_dict(d: dict):
> return ScapyTrafficGeneratorConfig(
> traffic_generator_type=TrafficGeneratorType.SCAPY
> )
> + case _:
> + raise ConfigurationError(
> + f'Unknown traffic generator type "{d["type"]}".'
> + )
>
>
> @dataclass(slots=True, frozen=True)
> @@ -324,6 +329,3 @@ def load_config() -> Configuration:
> config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
> config_obj: Configuration = Configuration.from_dict(dict(config))
> return config_obj
> -
> -
> -CONFIGURATION = load_config()
> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> index f773f0c38d..4c7fb0c40a 100644
> --- a/dts/framework/dts.py
> +++ b/dts/framework/dts.py
> @@ -6,19 +6,19 @@
> import sys
>
> from .config import (
> - CONFIGURATION,
> BuildTargetConfiguration,
> ExecutionConfiguration,
> TestSuiteConfig,
> + load_config,
> )
> from .exception import BlockingTestSuiteError
> from .logger import DTSLOG, getLogger
> from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
> from .test_suite import get_test_suites
> from .testbed_model import SutNode, TGNode
> -from .utils import check_dts_python_version
>
> -dts_logger: DTSLOG = getLogger("DTSRunner")
> +# dummy defaults to satisfy linters
> +dts_logger: DTSLOG = None # type: ignore[assignment]
> result: DTSResult = DTSResult(dts_logger)
>
>
> @@ -30,14 +30,18 @@ def run_all() -> None:
> global dts_logger
> global result
>
> + # create a regular DTS logger and create a new result with it
> + dts_logger = getLogger("DTSRunner")
> + result = DTSResult(dts_logger)
> +
> # check the python version of the server that run dts
> - check_dts_python_version()
> + _check_dts_python_version()
>
> sut_nodes: dict[str, SutNode] = {}
> tg_nodes: dict[str, TGNode] = {}
> try:
> # for all Execution sections
> - for execution in CONFIGURATION.executions:
> + for execution in load_config().executions:
> sut_node = sut_nodes.get(execution.system_under_test_node.name)
> tg_node = tg_nodes.get(execution.traffic_generator_node.name)
>
> @@ -82,6 +86,25 @@ def run_all() -> None:
> _exit_dts()
>
>
> +def _check_dts_python_version() -> None:
> + def RED(text: str) -> str:
> + return f"\u001B[31;1m{str(text)}\u001B[0m"
> +
> + if sys.version_info.major < 3 or (
> + sys.version_info.major == 3 and sys.version_info.minor < 10
> + ):
> + print(
> + RED(
> + (
> + "WARNING: DTS execution node's python version is lower than"
> + "python 3.10, is deprecated and will not work in future releases."
> + )
> + ),
> + file=sys.stderr,
> + )
> + print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> +
> +
> def _run_execution(
> sut_node: SutNode,
> tg_node: TGNode,
> diff --git a/dts/framework/exception.py b/dts/framework/exception.py
> index 001a5a5496..7489c03570 100644
> --- a/dts/framework/exception.py
> +++ b/dts/framework/exception.py
> @@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
> Command execution timeout.
> """
>
> - command: str
> - output: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _command: str
>
> - def __init__(self, command: str, output: str):
> - self.command = command
> - self.output = output
> + def __init__(self, command: str):
> + self._command = command
>
> def __str__(self) -> str:
> - return f"TIMEOUT on {self.command}"
> -
> - def get_output(self) -> str:
> - return self.output
> + return f"TIMEOUT on {self._command}"
>
>
> class SSHConnectionError(DTSError):
> @@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
> SSH connection error.
> """
>
> - host: str
> - errors: list[str]
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _host: str
> + _errors: list[str]
>
> def __init__(self, host: str, errors: list[str] | None = None):
> - self.host = host
> - self.errors = [] if errors is None else errors
> + self._host = host
> + self._errors = [] if errors is None else errors
>
> def __str__(self) -> str:
> - message = f"Error trying to connect with {self.host}."
> - if self.errors:
> - message += f" Errors encountered while retrying: {', '.join(self.errors)}"
> + message = f"Error trying to connect with {self._host}."
> + if self._errors:
> + message += f" Errors encountered while retrying: {', '.join(self._errors)}"
>
> return message
>
> @@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
> It can no longer be used.
> """
>
> - host: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> + _host: str
>
> def __init__(self, host: str):
> - self.host = host
> + self._host = host
>
> def __str__(self) -> str:
> - return f"SSH session with {self.host} has died"
> + return f"SSH session with {self._host} has died"
>
>
> class ConfigurationError(DTSError):
> @@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
> Raised when a command executed on a Node returns a non-zero exit status.
> """
>
> - command: str
> - command_return_code: int
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
> + command: str
> + _command_return_code: int
>
> def __init__(self, command: str, command_return_code: int):
> self.command = command
> - self.command_return_code = command_return_code
> + self._command_return_code = command_return_code
>
> def __str__(self) -> str:
> return (
> f"Command {self.command} returned a non-zero exit code: "
> - f"{self.command_return_code}"
> + f"{self._command_return_code}"
> )
>
>
> @@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
> Used in test cases to verify the expected behavior.
> """
>
> - value: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
>
> - def __init__(self, value: str):
> - self.value = value
> -
> - def __str__(self) -> str:
> - return repr(self.value)
> -
>
> class BlockingTestSuiteError(DTSError):
> - suite_name: str
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
> + _suite_name: str
>
> def __init__(self, suite_name: str) -> None:
> - self.suite_name = suite_name
> + self._suite_name = suite_name
>
> def __str__(self) -> str:
> - return f"Blocking suite {self.suite_name} failed."
> + return f"Blocking suite {self._suite_name} failed."
> diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
> index 00b6d1f03a..5e7ddb2b05 100644
> --- a/dts/framework/remote_session/__init__.py
> +++ b/dts/framework/remote_session/__init__.py
> @@ -12,29 +12,24 @@
>
> # pylama:ignore=W0611
>
> -from framework.config import OS, NodeConfiguration
> -from framework.exception import ConfigurationError
> +from framework.config import NodeConfiguration
> from framework.logger import DTSLOG
>
> -from .linux_session import LinuxSession
> -from .os_session import InteractiveShellType, OSSession
> -from .remote import (
> - CommandResult,
> - InteractiveRemoteSession,
> - InteractiveShell,
> - PythonShell,
> - RemoteSession,
> - SSHSession,
> - TestPmdDevice,
> - TestPmdShell,
> -)
> -
> -
> -def create_session(
> +from .interactive_remote_session import InteractiveRemoteSession
> +from .interactive_shell import InteractiveShell
> +from .python_shell import PythonShell
> +from .remote_session import CommandResult, RemoteSession
> +from .ssh_session import SSHSession
> +from .testpmd_shell import TestPmdShell
> +
> +
> +def create_remote_session(
> node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> OSSession:
> - match node_config.os:
> - case OS.linux:
> - return LinuxSession(node_config, name, logger)
> - case _:
> - raise ConfigurationError(f"Unsupported OS {node_config.os}")
> +) -> RemoteSession:
> + return SSHSession(node_config, name, logger)
> +
> +
> +def create_interactive_session(
> + node_config: NodeConfiguration, logger: DTSLOG
> +) -> InteractiveRemoteSession:
> + return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/interactive_remote_session.py
> rename to dts/framework/remote_session/interactive_remote_session.py
> diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/interactive_shell.py
> rename to dts/framework/remote_session/interactive_shell.py
> diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/python_shell.py
> rename to dts/framework/remote_session/python_shell.py
> diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
> deleted file mode 100644
> index 06403691a5..0000000000
> --- a/dts/framework/remote_session/remote/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -# Copyright(c) 2023 University of New Hampshire
> -
> -# pylama:ignore=W0611
> -
> -from framework.config import NodeConfiguration
> -from framework.logger import DTSLOG
> -
> -from .interactive_remote_session import InteractiveRemoteSession
> -from .interactive_shell import InteractiveShell
> -from .python_shell import PythonShell
> -from .remote_session import CommandResult, RemoteSession
> -from .ssh_session import SSHSession
> -from .testpmd_shell import TestPmdDevice, TestPmdShell
> -
> -
> -def create_remote_session(
> - node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> RemoteSession:
> - return SSHSession(node_config, name, logger)
> -
> -
> -def create_interactive_session(
> - node_config: NodeConfiguration, logger: DTSLOG
> -) -> InteractiveRemoteSession:
> - return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/remote_session.py
> rename to dts/framework/remote_session/remote_session.py
> diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
> similarity index 91%
> rename from dts/framework/remote_session/remote/ssh_session.py
> rename to dts/framework/remote_session/ssh_session.py
> index 8d127f1601..cee11d14d6 100644
> --- a/dts/framework/remote_session/remote/ssh_session.py
> +++ b/dts/framework/remote_session/ssh_session.py
> @@ -18,9 +18,7 @@
> SSHException,
> )
>
> -from framework.config import NodeConfiguration
> from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
> -from framework.logger import DTSLOG
>
> from .remote_session import CommandResult, RemoteSession
>
> @@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
>
> session: Connection
>
> - def __init__(
> - self,
> - node_config: NodeConfiguration,
> - session_name: str,
> - logger: DTSLOG,
> - ):
> - super(SSHSession, self).__init__(node_config, session_name, logger)
> -
> def _connect(self) -> None:
> errors = []
> retry_attempts = 10
> @@ -117,7 +107,7 @@ def _send_command(
>
> except CommandTimedOut as e:
> self._logger.exception(e)
> - raise SSHTimeoutError(command, e.result.stderr) from e
> + raise SSHTimeoutError(command) from e
>
> return CommandResult(
> self.name, command, output.stdout, output.stderr, output.return_code
> diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/testpmd_shell.py
> rename to dts/framework/remote_session/testpmd_shell.py
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index cfa39d011b..7f5841d073 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -6,7 +6,7 @@
> import argparse
> import os
> from collections.abc import Callable, Iterable, Sequence
> -from dataclasses import dataclass
> +from dataclasses import dataclass, field
> from pathlib import Path
> from typing import Any, TypeVar
>
> @@ -22,8 +22,8 @@ def __init__(
> option_strings: Sequence[str],
> dest: str,
> nargs: str | int | None = None,
> - const: str | None = None,
> - default: str = None,
> + const: bool | None = None,
> + default: Any = None,
> type: Callable[[str], _T | argparse.FileType | None] = None,
> choices: Iterable[_T] | None = None,
> required: bool = False,
> @@ -32,6 +32,12 @@ def __init__(
> ) -> None:
> env_var_value = os.environ.get(env_var)
> default = env_var_value or default
> + if const is not None:
> + nargs = 0
> + default = const if env_var_value else default
> + type = None
> + choices = None
> + metavar = None
> super(_EnvironmentArgument, self).__init__(
> option_strings,
> dest,
> @@ -52,22 +58,28 @@ def __call__(
> values: Any,
> option_string: str = None,
> ) -> None:
> - setattr(namespace, self.dest, values)
> + if self.const is not None:
> + setattr(namespace, self.dest, self.const)
> + else:
> + setattr(namespace, self.dest, values)
>
> return _EnvironmentArgument
>
>
> -@dataclass(slots=True, frozen=True)
> -class _Settings:
> - config_file_path: str
> - output_dir: str
> - timeout: float
> - verbose: bool
> - skip_setup: bool
> - dpdk_tarball_path: Path
> - compile_timeout: float
> - test_cases: list
> - re_run: int
> +@dataclass(slots=True)
> +class Settings:
> + config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
> + output_dir: str = "output"
> + timeout: float = 15
> + verbose: bool = False
> + skip_setup: bool = False
> + dpdk_tarball_path: Path | str = "dpdk.tar.xz"
> + compile_timeout: float = 1200
> + test_cases: list[str] = field(default_factory=list)
> + re_run: int = 0
> +
> +
> +SETTINGS: Settings = Settings()
>
>
> def _get_parser() -> argparse.ArgumentParser:
> @@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
> parser.add_argument(
> "--config-file",
> action=_env_arg("DTS_CFG_FILE"),
> - default="conf.yaml",
> + default=SETTINGS.config_file_path,
> + type=Path,
> help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
> "and targets.",
> )
> @@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--output-dir",
> "--output",
> action=_env_arg("DTS_OUTPUT_DIR"),
> - default="output",
> + default=SETTINGS.output_dir,
> help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
> )
>
> @@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "-t",
> "--timeout",
> action=_env_arg("DTS_TIMEOUT"),
> - default=15,
> + default=SETTINGS.timeout,
> type=float,
> help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
> "compiling DPDK.",
> @@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
> "-v",
> "--verbose",
> action=_env_arg("DTS_VERBOSE"),
> - default="N",
> - help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
> + default=SETTINGS.verbose,
> + const=True,
> + help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
> "to the console.",
> )
>
> @@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
> "-s",
> "--skip-setup",
> action=_env_arg("DTS_SKIP_SETUP"),
> - default="N",
> - help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
> + const=True,
> + help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
> )
>
> parser.add_argument(
> @@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--snapshot",
> "--git-ref",
> action=_env_arg("DTS_DPDK_TARBALL"),
> - default="dpdk.tar.xz",
> + default=SETTINGS.dpdk_tarball_path,
> type=Path,
> help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
> "tag ID or tree ID to test. To test local changes, first commit them, "
> @@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
> parser.add_argument(
> "--compile-timeout",
> action=_env_arg("DTS_COMPILE_TIMEOUT"),
> - default=1200,
> + default=SETTINGS.compile_timeout,
> type=float,
> help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
> )
> @@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
> "--re-run",
> "--re_run",
> action=_env_arg("DTS_RERUN"),
> - default=0,
> + default=SETTINGS.re_run,
> type=int,
> help="[DTS_RERUN] Re-run each test case the specified amount of times "
> "if a test failure occurs",
> @@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
> return parser
>
>
> -def _get_settings() -> _Settings:
> +def get_settings() -> Settings:
> parsed_args = _get_parser().parse_args()
> - return _Settings(
> + return Settings(
> config_file_path=parsed_args.config_file,
> output_dir=parsed_args.output_dir,
> timeout=parsed_args.timeout,
> - verbose=(parsed_args.verbose == "Y"),
> - skip_setup=(parsed_args.skip_setup == "Y"),
> + verbose=parsed_args.verbose,
> + skip_setup=parsed_args.skip_setup,
> dpdk_tarball_path=Path(
> - DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
> - )
> - if not os.path.exists(parsed_args.tarball)
> - else Path(parsed_args.tarball),
> + Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
> + if not os.path.exists(parsed_args.tarball)
> + else Path(parsed_args.tarball)
> + ),
> compile_timeout=parsed_args.compile_timeout,
> - test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
> + test_cases=(
> + parsed_args.test_cases.split(",") if parsed_args.test_cases else []
> + ),
> re_run=parsed_args.re_run,
> )
> -
> -
> -SETTINGS: _Settings = _get_settings()
> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> index f0fbe80f6f..603e18872c 100644
> --- a/dts/framework/test_result.py
> +++ b/dts/framework/test_result.py
> @@ -254,7 +254,7 @@ def add_build_target(
> self._inner_results.append(build_target_result)
> return build_target_result
>
> - def add_sut_info(self, sut_info: NodeInfo):
> + def add_sut_info(self, sut_info: NodeInfo) -> None:
> self.sut_os_name = sut_info.os_name
> self.sut_os_version = sut_info.os_version
> self.sut_kernel_version = sut_info.kernel_version
> @@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
> self._inner_results.append(execution_result)
> return execution_result
>
> - def add_error(self, error) -> None:
> + def add_error(self, error: Exception) -> None:
> self._errors.append(error)
>
> def process(self) -> None:
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index 3b890c0451..d53553bf34 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -11,7 +11,7 @@
> import re
> from ipaddress import IPv4Interface, IPv6Interface, ip_interface
> from types import MethodType
> -from typing import Union
> +from typing import Any, Union
>
> from scapy.layers.inet import IP # type: ignore[import]
> from scapy.layers.l2 import Ether # type: ignore[import]
> @@ -26,8 +26,7 @@
> from .logger import DTSLOG, getLogger
> from .settings import SETTINGS
> from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
> -from .testbed_model import SutNode, TGNode
> -from .testbed_model.hw.port import Port, PortLink
> +from .testbed_model import Port, PortLink, SutNode, TGNode
> from .utils import get_packet_summaries
>
>
> @@ -453,7 +452,7 @@ def _execute_test_case(
>
>
> def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
> - def is_test_suite(object) -> bool:
> + def is_test_suite(object: Any) -> bool:
> try:
> if issubclass(object, TestSuite) and object is not TestSuite:
> return True
> diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
> index 5cbb859e47..8ced05653b 100644
> --- a/dts/framework/testbed_model/__init__.py
> +++ b/dts/framework/testbed_model/__init__.py
> @@ -9,15 +9,9 @@
>
> # pylama:ignore=W0611
>
> -from .hw import (
> - LogicalCore,
> - LogicalCoreCount,
> - LogicalCoreCountFilter,
> - LogicalCoreList,
> - LogicalCoreListFilter,
> - VirtualDevice,
> - lcore_filter,
> -)
> +from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
> from .node import Node
> +from .port import Port, PortLink
> from .sut_node import SutNode
> from .tg_node import TGNode
> +from .virtual_device import VirtualDevice
> diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
> similarity index 95%
> rename from dts/framework/testbed_model/hw/cpu.py
> rename to dts/framework/testbed_model/cpu.py
> index d1918a12dc..8fe785dfe4 100644
> --- a/dts/framework/testbed_model/hw/cpu.py
> +++ b/dts/framework/testbed_model/cpu.py
> @@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
> )
>
> return filtered_lcores
> +
> +
> +def lcore_filter(
> + core_list: list[LogicalCore],
> + filter_specifier: LogicalCoreCount | LogicalCoreList,
> + ascending: bool,
> +) -> LogicalCoreFilter:
> + if isinstance(filter_specifier, LogicalCoreList):
> + return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> + elif isinstance(filter_specifier, LogicalCoreCount):
> + return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> + else:
> + raise ValueError(f"Unsupported filter {filter_specifier}")
> diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
> deleted file mode 100644
> index 88ccac0b0e..0000000000
> --- a/dts/framework/testbed_model/hw/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -
> -# pylama:ignore=W0611
> -
> -from .cpu import (
> - LogicalCore,
> - LogicalCoreCount,
> - LogicalCoreCountFilter,
> - LogicalCoreFilter,
> - LogicalCoreList,
> - LogicalCoreListFilter,
> -)
> -from .virtual_device import VirtualDevice
> -
> -
> -def lcore_filter(
> - core_list: list[LogicalCore],
> - filter_specifier: LogicalCoreCount | LogicalCoreList,
> - ascending: bool,
> -) -> LogicalCoreFilter:
> - if isinstance(filter_specifier, LogicalCoreList):
> - return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> - elif isinstance(filter_specifier, LogicalCoreCount):
> - return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> - else:
> - raise ValueError(f"Unsupported filter r{filter_specifier}")
> diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
> similarity index 97%
> rename from dts/framework/remote_session/linux_session.py
> rename to dts/framework/testbed_model/linux_session.py
> index a3f1a6bf3b..f472bb8f0f 100644
> --- a/dts/framework/remote_session/linux_session.py
> +++ b/dts/framework/testbed_model/linux_session.py
> @@ -9,10 +9,10 @@
> from typing_extensions import NotRequired
>
> from framework.exception import RemoteCommandExecutionError
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
> from framework.utils import expand_range
>
> +from .cpu import LogicalCore
> +from .port import Port
> from .posix_session import PosixSession
>
>
> @@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> lcores.append(LogicalCore(lcore, core, socket, node))
> return lcores
>
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> return dpdk_prefix
>
> def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index fc01e0bf8e..fa5b143cdd 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -12,23 +12,26 @@
> from typing import Any, Callable, Type, Union
>
> from framework.config import (
> + OS,
> BuildTargetConfiguration,
> ExecutionConfiguration,
> NodeConfiguration,
> )
> +from framework.exception import ConfigurationError
> from framework.logger import DTSLOG, getLogger
> -from framework.remote_session import InteractiveShellType, OSSession, create_session
> from framework.settings import SETTINGS
>
> -from .hw import (
> +from .cpu import (
> LogicalCore,
> LogicalCoreCount,
> LogicalCoreList,
> LogicalCoreListFilter,
> - VirtualDevice,
> lcore_filter,
> )
> -from .hw.port import Port
> +from .linux_session import LinuxSession
> +from .os_session import InteractiveShellType, OSSession
> +from .port import Port
> +from .virtual_device import VirtualDevice
>
>
> class Node(ABC):
> @@ -172,9 +175,9 @@ def create_interactive_shell(
>
> return self.main_session.create_interactive_shell(
> shell_cls,
> - app_args,
> timeout,
> privileged,
> + app_args,
> )
>
> def filter_lcores(
> @@ -205,7 +208,7 @@ def _get_remote_cpus(self) -> None:
> self._logger.info("Getting CPU information.")
> self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
>
> - def _setup_hugepages(self):
> + def _setup_hugepages(self) -> None:
> """
> Setup hugepages on the Node. Different architectures can supply different
> amounts of memory for hugepages and numa-based hugepage allocation may need
> @@ -249,3 +252,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
> return lambda *args: None
> else:
> return func
> +
> +
> +def create_session(
> + node_config: NodeConfiguration, name: str, logger: DTSLOG
> +) -> OSSession:
> + match node_config.os:
> + case OS.linux:
> + return LinuxSession(node_config, name, logger)
> + case _:
> + raise ConfigurationError(f"Unsupported OS {node_config.os}")
> diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
> similarity index 95%
> rename from dts/framework/remote_session/os_session.py
> rename to dts/framework/testbed_model/os_session.py
> index 8a709eac1c..76e595a518 100644
> --- a/dts/framework/remote_session/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -10,19 +10,19 @@
>
> from framework.config import Architecture, NodeConfiguration, NodeInfo
> from framework.logger import DTSLOG
> -from framework.remote_session.remote import InteractiveShell
> -from framework.settings import SETTINGS
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
> -from framework.utils import MesonArgs
> -
> -from .remote import (
> +from framework.remote_session import (
> CommandResult,
> InteractiveRemoteSession,
> + InteractiveShell,
> RemoteSession,
> create_interactive_session,
> create_remote_session,
> )
> +from framework.settings import SETTINGS
> +from framework.utils import MesonArgs
> +
> +from .cpu import LogicalCore
> +from .port import Port
>
> InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
>
> @@ -85,9 +85,9 @@ def send_command(
> def create_interactive_shell(
> self,
> shell_cls: Type[InteractiveShellType],
> - eal_parameters: str,
> timeout: float,
> privileged: bool,
> + app_args: str,
> ) -> InteractiveShellType:
> """
> See "create_interactive_shell" in SutNode
> @@ -96,7 +96,7 @@ def create_interactive_shell(
> self.interactive_session.session,
> self._logger,
> self._get_privileged_command if privileged else None,
> - eal_parameters,
> + app_args,
> timeout,
> )
>
> @@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
> """
>
> @abstractmethod
> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
> """
> Try to find DPDK remote dir in remote_dir.
> """
> @@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> """
>
> @abstractmethod
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> """
> Get the DPDK file prefix that will be used when running DPDK apps.
> """
> diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/port.py
> rename to dts/framework/testbed_model/port.py
> diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
> similarity index 98%
> rename from dts/framework/remote_session/posix_session.py
> rename to dts/framework/testbed_model/posix_session.py
> index 5da0516e05..1d1d5b1b26 100644
> --- a/dts/framework/remote_session/posix_session.py
> +++ b/dts/framework/testbed_model/posix_session.py
> @@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
>
> return ret_opts
>
> - def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
> + def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
> remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
> result = self.send_command(f"ls -d {remote_guess} | tail -1")
> return PurePosixPath(result.stdout)
> @@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
> for dpdk_runtime_dir in dpdk_runtime_dirs:
> self.remove_remote_dir(dpdk_runtime_dir)
>
> - def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
> + def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> return ""
>
> def get_compiler_version(self, compiler_name: str) -> str:
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index 4161d3a4d5..17deea06e2 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -15,12 +15,14 @@
> NodeInfo,
> SutNodeConfiguration,
> )
> -from framework.remote_session import CommandResult, InteractiveShellType, OSSession
> +from framework.remote_session import CommandResult
> from framework.settings import SETTINGS
> from framework.utils import MesonArgs
>
> -from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
> +from .cpu import LogicalCoreCount, LogicalCoreList
> from .node import Node
> +from .os_session import InteractiveShellType, OSSession
> +from .virtual_device import VirtualDevice
>
>
> class EalParameters(object):
> @@ -307,7 +309,7 @@ def create_eal_parameters(
> prefix: str = "dpdk",
> append_prefix_timestamp: bool = True,
> no_pci: bool = False,
> - vdevs: list[VirtualDevice] = None,
> + vdevs: list[VirtualDevice] | None = None,
> other_eal_param: str = "",
> ) -> "EalParameters":
> """
> diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
> index 27025cfa31..166eb8430e 100644
> --- a/dts/framework/testbed_model/tg_node.py
> +++ b/dts/framework/testbed_model/tg_node.py
> @@ -16,16 +16,11 @@
>
> from scapy.packet import Packet # type: ignore[import]
>
> -from framework.config import (
> - ScapyTrafficGeneratorConfig,
> - TGNodeConfiguration,
> - TrafficGeneratorType,
> -)
> -from framework.exception import ConfigurationError
> -
> -from .capturing_traffic_generator import CapturingTrafficGenerator
> -from .hw.port import Port
> +from framework.config import TGNodeConfiguration
> +
> from .node import Node
> +from .port import Port
> +from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
>
>
> class TGNode(Node):
> @@ -80,20 +75,3 @@ def close(self) -> None:
> """Free all resources used by the node"""
> self.traffic_generator.close()
> super(TGNode, self).close()
> -
> -
> -def create_traffic_generator(
> - tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
> -) -> CapturingTrafficGenerator:
> - """A factory function for creating traffic generator object from user config."""
> -
> - from .scapy import ScapyTrafficGenerator
> -
> - match traffic_generator_config.traffic_generator_type:
> - case TrafficGeneratorType.SCAPY:
> - return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> - case _:
> - raise ConfigurationError(
> - "Unknown traffic generator: "
> - f"{traffic_generator_config.traffic_generator_type}"
> - )
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> new file mode 100644
> index 0000000000..11bfa1ee0f
> --- /dev/null
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -0,0 +1,24 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
> +from framework.exception import ConfigurationError
> +from framework.testbed_model.node import Node
> +
> +from .capturing_traffic_generator import CapturingTrafficGenerator
> +from .scapy import ScapyTrafficGenerator
> +
> +
> +def create_traffic_generator(
> + tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
> +) -> CapturingTrafficGenerator:
> + """A factory function for creating traffic generator object from user config."""
> +
> + match traffic_generator_config.traffic_generator_type:
> + case TrafficGeneratorType.SCAPY:
> + return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> + case _:
> + raise ConfigurationError(
> + "Unknown traffic generator: "
> + f"{traffic_generator_config.traffic_generator_type}"
> + )
> diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> similarity index 96%
> rename from dts/framework/testbed_model/capturing_traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> index ab98987f8e..e521211ef0 100644
> --- a/dts/framework/testbed_model/capturing_traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> @@ -16,9 +16,9 @@
> from scapy.packet import Packet # type: ignore[import]
>
> from framework.settings import SETTINGS
> +from framework.testbed_model.port import Port
> from framework.utils import get_packet_summaries
>
> -from .hw.port import Port
> from .traffic_generator import TrafficGenerator
>
>
> @@ -130,7 +130,9 @@ def _send_packets_and_capture(
> for the specified duration. It must be able to handle no received packets.
> """
>
> - def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
> + def _write_capture_from_packets(
> + self, capture_name: str, packets: list[Packet]
> + ) -> None:
> file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
> self._logger.debug(f"Writing packets to {file_name}.")
> scapy.utils.wrpcap(file_name, packets)
> diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
> similarity index 95%
> rename from dts/framework/testbed_model/scapy.py
> rename to dts/framework/testbed_model/traffic_generator/scapy.py
> index af0d4dbb25..51864b6e6b 100644
> --- a/dts/framework/testbed_model/scapy.py
> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> @@ -24,16 +24,15 @@
> from scapy.packet import Packet # type: ignore[import]
>
> from framework.config import OS, ScapyTrafficGeneratorConfig
> -from framework.logger import DTSLOG, getLogger
> from framework.remote_session import PythonShell
> from framework.settings import SETTINGS
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
>
> from .capturing_traffic_generator import (
> CapturingTrafficGenerator,
> _get_default_capture_name,
> )
> -from .hw.port import Port
> -from .tg_node import TGNode
>
> """
> ========= BEGIN RPC FUNCTIONS =========
> @@ -146,7 +145,7 @@ def quit(self) -> None:
> self._BaseServer__shutdown_request = True
> return None
>
> - def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
> + def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
> """Add a function to the server.
>
> This is meant to be executed remotely.
> @@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
> session: PythonShell
> rpc_server_proxy: xmlrpc.client.ServerProxy
> _config: ScapyTrafficGeneratorConfig
> - _tg_node: TGNode
> - _logger: DTSLOG
> -
> - def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
> - self._config = config
> - self._tg_node = tg_node
> - self._logger = getLogger(
> - f"{self._tg_node.name} {self._config.traffic_generator_type}"
> - )
> +
> + def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
> + super().__init__(tg_node, config)
>
> assert (
> self._tg_node.config.os == OS.linux
> @@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
> function_bytes = marshal.dumps(function.__code__)
> self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
>
> - def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
> + def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
> # load the source of the function
> src = inspect.getsource(QuittableXMLRPCServer)
> # Lines with only whitespace break the repl if in the middle of a function
> @@ -280,7 +273,7 @@ def _send_packets_and_capture(
> scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
> return scapy_packets
>
> - def close(self):
> + def close(self) -> None:
> try:
> self.rpc_server_proxy.quit()
> except ConnectionRefusedError:
> diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> similarity index 80%
> rename from dts/framework/testbed_model/traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index 28c35d3ce4..ea7c3963da 100644
> --- a/dts/framework/testbed_model/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -12,11 +12,12 @@
>
> from scapy.packet import Packet # type: ignore[import]
>
> -from framework.logger import DTSLOG
> +from framework.config import TrafficGeneratorConfig
> +from framework.logger import DTSLOG, getLogger
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
> from framework.utils import get_packet_summaries
>
> -from .hw.port import Port
> -
>
> class TrafficGenerator(ABC):
> """The base traffic generator.
> @@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
> Defines the few basic methods that each traffic generator must implement.
> """
>
> + _config: TrafficGeneratorConfig
> + _tg_node: Node
> _logger: DTSLOG
>
> + def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> + self._config = config
> + self._tg_node = tg_node
> + self._logger = getLogger(
> + f"{self._tg_node.name} {self._config.traffic_generator_type}"
> + )
> +
> def send_packet(self, packet: Packet, port: Port) -> None:
> """Send a packet and block until it is fully sent.
>
> diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/virtual_device.py
> rename to dts/framework/testbed_model/virtual_device.py
> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> index d27c2c5b5f..f0c916471c 100644
> --- a/dts/framework/utils.py
> +++ b/dts/framework/utils.py
> @@ -7,7 +7,6 @@
> import json
> import os
> import subprocess
> -import sys
> from enum import Enum
> from pathlib import Path
> from subprocess import SubprocessError
> @@ -16,35 +15,7 @@
>
> from .exception import ConfigurationError
>
> -
> -class StrEnum(Enum):
> - @staticmethod
> - def _generate_next_value_(
> - name: str, start: int, count: int, last_values: object
> - ) -> str:
> - return name
> -
> - def __str__(self) -> str:
> - return self.name
> -
> -
> -REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> -
> -
> -def check_dts_python_version() -> None:
> - if sys.version_info.major < 3 or (
> - sys.version_info.major == 3 and sys.version_info.minor < 10
> - ):
> - print(
> - RED(
> - (
> - "WARNING: DTS execution node's python version is lower than"
> - "python 3.10, is deprecated and will not work in future releases."
> - )
> - ),
> - file=sys.stderr,
> - )
> - print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> +REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
>
>
> def expand_range(range_str: str) -> list[int]:
> @@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
> return expanded_range
>
>
> -def get_packet_summaries(packets: list[Packet]):
> +def get_packet_summaries(packets: list[Packet]) -> str:
> if len(packets) == 1:
> packet_summaries = packets[0].summary()
> else:
> @@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
> return f"Packet contents: \n{packet_summaries}"
>
>
> -def RED(text: str) -> str:
> - return f"\u001B[31;1m{str(text)}\u001B[0m"
> +class StrEnum(Enum):
> + @staticmethod
> + def _generate_next_value_(
> + name: str, start: int, count: int, last_values: object
> + ) -> str:
> + return name
> +
> + def __str__(self) -> str:
> + return self.name
>
>
> class MesonArgs(object):
> @@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
> if self._tarball_path and os.path.exists(self._tarball_path):
> os.remove(self._tarball_path)
>
> - def __fspath__(self):
> + def __fspath__(self) -> str:
> return str(self._tarball_path)
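The relocated `StrEnum` helper overrides `_generate_next_value_`, the hook `auto()` calls to compute a member's value, so that each member's value equals its name. A runnable sketch with an illustrative subclass:

```python
from enum import Enum, auto


class StrEnum(Enum):
    @staticmethod
    def _generate_next_value_(
        name: str, start: int, count: int, last_values: object
    ) -> str:
        # auto() calls this hook; returning the member name makes
        # the value equal to the name instead of an integer.
        return name

    def __str__(self) -> str:
        return self.name


class Compiler(StrEnum):  # illustrative subclass
    gcc = auto()
    clang = auto()


print(Compiler.gcc.value, str(Compiler.clang))  # gcc clang
```

Python 3.11 later added `enum.StrEnum` to the standard library, but on 3.10 (which DTS targets) this hand-rolled version is needed.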
> diff --git a/dts/main.py b/dts/main.py
> index 43311fa847..5d4714b0c3 100755
> --- a/dts/main.py
> +++ b/dts/main.py
> @@ -10,10 +10,17 @@
>
> import logging
>
> -from framework import dts
> +from framework import settings
>
>
> def main() -> None:
> + """Set DTS settings, then run DTS.
> +
> + The DTS settings are taken from the command line arguments and the environment variables.
> + """
> + settings.SETTINGS = settings.get_settings()
> + from framework import dts
> +
> dts.run_all()
>
>
Reviewed-by: Yoan Picchi <yoan.picchi@arm.com>
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v7 02/21] dts: add docstring checker
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 01/21] dts: code adjustments for doc generation Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-20 16:03 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 03/21] dts: add basic developer docs Juraj Linkeš
` (19 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Python docstrings are the in-code way to document the code. The
docstring checker of choice is pydocstyle, which we're executing from
Pylama, but the latest versions are not compatible due to [0],
so pin the pydocstyle version to the latest working version.
[0] https://github.com/klen/pylama/issues/232
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 12 ++++++------
dts/pyproject.toml | 6 +++++-
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index f7b3b6d602..a734fa71f0 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -489,20 +489,20 @@ files = [
[[package]]
name = "pydocstyle"
-version = "6.3.0"
+version = "6.1.1"
description = "Python docstring style checker"
optional = false
python-versions = ">=3.6"
files = [
- {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
- {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+ {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
+ {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
]
[package.dependencies]
-snowballstemmer = ">=2.2.0"
+snowballstemmer = "*"
[package.extras]
-toml = ["tomli (>=1.2.3)"]
+toml = ["toml"]
[[package]]
name = "pyflakes"
@@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
+content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6762edfa6b..3943c87c87 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -25,6 +25,7 @@ PyYAML = "^6.0"
types-PyYAML = "^6.0.8"
fabric = "^2.7.1"
scapy = "^2.5.0"
+pydocstyle = "6.1.1"
[tool.poetry.group.dev.dependencies]
mypy = "^0.961"
@@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
[tool.pylama]
-linters = "mccabe,pycodestyle,pyflakes"
+linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
format = "pylint"
max_line_length = 88 # https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html#line-length
+[tool.pylama.linter.pydocstyle]
+convention = "google"
+
[tool.mypy]
python_version = "3.10"
enable_error_code = ["ignore-without-code"]
--
2.34.1
* Re: [PATCH v7 02/21] dts: add docstring checker
2023-11-15 13:09 ` [PATCH v7 02/21] dts: add docstring checker Juraj Linkeš
@ 2023-11-20 16:03 ` Yoan Picchi
0 siblings, 0 replies; 393+ messages in thread
From: Yoan Picchi @ 2023-11-20 16:03 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Python docstrings are the in-code way to document the code. The
> docstring checker of choice is pydocstyle, which we're executing from
> Pylama, but the latest versions are not compatible due to [0],
> so pin the pydocstyle version to the latest working version.
>
> [0] https://github.com/klen/pylama/issues/232
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/poetry.lock | 12 ++++++------
> dts/pyproject.toml | 6 +++++-
> 2 files changed, 11 insertions(+), 7 deletions(-)
>
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index f7b3b6d602..a734fa71f0 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -489,20 +489,20 @@ files = [
>
> [[package]]
> name = "pydocstyle"
> -version = "6.3.0"
> +version = "6.1.1"
> description = "Python docstring style checker"
> optional = false
> python-versions = ">=3.6"
> files = [
> - {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
> - {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
> + {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
> + {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
> ]
>
> [package.dependencies]
> -snowballstemmer = ">=2.2.0"
> +snowballstemmer = "*"
>
> [package.extras]
> -toml = ["tomli (>=1.2.3)"]
> +toml = ["toml"]
>
> [[package]]
> name = "pyflakes"
> @@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
> [metadata]
> lock-version = "2.0"
> python-versions = "^3.10"
> -content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
> +content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 6762edfa6b..3943c87c87 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -25,6 +25,7 @@ PyYAML = "^6.0"
> types-PyYAML = "^6.0.8"
> fabric = "^2.7.1"
> scapy = "^2.5.0"
> +pydocstyle = "6.1.1"
>
> [tool.poetry.group.dev.dependencies]
> mypy = "^0.961"
> @@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
> build-backend = "poetry.core.masonry.api"
>
> [tool.pylama]
> -linters = "mccabe,pycodestyle,pyflakes"
> +linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
> format = "pylint"
> max_line_length = 88 # https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html#line-length
>
> +[tool.pylama.linter.pydocstyle]
> +convention = "google"
> +
> [tool.mypy]
> python_version = "3.10"
> enable_error_code = ["ignore-without-code"]
Reviewed-by: Yoan Picchi <yoan.picchi@arm.com>
* [PATCH v7 03/21] dts: add basic developer docs
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 01/21] dts: code adjustments for doc generation Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 02/21] dts: add docstring checker Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-20 16:03 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 04/21] dts: exceptions docstring update Juraj Linkeš
` (18 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Expand the framework contribution guidelines and add how to document the
code with Python docstrings.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
1 file changed, 73 insertions(+)
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 32c18ee472..cd771a428c 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
The results contain basic statistics of passed/failed test cases and DPDK version.
+Contributing to DTS
+-------------------
+
+There are two areas of contribution: The DTS framework and DTS test suites.
+
+The framework contains the logic needed to run test cases, such as connecting to nodes,
+running DPDK apps and collecting results.
+
+The test cases call APIs from the framework to test their scenarios. Adding test cases may
+require adding code to the framework as well.
+
+
+Framework Coding Guidelines
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When adding code to the DTS framework, pay attention to the rest of the code
+and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
+warnings when some of the basics are not met.
+
+The code must be properly documented with docstrings. The style must conform to
+the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
+See an example of the style
+`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
+For cases which are not covered by the Google style, refer
+to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
+the two style guides, where we deviate or where some additional clarification is helpful:
+
+ * The __init__() methods of classes are documented separately from the docstring of the class
+ itself.
> + * The docstrings of implemented abstract methods should refer to the superclass's definition
+ if there's no deviation.
+ * Instance variables/attributes should be documented in the docstring of the class
+ in the ``Attributes:`` section.
+ * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
+ attributes which result in instance variables/attributes should also be recorded
+ in the ``Attributes:`` section.
+ * Class variables/attributes, on the other hand, should be documented with ``#:`` above
+ the type annotated line. The description may be omitted if the meaning is obvious.
+ * Enum and TypedDict also process attributes in particular ways and should be documented
+ with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
+ * When referencing a parameter of a function or a method in their docstring, don't use
+ any articles and put the parameter into single backticks. This mimics the style of
+ `Python's documentation <https://docs.python.org/3/index.html>`_.
+ * When specifying a value, use double backticks::
+
+ def foo(greet: bool) -> None:
+ """Demonstration of single and double backticks.
+
+ `greet` controls whether ``Hello World`` is printed.
+
+ Args:
+ greet: Whether to print the ``Hello World`` message.
+ """
+ if greet:
+ print("Hello World")
+
+ * The docstring maximum line length is the same as the code maximum line length.
+
+
How To Write a Test Suite
-------------------------
@@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
| These methods don't need to be implemented if there's no need for them in a test suite.
In that case, nothing will happen when they're executed.
+#. **Configuration, traffic and other logic**
+
+ The ``TestSuite`` class contains a variety of methods for anything that
+ a test suite setup, a teardown, or a test case may need to do.
+
+ The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
+ and use the interactive shell instances directly.
+
+ These are the two main ways to call the framework logic in test suites. If there's any
+ functionality or logic missing from the framework, it should be implemented so that
+ the test suites can use one of these two ways.
+
#. **Test case verification**
Test case verification should be done with the ``verify`` method, which records the result.
@@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
and used by the test suite via the ``sut_node`` field.
+.. _dts_dev_tools:
+
DTS Developer Tools
-------------------
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
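The docstring conventions listed in the patch can be seen working together in one short module. The sketch below is illustrative only (the `Protocol`, `PortConfig` and `Node` names are made up for the example, not DTS code) and follows the Google style with the deviations described above: `#:` for class variables and Enum members, an ``Attributes:`` section for instance and dataclass attributes, a separately documented ``__init__()``, single backticks for parameters and double backticks for values:

```python
from dataclasses import dataclass
from enum import Enum


class Protocol(Enum):
    """Transport protocols used in this example.

    Enum members are documented with ``#:`` so the assigned value
    appears in the autogenerated docs.
    """

    #:
    TCP = "tcp"
    #:
    UDP = "udp"


@dataclass
class PortConfig:
    """A hypothetical port configuration.

    Dataclass attributes that become instance attributes are recorded
    in the ``Attributes:`` section.

    Attributes:
        number: The port number.
        protocol: The transport protocol of the port.
    """

    number: int
    protocol: Protocol


class Node:
    """A hypothetical remote node.

    Attributes:
        hostname: The hostname of the node.
    """

    #: The default SSH port. Class variables are documented with ``#:``.
    default_port: int = 22

    hostname: str

    def __init__(self, hostname: str) -> None:
        """Initialize the node.

        `hostname` must be reachable; ``localhost`` is accepted.

        Args:
            hostname: The hostname of the node.
        """
        self.hostname = hostname


node = Node("localhost")
port = PortConfig(80, Protocol.TCP)
print(node.default_port, node.hostname, port.protocol.value)
```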
* Re: [PATCH v7 03/21] dts: add basic developer docs
2023-11-15 13:09 ` [PATCH v7 03/21] dts: add basic developer docs Juraj Linkeš
@ 2023-11-20 16:03 ` Yoan Picchi
0 siblings, 0 replies; 393+ messages in thread
From: Yoan Picchi @ 2023-11-20 16:03 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Expand the framework contribution guidelines and add how to document the
> code with Python docstrings.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
> 1 file changed, 73 insertions(+)
>
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> index 32c18ee472..cd771a428c 100644
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
> The results contain basic statistics of passed/failed test cases and DPDK version.
>
>
> +Contributing to DTS
> +-------------------
> +
> +There are two areas of contribution: The DTS framework and DTS test suites.
> +
> +The framework contains the logic needed to run test cases, such as connecting to nodes,
> +running DPDK apps and collecting results.
> +
> +The test cases call APIs from the framework to test their scenarios. Adding test cases may
> +require adding code to the framework as well.
> +
> +
> +Framework Coding Guidelines
> +~~~~~~~~~~~~~~~~~~~~~~~~~~~
> +
> +When adding code to the DTS framework, pay attention to the rest of the code
> +and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
> +warnings when some of the basics are not met.
> +
> +The code must be properly documented with docstrings. The style must conform to
> +the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
> +See an example of the style
> +`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
> +For cases which are not covered by the Google style, refer
> +to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
> +the two style guides, where we deviate or where some additional clarification is helpful:
> +
> + * The __init__() methods of classes are documented separately from the docstring of the class
> + itself.
> + * The docstrigs of implemented abstract methods should refer to the superclass's definition
> + if there's no deviation.
> + * Instance variables/attributes should be documented in the docstring of the class
> + in the ``Attributes:`` section.
> + * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
> + attributes which result in instance variables/attributes should also be recorded
> + in the ``Attributes:`` section.
> + * Class variables/attributes, on the other hand, should be documented with ``#:`` above
> + the type annotated line. The description may be omitted if the meaning is obvious.
> + * The Enum and TypedDict also process the attributes in particular ways and should be documented
> + with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
> + * When referencing a parameter of a function or a method in their docstring, don't use
> + any articles and put the parameter into single backticks. This mimics the style of
> + `Python's documentation <https://docs.python.org/3/index.html>`_.
> + * When specifying a value, use double backticks::
> +
> + def foo(greet: bool) -> None:
> + """Demonstration of single and double backticks.
> +
> + `greet` controls whether ``Hello World`` is printed.
> +
> + Args:
> + greet: Whether to print the ``Hello World`` message.
> + """
> + if greet:
> + print(f"Hello World")
> +
> + * The docstring maximum line length is the same as the code maximum line length.
> +
> +
> How To Write a Test Suite
> -------------------------
>
> @@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
> | These methods don't need to be implemented if there's no need for them in a test suite.
> In that case, nothing will happen when they're is executed.
>
> +#. **Configuration, traffic and other logic**
> +
> + The ``TestSuite`` class contains a variety of methods for anything that
> + a test suite setup, a teardown, or a test case may need to do.
> +
> + The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
> + and use the interactive shell instances directly.
> +
> + These are the two main ways to call the framework logic in test suites. If there's any
> + functionality or logic missing from the framework, it should be implemented so that
> + the test suites can use one of these two ways.
> +
> #. **Test case verification**
>
> Test case verification should be done with the ``verify`` method, which records the result.
> @@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
> and used by the test suite via the ``sut_node`` field.
>
>
> +.. _dts_dev_tools:
> +
> DTS Developer Tools
> -------------------
>
Reviewed-by: Yoan Picchi <yoan.picchi@arm.com>
* [PATCH v7 04/21] dts: exceptions docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (2 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 03/21] dts: add basic developer docs Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-20 16:22 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 05/21] dts: settings " Juraj Linkeš
` (17 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/__init__.py | 12 ++++-
dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
2 files changed, 83 insertions(+), 35 deletions(-)
diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
index d551ad4bf0..662e6ccad2 100644
--- a/dts/framework/__init__.py
+++ b/dts/framework/__init__.py
@@ -1,3 +1,13 @@
# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+
+"""Libraries and utilities for running DPDK Test Suite (DTS).
+
+The various modules in the DTS framework offer:
+
+* Connections to nodes, both interactive and non-interactive,
+* A straightforward way to add support for different operating systems of remote nodes,
+* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
+* Pre-test suite setup and post-test suite teardown.
+"""
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 7489c03570..ee1562c672 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -3,8 +3,10 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-User-defined exceptions used across the framework.
+"""DTS exceptions.
+
+The exceptions all have different severities expressed as an integer.
+The highest severity of all raised exception is used as the exit code of DTS.
"""
from enum import IntEnum, unique
@@ -13,59 +15,79 @@
@unique
class ErrorSeverity(IntEnum):
- """
- The severity of errors that occur during DTS execution.
+ """The severity of errors that occur during DTS execution.
+
All exceptions are caught and the most severe error is used as return code.
"""
+ #:
NO_ERR = 0
+ #:
GENERIC_ERR = 1
+ #:
CONFIG_ERR = 2
+ #:
REMOTE_CMD_EXEC_ERR = 3
+ #:
SSH_ERR = 4
+ #:
DPDK_BUILD_ERR = 10
+ #:
TESTCASE_VERIFY_ERR = 20
+ #:
BLOCKING_TESTSUITE_ERR = 25
class DTSError(Exception):
- """
- The base exception from which all DTS exceptions are derived.
- Stores error severity.
+ """The base exception from which all DTS exceptions are subclassed.
+
+ Do not use this exception, only use subclassed exceptions.
"""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
class SSHTimeoutError(DTSError):
- """
- Command execution timeout.
- """
+ """The SSH execution of a command timed out."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_command: str
def __init__(self, command: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ command: The executed command.
+ """
self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self._command}"
+ """Add some context to the string representation."""
+ return f"{self._command} execution timed out."
class SSHConnectionError(DTSError):
- """
- SSH connection error.
- """
+ """An unsuccessful SSH connection."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
_errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ host: The hostname to which we're trying to connect.
+ errors: Any errors that occurred during the connection attempt.
+ """
self._host = host
self._errors = [] if errors is None else errors
def __str__(self) -> str:
+ """Include the errors in the string representation."""
message = f"Error trying to connect with {self._host}."
if self._errors:
message += f" Errors encountered while retrying: {', '.join(self._errors)}"
@@ -74,43 +96,53 @@ def __str__(self) -> str:
class SSHSessionDeadError(DTSError):
- """
- SSH session is not alive.
- It can no longer be used.
- """
+ """The SSH session is no longer alive."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
def __init__(self, host: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ host: The hostname of the disconnected node.
+ """
self._host = host
def __str__(self) -> str:
- return f"SSH session with {self._host} has died"
+ """Add some context to the string representation."""
+ return f"SSH session with {self._host} has died."
class ConfigurationError(DTSError):
- """
- Raised when an invalid configuration is encountered.
- """
+ """An invalid configuration."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
class RemoteCommandExecutionError(DTSError):
- """
- Raised when a command executed on a Node returns a non-zero exit status.
- """
+ """An unsuccessful execution of a remote command."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ #: The executed command.
command: str
_command_return_code: int
def __init__(self, command: str, command_return_code: int):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ command: The executed command.
+ command_return_code: The return code of the executed command.
+ """
self.command = command
self._command_return_code = command_return_code
def __str__(self) -> str:
+ """Include both the command and return code in the string representation."""
return (
f"Command {self.command} returned a non-zero exit code: "
f"{self._command_return_code}"
@@ -118,35 +150,41 @@ def __str__(self) -> str:
class RemoteDirectoryExistsError(DTSError):
- """
- Raised when a remote directory to be created already exists.
- """
+ """A directory that exists on a remote node."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
class DPDKBuildError(DTSError):
- """
- Raised when DPDK build fails for any reason.
- """
+ """A DPDK build failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
class TestCaseVerifyError(DTSError):
- """
- Used in test cases to verify the expected behavior.
- """
+ """A test case failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
class BlockingTestSuiteError(DTSError):
+ """A failure in a blocking test suite."""
+
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
_suite_name: str
def __init__(self, suite_name: str) -> None:
+ """Define the meaning of the first argument.
+
+ Args:
+ suite_name: The blocking test suite.
+ """
self._suite_name = suite_name
def __str__(self) -> str:
+ """Add some context to the string representation."""
return f"Blocking suite {self._suite_name} failed."
--
2.34.1
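The severity mechanism the module docstring describes, where each exception carries a class-level severity and the highest severity of all raised exceptions becomes the exit code of DTS, can be sketched with a trimmed-down hierarchy. The `run` helper and the step functions below are illustrative assumptions, not DTS framework code:

```python
from enum import IntEnum, unique
from typing import Callable, ClassVar, Iterable


@unique
class ErrorSeverity(IntEnum):
    """A trimmed-down version of the severities from the patch."""

    NO_ERR = 0
    GENERIC_ERR = 1
    SSH_ERR = 4
    TESTCASE_VERIFY_ERR = 20


class DTSError(Exception):
    """Base class; subclasses override the class-level severity."""

    severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR


class SSHTimeoutError(DTSError):
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR


class TestCaseVerifyError(DTSError):
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR


def run(steps: Iterable[Callable[[], None]]) -> int:
    """Run each step, catching DTS errors; return the highest severity seen."""
    return_code = ErrorSeverity.NO_ERR
    for step in steps:
        try:
            step()
        except DTSError as e:
            return_code = max(return_code, e.severity)
    return int(return_code)


def ok() -> None:
    pass


def ssh_fails() -> None:
    raise SSHTimeoutError("ls")


def verify_fails() -> None:
    raise TestCaseVerifyError("packet count mismatch")


exit_code = run([ok, ssh_fails, verify_fails])
print(exit_code)  # 20: the most severe error wins
```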
* Re: [PATCH v7 04/21] dts: exceptions docstring update
2023-11-15 13:09 ` [PATCH v7 04/21] dts: exceptions docstring update Juraj Linkeš
@ 2023-11-20 16:22 ` Yoan Picchi
2023-11-20 16:35 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-20 16:22 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/__init__.py | 12 ++++-
> dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
> 2 files changed, 83 insertions(+), 35 deletions(-)
>
> diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
> index d551ad4bf0..662e6ccad2 100644
> --- a/dts/framework/__init__.py
> +++ b/dts/framework/__init__.py
> @@ -1,3 +1,13 @@
> # SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2022 PANTHEON.tech s.r.o.
> +# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022 University of New Hampshire
> +
> +"""Libraries and utilities for running DPDK Test Suite (DTS).
> +
> +The various modules in the DTS framework offer:
> +
> +* Connections to nodes, both interactive and non-interactive,
> +* A straightforward way to add support for different operating systems of remote nodes,
> +* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
> +* Pre-test suite setup and post-test suite teardown.
> +"""
> diff --git a/dts/framework/exception.py b/dts/framework/exception.py
> index 7489c03570..ee1562c672 100644
> --- a/dts/framework/exception.py
> +++ b/dts/framework/exception.py
> @@ -3,8 +3,10 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022-2023 University of New Hampshire
>
> -"""
> -User-defined exceptions used across the framework.
> +"""DTS exceptions.
> +
> +The exceptions all have different severities expressed as an integer.
> +The highest severity of all raised exception is used as the exit code of DTS.
all raised exception*s*
> """
>
> from enum import IntEnum, unique
> @@ -13,59 +15,79 @@
>
> @unique
> class ErrorSeverity(IntEnum):
> - """
> - The severity of errors that occur during DTS execution.
> + """The severity of errors that occur during DTS execution.
> +
> All exceptions are caught and the most severe error is used as return code.
> """
>
> + #:
> NO_ERR = 0
> + #:
> GENERIC_ERR = 1
> + #:
> CONFIG_ERR = 2
> + #:
> REMOTE_CMD_EXEC_ERR = 3
> + #:
> SSH_ERR = 4
> + #:
> DPDK_BUILD_ERR = 10
> + #:
> TESTCASE_VERIFY_ERR = 20
> + #:
> BLOCKING_TESTSUITE_ERR = 25
>
>
> class DTSError(Exception):
> - """
> - The base exception from which all DTS exceptions are derived.
> - Stores error severity.
> + """The base exception from which all DTS exceptions are subclassed.
> +
> + Do not use this exception, only use subclassed exceptions.
> """
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
>
>
> class SSHTimeoutError(DTSError):
> - """
> - Command execution timeout.
> - """
> + """The SSH execution of a command timed out."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> _command: str
>
> def __init__(self, command: str):
> + """Define the meaning of the first argument.
> +
> + Args:
> + command: The executed command.
> + """
> self._command = command
>
> def __str__(self) -> str:
> - return f"TIMEOUT on {self._command}"
> + """Add some context to the string representation."""
> + return f"{self._command} execution timed out."
>
>
> class SSHConnectionError(DTSError):
> - """
> - SSH connection error.
> - """
> + """An unsuccessful SSH connection."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> _host: str
> _errors: list[str]
>
> def __init__(self, host: str, errors: list[str] | None = None):
> + """Define the meaning of the first two arguments.
> +
> + Args:
> + host: The hostname to which we're trying to connect.
> + errors: Any errors that occurred during the connection attempt.
> + """
> self._host = host
> self._errors = [] if errors is None else errors
>
> def __str__(self) -> str:
> + """Include the errors in the string representation."""
> message = f"Error trying to connect with {self._host}."
> if self._errors:
> message += f" Errors encountered while retrying: {', '.join(self._errors)}"
> @@ -74,43 +96,53 @@ def __str__(self) -> str:
>
>
> class SSHSessionDeadError(DTSError):
> - """
> - SSH session is not alive.
> - It can no longer be used.
> - """
> + """The SSH session is no longer alive."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> _host: str
>
> def __init__(self, host: str):
> + """Define the meaning of the first argument.
> +
> + Args:
> + host: The hostname of the disconnected node.
> + """
> self._host = host
>
> def __str__(self) -> str:
> - return f"SSH session with {self._host} has died"
> + """Add some context to the string representation."""
> + return f"SSH session with {self._host} has died."
>
>
> class ConfigurationError(DTSError):
> - """
> - Raised when an invalid configuration is encountered.
> - """
> + """An invalid configuration."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
>
>
> class RemoteCommandExecutionError(DTSError):
> - """
> - Raised when a command executed on a Node returns a non-zero exit status.
> - """
> + """An unsuccessful execution of a remote command."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
> + #: The executed command.
> command: str
> _command_return_code: int
>
> def __init__(self, command: str, command_return_code: int):
> + """Define the meaning of the first two arguments.
> +
> + Args:
> + command: The executed command.
> + command_return_code: The return code of the executed command.
> + """
> self.command = command
> self._command_return_code = command_return_code
>
> def __str__(self) -> str:
> + """Include both the command and return code in the string representation."""
> return (
> f"Command {self.command} returned a non-zero exit code: "
> f"{self._command_return_code}"
> @@ -118,35 +150,41 @@ def __str__(self) -> str:
>
>
> class RemoteDirectoryExistsError(DTSError):
> - """
> - Raised when a remote directory to be created already exists.
> - """
> + """A directory that exists on a remote node."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
>
>
> class DPDKBuildError(DTSError):
> - """
> - Raised when DPDK build fails for any reason.
> - """
> + """A DPDK build failure."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
>
>
> class TestCaseVerifyError(DTSError):
> - """
> - Used in test cases to verify the expected behavior.
> - """
> + """A test case failure."""
>
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
>
>
> class BlockingTestSuiteError(DTSError):
> + """A failure in a blocking test suite."""
> +
> + #:
> severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
> _suite_name: str
>
> def __init__(self, suite_name: str) -> None:
> + """Define the meaning of the first argument.
> +
> + Args:
> + suite_name: The blocking test suite.
> + """
> self._suite_name = suite_name
>
> def __str__(self) -> str:
> + """Add some context to the string representation."""
> return f"Blocking suite {self._suite_name} failed."
* Re: [PATCH v7 04/21] dts: exceptions docstring update
2023-11-20 16:22 ` Yoan Picchi
@ 2023-11-20 16:35 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-20 16:35 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Mon, Nov 20, 2023 at 5:22 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/__init__.py | 12 ++++-
> > dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
> > 2 files changed, 83 insertions(+), 35 deletions(-)
> >
> > diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
> > index d551ad4bf0..662e6ccad2 100644
> > --- a/dts/framework/__init__.py
> > +++ b/dts/framework/__init__.py
> > @@ -1,3 +1,13 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > -# Copyright(c) 2022 PANTHEON.tech s.r.o.
> > +# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022 University of New Hampshire
> > +
> > +"""Libraries and utilities for running DPDK Test Suite (DTS).
> > +
> > +The various modules in the DTS framework offer:
> > +
> > +* Connections to nodes, both interactive and non-interactive,
> > +* A straightforward way to add support for different operating systems of remote nodes,
> > +* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
> > +* Pre-test suite setup and post-test suite teardown.
> > +"""
> > diff --git a/dts/framework/exception.py b/dts/framework/exception.py
> > index 7489c03570..ee1562c672 100644
> > --- a/dts/framework/exception.py
> > +++ b/dts/framework/exception.py
> > @@ -3,8 +3,10 @@
> > # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022-2023 University of New Hampshire
> >
> > -"""
> > -User-defined exceptions used across the framework.
> > +"""DTS exceptions.
> > +
> > +The exceptions all have different severities expressed as an integer.
> > +The highest severity of all raised exception is used as the exit code of DTS.
>
> all raised exception*s*
>
Ack, will fix.
> > """
> >
> > from enum import IntEnum, unique
> > @@ -13,59 +15,79 @@
> >
> > @unique
> > class ErrorSeverity(IntEnum):
> > - """
> > - The severity of errors that occur during DTS execution.
> > + """The severity of errors that occur during DTS execution.
> > +
> > All exceptions are caught and the most severe error is used as return code.
> > """
> >
> > + #:
> > NO_ERR = 0
> > + #:
> > GENERIC_ERR = 1
> > + #:
> > CONFIG_ERR = 2
> > + #:
> > REMOTE_CMD_EXEC_ERR = 3
> > + #:
> > SSH_ERR = 4
> > + #:
> > DPDK_BUILD_ERR = 10
> > + #:
> > TESTCASE_VERIFY_ERR = 20
> > + #:
> > BLOCKING_TESTSUITE_ERR = 25
> >
> >
> > class DTSError(Exception):
> > - """
> > - The base exception from which all DTS exceptions are derived.
> > - Stores error severity.
> > + """The base exception from which all DTS exceptions are subclassed.
> > +
> > + Do not use this exception, only use subclassed exceptions.
> > """
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
> >
> >
> > class SSHTimeoutError(DTSError):
> > - """
> > - Command execution timeout.
> > - """
> > + """The SSH execution of a command timed out."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> > _command: str
> >
> > def __init__(self, command: str):
> > + """Define the meaning of the first argument.
> > +
> > + Args:
> > + command: The executed command.
> > + """
> > self._command = command
> >
> > def __str__(self) -> str:
> > - return f"TIMEOUT on {self._command}"
> > + """Add some context to the string representation."""
> > + return f"{self._command} execution timed out."
> >
> >
> > class SSHConnectionError(DTSError):
> > - """
> > - SSH connection error.
> > - """
> > + """An unsuccessful SSH connection."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> > _host: str
> > _errors: list[str]
> >
> > def __init__(self, host: str, errors: list[str] | None = None):
> > + """Define the meaning of the first two arguments.
> > +
> > + Args:
> > + host: The hostname to which we're trying to connect.
> > + errors: Any errors that occurred during the connection attempt.
> > + """
> > self._host = host
> > self._errors = [] if errors is None else errors
> >
> > def __str__(self) -> str:
> > + """Include the errors in the string representation."""
> > message = f"Error trying to connect with {self._host}."
> > if self._errors:
> > message += f" Errors encountered while retrying: {', '.join(self._errors)}"
> > @@ -74,43 +96,53 @@ def __str__(self) -> str:
> >
> >
> > class SSHSessionDeadError(DTSError):
> > - """
> > - SSH session is not alive.
> > - It can no longer be used.
> > - """
> > + """The SSH session is no longer alive."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
> > _host: str
> >
> > def __init__(self, host: str):
> > + """Define the meaning of the first argument.
> > +
> > + Args:
> > + host: The hostname of the disconnected node.
> > + """
> > self._host = host
> >
> > def __str__(self) -> str:
> > - return f"SSH session with {self._host} has died"
> > + """Add some context to the string representation."""
> > + return f"SSH session with {self._host} has died."
> >
> >
> > class ConfigurationError(DTSError):
> > - """
> > - Raised when an invalid configuration is encountered.
> > - """
> > + """An invalid configuration."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
> >
> >
> > class RemoteCommandExecutionError(DTSError):
> > - """
> > - Raised when a command executed on a Node returns a non-zero exit status.
> > - """
> > + """An unsuccessful execution of a remote command."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
> > + #: The executed command.
> > command: str
> > _command_return_code: int
> >
> > def __init__(self, command: str, command_return_code: int):
> > + """Define the meaning of the first two arguments.
> > +
> > + Args:
> > + command: The executed command.
> > + command_return_code: The return code of the executed command.
> > + """
> > self.command = command
> > self._command_return_code = command_return_code
> >
> > def __str__(self) -> str:
> > + """Include both the command and return code in the string representation."""
> > return (
> > f"Command {self.command} returned a non-zero exit code: "
> > f"{self._command_return_code}"
> > @@ -118,35 +150,41 @@ def __str__(self) -> str:
> >
> >
> > class RemoteDirectoryExistsError(DTSError):
> > - """
> > - Raised when a remote directory to be created already exists.
> > - """
> > + """A directory that exists on a remote node."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
> >
> >
> > class DPDKBuildError(DTSError):
> > - """
> > - Raised when DPDK build fails for any reason.
> > - """
> > + """A DPDK build failure."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
> >
> >
> > class TestCaseVerifyError(DTSError):
> > - """
> > - Used in test cases to verify the expected behavior.
> > - """
> > + """A test case failure."""
> >
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
> >
> >
> > class BlockingTestSuiteError(DTSError):
> > + """A failure in a blocking test suite."""
> > +
> > + #:
> > severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
> > _suite_name: str
> >
> > def __init__(self, suite_name: str) -> None:
> > + """Define the meaning of the first argument.
> > +
> > + Args:
> > + suite_name: The blocking test suite.
> > + """
> > self._suite_name = suite_name
> >
> > def __str__(self) -> str:
> > + """Add some context to the string representation."""
> > return f"Blocking suite {self._suite_name} failed."
>
* [PATCH v7 05/21] dts: settings docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (3 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 04/21] dts: exceptions docstring update Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 06/21] dts: logger and utils " Juraj Linkeš
` (16 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/settings.py | 103 +++++++++++++++++++++++++++++++++++++-
1 file changed, 102 insertions(+), 1 deletion(-)
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 7f5841d073..fc7c4e00e8 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -3,6 +3,72 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+"""Environment variables and command line arguments parsing.
+
+This is a simple module utilizing the built-in argparse module to parse command line arguments,
+augment them with values from environment variables and make them available across the framework.
+
+The command line value takes precedence, followed by the environment variable value,
+followed by the default value defined in this module.
+
+The command line arguments along with the supported environment variables are:
+
+.. option:: --config-file
+.. envvar:: DTS_CFG_FILE
+
+ The path to the YAML test run configuration file.
+
+.. option:: --output-dir, --output
+.. envvar:: DTS_OUTPUT_DIR
+
+ The directory where DTS logs and results are saved.
+
+.. option:: --compile-timeout
+.. envvar:: DTS_COMPILE_TIMEOUT
+
+ The timeout for compiling DPDK.
+
+.. option:: -t, --timeout
+.. envvar:: DTS_TIMEOUT
+
+ The timeout for all DTS operations except for compiling DPDK.
+
+.. option:: -v, --verbose
+.. envvar:: DTS_VERBOSE
+
+ Set to any value to enable logging everything to the console.
+
+.. option:: -s, --skip-setup
+.. envvar:: DTS_SKIP_SETUP
+
+ Set to any value to skip building DPDK.
+
+.. option:: --tarball, --snapshot, --git-ref
+.. envvar:: DTS_DPDK_TARBALL
+
+ The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
+
+.. option:: --test-cases
+.. envvar:: DTS_TESTCASES
+
+ A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
+
+.. option:: --re-run, --re_run
+.. envvar:: DTS_RERUN
+
+ Re-run each test case this many times in case of a failure.
+
+The module provides one key module-level variable:
+
+Attributes:
+ SETTINGS: The module level variable storing framework-wide DTS settings.
+
+Typical usage example::
+
+ from framework.settings import SETTINGS
+ foo = SETTINGS.foo
+"""
+
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
@@ -16,6 +82,23 @@
def _env_arg(env_var: str) -> Any:
+ """A helper method augmenting the argparse Action with environment variables.
+
+ If the supplied environment variable is defined, then the default value
+ of the argument is modified. This satisfies the priority order of
+ command line argument > environment variable > default value.
+
+ Arguments with no values (flags) should be defined using the const keyword argument
+ (True or False). When the argument is specified, it will be set to const, if not specified,
+ the default will be stored (possibly modified by the corresponding environment variable).
+
+ Other arguments work the same as default argparse arguments, that is using
+ the default 'store' action.
+
+ Returns:
+ The modified argparse.Action.
+ """
+
class _EnvironmentArgument(argparse.Action):
def __init__(
self,
@@ -68,14 +151,28 @@ def __call__(
@dataclass(slots=True)
class Settings:
+ """Default framework-wide user settings.
+
+ The defaults may be modified at the start of the run.
+ """
+
+ #:
config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ #:
output_dir: str = "output"
+ #:
timeout: float = 15
+ #:
verbose: bool = False
+ #:
skip_setup: bool = False
+ #:
dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ #:
compile_timeout: float = 1200
+ #:
test_cases: list[str] = field(default_factory=list)
+ #:
re_run: int = 0
@@ -169,7 +266,7 @@ def _get_parser() -> argparse.ArgumentParser:
action=_env_arg("DTS_RERUN"),
default=SETTINGS.re_run,
type=int,
- help="[DTS_RERUN] Re-run each test case the specified amount of times "
+ help="[DTS_RERUN] Re-run each test case the specified number of times "
"if a test failure occurs",
)
@@ -177,6 +274,10 @@ def _get_parser() -> argparse.ArgumentParser:
def get_settings() -> Settings:
+ """Create new settings with inputs from the user.
+
+ The inputs are taken from the command line and from environment variables.
+ """
parsed_args = _get_parser().parse_args()
return Settings(
config_file_path=parsed_args.config_file,
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
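The priority order described in the patch above (command line argument > environment variable > default value) maps naturally onto a custom argparse Action. The following is a minimal, self-contained sketch of that pattern, not the actual DTS `_EnvironmentArgument` implementation; the `DTS_OUTPUT_DIR` name is taken from the patch, everything else is illustrative:

```python
import argparse
import os


def _env_arg(env_var: str):
    """Return an argparse Action class whose default can be overridden by env_var.

    Illustrative sketch only; the real DTS action handles flags via `const` too.
    """

    class _EnvironmentArgument(argparse.Action):
        def __init__(self, *args, **kwargs):
            # If the environment variable is set, it replaces the default,
            # preserving: CLI argument > environment variable > default.
            env_value = os.environ.get(env_var)
            if env_value is not None:
                kwargs["default"] = env_value
            super().__init__(*args, **kwargs)

        def __call__(self, parser, namespace, values, option_string=None):
            # A CLI value always wins, since it is stored at parse time.
            setattr(namespace, self.dest, values)

    return _EnvironmentArgument


os.environ["DTS_OUTPUT_DIR"] = "/tmp/dts-logs"
parser = argparse.ArgumentParser()
parser.add_argument("--output-dir", action=_env_arg("DTS_OUTPUT_DIR"), default="output")
args = parser.parse_args([])  # no CLI value given: the env var beats the default
```

Note that the environment variable is read when the argument is registered, so the override happens before parsing and the normal argparse machinery handles the rest.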
* [PATCH v7 06/21] dts: logger and utils docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (4 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 05/21] dts: settings " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-20 16:23 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 07/21] dts: dts runner and main " Juraj Linkeš
` (15 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/logger.py | 72 ++++++++++++++++++++++-----------
dts/framework/utils.py | 88 +++++++++++++++++++++++++++++------------
2 files changed, 113 insertions(+), 47 deletions(-)
diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index bb2991e994..d3eb75a4e4 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -3,9 +3,9 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-DTS logger module with several log level. DTS framework and TestSuite logs
-are saved in different log files.
+"""DTS logger module.
+
+DTS framework and TestSuite logs are saved in different log files.
"""
import logging
@@ -18,19 +18,21 @@
stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
-class LoggerDictType(TypedDict):
- logger: "DTSLOG"
- name: str
- node: str
-
+class DTSLOG(logging.LoggerAdapter):
+ """DTS logger adapter class for framework and testsuites.
-# List for saving all using loggers
-Loggers: list[LoggerDictType] = []
+ The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
+ variable control the verbosity of output. If enabled, all messages will be emitted to the
+ console.
+ The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+ variable modify the directory where the logs will be stored.
-class DTSLOG(logging.LoggerAdapter):
- """
- DTS log class for framework and testsuite.
+ Attributes:
+ node: The additional identifier. Currently unused.
+ sh: The handler which emits logs to console.
+ fh: The handler which emits logs to a file.
+ verbose_fh: Just as fh, but logs with a different, more verbose, format.
"""
_logger: logging.Logger
@@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
verbose_fh: logging.FileHandler
def __init__(self, logger: logging.Logger, node: str = "suite"):
+ """Extend the constructor with additional handlers.
+
+ One handler logs to the console, the other one to a file, with either a regular or verbose
+ format.
+
+ Args:
+ logger: The logger from which to create the logger adapter.
+ node: An additional identifier. Currently unused.
+ """
self._logger = logger
# 1 means log everything, this will be used by file handlers if their level
# is not set
@@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
def logger_exit(self) -> None:
- """
- Remove stream handler and logfile handler.
- """
+ """Remove the stream handler and the logfile handler."""
for handler in (self.sh, self.fh, self.verbose_fh):
handler.flush()
self._logger.removeHandler(handler)
+class _LoggerDictType(TypedDict):
+ logger: DTSLOG
+ name: str
+ node: str
+
+
+# List for saving all loggers in use
+_Loggers: list[_LoggerDictType] = []
+
+
def getLogger(name: str, node: str = "suite") -> DTSLOG:
+ """Get DTS logger adapter identified by name and node.
+
+ An existing logger will be return if one with the exact name and node already exists.
+ A new one will be created and stored otherwise.
+
+ Args:
+ name: The name of the logger.
+ node: An additional identifier for the logger.
+
+ Returns:
+ A logger uniquely identified by both name and node.
"""
- Get logger handler and if there's no handler for specified Node will create one.
- """
- global Loggers
+ global _Loggers
# return saved logger
- logger: LoggerDictType
- for logger in Loggers:
+ logger: _LoggerDictType
+ for logger in _Loggers:
if logger["name"] == name and logger["node"] == node:
return logger["logger"]
# return new logger
dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
- Loggers.append({"logger": dts_logger, "name": name, "node": node})
+ _Loggers.append({"logger": dts_logger, "name": name, "node": node})
return dts_logger
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index f0c916471c..5016e3be10 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -3,6 +3,16 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Various utility classes and functions.
+
+These are used in multiple modules across the framework. They're here because
+they provide some non-specific functionality, greatly simplify imports or just don't
+fit elsewhere.
+
+Attributes:
+ REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
+"""
+
import atexit
import json
import os
@@ -19,12 +29,20 @@
def expand_range(range_str: str) -> list[int]:
- """
- Process range string into a list of integers. There are two possible formats:
- n - a single integer
- n-m - a range of integers
+ """Process `range_str` into a list of integers.
+
+ There are two possible formats of `range_str`:
+
+ * ``n`` - a single integer,
+ * ``n-m`` - a range of integers.
- The returned range includes both n and m. Empty string returns an empty list.
+ The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
+
+ Args:
+ range_str: The range to expand.
+
+ Returns:
+ All the numbers from the range.
"""
expanded_range: list[int] = []
if range_str:
@@ -39,6 +57,14 @@ def expand_range(range_str: str) -> list[int]:
def get_packet_summaries(packets: list[Packet]) -> str:
+ """Format a string summary from `packets`.
+
+ Args:
+ packets: The packets to format.
+
+ Returns:
+ The summary of `packets`.
+ """
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -49,6 +75,8 @@ def get_packet_summaries(packets: list[Packet]) -> str:
class StrEnum(Enum):
+ """Enum with members stored as strings."""
+
@staticmethod
def _generate_next_value_(
name: str, start: int, count: int, last_values: object
@@ -56,22 +84,29 @@ def _generate_next_value_(
return name
def __str__(self) -> str:
+ """The string representation is the name of the member."""
return self.name
class MesonArgs(object):
- """
- Aggregate the arguments needed to build DPDK:
- default_library: Default library type, Meson allows "shared", "static" and "both".
- Defaults to None, in which case the argument won't be used.
- Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
- Do not use -D with them, for example:
- meson_args = MesonArgs(enable_kmods=True).
- """
+ """Aggregate the arguments needed to build DPDK."""
_default_library: str
def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
+ """Initialize the meson arguments.
+
+ Args:
+ default_library: The default library type, Meson supports ``shared``, ``static`` and
+ ``both``. Defaults to :data:`None`, in which case the argument won't be used.
+ dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Example:
+ ::
+
+ meson_args = MesonArgs(enable_kmods=True).
+ """
self._default_library = (
f"--default-library={default_library}" if default_library else ""
)
@@ -83,6 +118,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
)
def __str__(self) -> str:
+ """The actual args."""
return " ".join(f"{self._default_library} {self._dpdk_args}".split())
@@ -104,24 +140,14 @@ class _TarCompressionFormat(StrEnum):
class DPDKGitTarball(object):
- """Create a compressed tarball of DPDK from the repository.
-
- The DPDK version is specified with git object git_ref.
- The tarball will be compressed with _TarCompressionFormat,
- which must be supported by the DTS execution environment.
- The resulting tarball will be put into output_dir.
+ """Compressed tarball of DPDK from the repository.
- The class supports the os.PathLike protocol,
+ The class supports the :class:`os.PathLike` protocol,
which is used to get the Path of the tarball::
from pathlib import Path
tarball = DPDKGitTarball("HEAD", "output")
tarball_path = Path(tarball)
-
- Arguments:
- git_ref: A git commit ID, tag ID or tree ID.
- output_dir: The directory where to put the resulting tarball.
- tar_compression_format: The compression format to use.
"""
_git_ref: str
@@ -136,6 +162,17 @@ def __init__(
output_dir: str,
tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
):
+ """Create the tarball during initialization.
+
+ The DPDK version is specified with `git_ref`. The tarball will be compressed with
+ `tar_compression_format`, which must be supported by the DTS execution environment.
+ The resulting tarball will be put into `output_dir`.
+
+ Args:
+ git_ref: A git commit ID, tag ID or tree ID.
+ output_dir: The directory where to put the resulting tarball.
+ tar_compression_format: The compression format to use.
+ """
self._git_ref = git_ref
self._tar_compression_format = tar_compression_format
@@ -204,4 +241,5 @@ def _delete_tarball(self) -> None:
os.remove(self._tarball_path)
def __fspath__(self) -> str:
+ """The os.PathLike protocol implementation."""
return str(self._tarball_path)
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
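The `expand_range` docstring in the patch above specifies two input formats (``n`` and ``n-m``, inclusive of both ends, with the empty string yielding an empty list). A minimal sketch satisfying that contract, written here for illustration rather than copied from the DTS sources, could be:

```python
def expand_range(range_str: str) -> list[int]:
    """Expand "n" or "n-m" into a list of integers; "" yields an empty list."""
    expanded: list[int] = []
    if range_str:
        # "n" splits into one boundary, "n-m" into two; using the first and
        # last boundary covers both cases, and +1 makes the range inclusive.
        boundaries = range_str.split("-")
        expanded.extend(range(int(boundaries[0]), int(boundaries[-1]) + 1))
    return expanded
```

For example, ``expand_range("0-3")`` returns all four integers from 0 to 3, and ``expand_range("2")`` returns a single-element list.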
* Re: [PATCH v7 06/21] dts: logger and utils docstring update
2023-11-15 13:09 ` [PATCH v7 06/21] dts: logger and utils " Juraj Linkeš
@ 2023-11-20 16:23 ` Yoan Picchi
2023-11-20 16:36 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-20 16:23 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/logger.py | 72 ++++++++++++++++++++++-----------
> dts/framework/utils.py | 88 +++++++++++++++++++++++++++++------------
> 2 files changed, 113 insertions(+), 47 deletions(-)
>
> diff --git a/dts/framework/logger.py b/dts/framework/logger.py
> index bb2991e994..d3eb75a4e4 100644
> --- a/dts/framework/logger.py
> +++ b/dts/framework/logger.py
> @@ -3,9 +3,9 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022-2023 University of New Hampshire
>
> -"""
> -DTS logger module with several log level. DTS framework and TestSuite logs
> -are saved in different log files.
> +"""DTS logger module.
> +
> +DTS framework and TestSuite logs are saved in different log files.
> """
>
> import logging
> @@ -18,19 +18,21 @@
> stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
>
>
> -class LoggerDictType(TypedDict):
> - logger: "DTSLOG"
> - name: str
> - node: str
> -
> +class DTSLOG(logging.LoggerAdapter):
> + """DTS logger adapter class for framework and testsuites.
>
> -# List for saving all using loggers
> -Loggers: list[LoggerDictType] = []
> + The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
> + variable control the verbosity of output. If enabled, all messages will be emitted to the
> + console.
>
> + The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
> + variable modify the directory where the logs will be stored.
>
> -class DTSLOG(logging.LoggerAdapter):
> - """
> - DTS log class for framework and testsuite.
> + Attributes:
> + node: The additional identifier. Currently unused.
> + sh: The handler which emits logs to console.
> + fh: The handler which emits logs to a file.
> + verbose_fh: Just as fh, but logs with a different, more verbose, format.
> """
>
> _logger: logging.Logger
> @@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
> verbose_fh: logging.FileHandler
>
> def __init__(self, logger: logging.Logger, node: str = "suite"):
> + """Extend the constructor with additional handlers.
> +
> + One handler logs to the console, the other one to a file, with either a regular or verbose
> + format.
> +
> + Args:
> + logger: The logger from which to create the logger adapter.
> + node: An additional identifier. Currently unused.
> + """
> self._logger = logger
> # 1 means log everything, this will be used by file handlers if their level
> # is not set
> @@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
> super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
>
> def logger_exit(self) -> None:
> - """
> - Remove stream handler and logfile handler.
> - """
> + """Remove the stream handler and the logfile handler."""
> for handler in (self.sh, self.fh, self.verbose_fh):
> handler.flush()
> self._logger.removeHandler(handler)
>
>
> +class _LoggerDictType(TypedDict):
> + logger: DTSLOG
> + name: str
> + node: str
> +
> +
> +# List for saving all loggers in use
> +_Loggers: list[_LoggerDictType] = []
> +
> +
> def getLogger(name: str, node: str = "suite") -> DTSLOG:
> + """Get DTS logger adapter identified by name and node.
> +
> + An existing logger will be return if one with the exact name and node already exists.
An existing logger will be return*ed*
> + A new one will be created and stored otherwise.
> +
> + Args:
> + name: The name of the logger.
> + node: An additional identifier for the logger.
> +
> + Returns:
> + A logger uniquely identified by both name and node.
> """
> - Get logger handler and if there's no handler for specified Node will create one.
> - """
> - global Loggers
> + global _Loggers
> # return saved logger
> - logger: LoggerDictType
> - for logger in Loggers:
> + logger: _LoggerDictType
> + for logger in _Loggers:
> if logger["name"] == name and logger["node"] == node:
> return logger["logger"]
>
> # return new logger
> dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
> - Loggers.append({"logger": dts_logger, "name": name, "node": node})
> + _Loggers.append({"logger": dts_logger, "name": name, "node": node})
> return dts_logger
> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> index f0c916471c..5016e3be10 100644
> --- a/dts/framework/utils.py
> +++ b/dts/framework/utils.py
> @@ -3,6 +3,16 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022-2023 University of New Hampshire
>
> +"""Various utility classes and functions.
> +
> +These are used in multiple modules across the framework. They're here because
> +they provide some non-specific functionality, greatly simplify imports or just don't
> +fit elsewhere.
> +
> +Attributes:
> + REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
> +"""
> +
> import atexit
> import json
> import os
> @@ -19,12 +29,20 @@
>
>
> def expand_range(range_str: str) -> list[int]:
> - """
> - Process range string into a list of integers. There are two possible formats:
> - n - a single integer
> - n-m - a range of integers
> + """Process `range_str` into a list of integers.
> +
> + There are two possible formats of `range_str`:
> +
> + * ``n`` - a single integer,
> + * ``n-m`` - a range of integers.
>
> - The returned range includes both n and m. Empty string returns an empty list.
> + The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
> +
> + Args:
> + range_str: The range to expand.
> +
> + Returns:
> + All the numbers from the range.
> """
> expanded_range: list[int] = []
> if range_str:
> @@ -39,6 +57,14 @@ def expand_range(range_str: str) -> list[int]:
>
>
> def get_packet_summaries(packets: list[Packet]) -> str:
> + """Format a string summary from `packets`.
> +
> + Args:
> + packets: The packets to format.
> +
> + Returns:
> + The summary of `packets`.
> + """
> if len(packets) == 1:
> packet_summaries = packets[0].summary()
> else:
> @@ -49,6 +75,8 @@ def get_packet_summaries(packets: list[Packet]) -> str:
>
>
> class StrEnum(Enum):
> + """Enum with members stored as strings."""
> +
> @staticmethod
> def _generate_next_value_(
> name: str, start: int, count: int, last_values: object
> @@ -56,22 +84,29 @@ def _generate_next_value_(
> return name
>
> def __str__(self) -> str:
> + """The string representation is the name of the member."""
> return self.name
>
>
> class MesonArgs(object):
> - """
> - Aggregate the arguments needed to build DPDK:
> - default_library: Default library type, Meson allows "shared", "static" and "both".
> - Defaults to None, in which case the argument won't be used.
> - Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
> - Do not use -D with them, for example:
> - meson_args = MesonArgs(enable_kmods=True).
> - """
> + """Aggregate the arguments needed to build DPDK."""
>
> _default_library: str
>
> def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
> + """Initialize the meson arguments.
> +
> + Args:
> + default_library: The default library type, Meson supports ``shared``, ``static`` and
> + ``both``. Defaults to :data:`None`, in which case the argument won't be used.
> + dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
> + Do not use ``-D`` with them.
> +
> + Example:
> + ::
> +
> + meson_args = MesonArgs(enable_kmods=True).
> + """
> self._default_library = (
> f"--default-library={default_library}" if default_library else ""
> )
> @@ -83,6 +118,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
> )
>
> def __str__(self) -> str:
> + """The actual args."""
> return " ".join(f"{self._default_library} {self._dpdk_args}".split())
>
>
> @@ -104,24 +140,14 @@ class _TarCompressionFormat(StrEnum):
>
>
> class DPDKGitTarball(object):
> - """Create a compressed tarball of DPDK from the repository.
> -
> - The DPDK version is specified with git object git_ref.
> - The tarball will be compressed with _TarCompressionFormat,
> - which must be supported by the DTS execution environment.
> - The resulting tarball will be put into output_dir.
> + """Compressed tarball of DPDK from the repository.
>
> - The class supports the os.PathLike protocol,
> + The class supports the :class:`os.PathLike` protocol,
> which is used to get the Path of the tarball::
>
> from pathlib import Path
> tarball = DPDKGitTarball("HEAD", "output")
> tarball_path = Path(tarball)
> -
> - Arguments:
> - git_ref: A git commit ID, tag ID or tree ID.
> - output_dir: The directory where to put the resulting tarball.
> - tar_compression_format: The compression format to use.
> """
>
> _git_ref: str
> @@ -136,6 +162,17 @@ def __init__(
> output_dir: str,
> tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
> ):
> + """Create the tarball during initialization.
> +
> + The DPDK version is specified with `git_ref`. The tarball will be compressed with
> + `tar_compression_format`, which must be supported by the DTS execution environment.
> + The resulting tarball will be put into `output_dir`.
> +
> + Args:
> + git_ref: A git commit ID, tag ID or tree ID.
> + output_dir: The directory where to put the resulting tarball.
> + tar_compression_format: The compression format to use.
> + """
> self._git_ref = git_ref
> self._tar_compression_format = tar_compression_format
>
> @@ -204,4 +241,5 @@ def _delete_tarball(self) -> None:
> os.remove(self._tarball_path)
>
> def __fspath__(self) -> str:
> + """The os.PathLike protocol implementation."""
> return str(self._tarball_path)
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 06/21] dts: logger and utils docstring update
2023-11-20 16:23 ` Yoan Picchi
@ 2023-11-20 16:36 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-20 16:36 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Mon, Nov 20, 2023 at 5:23 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/logger.py | 72 ++++++++++++++++++++++-----------
> > dts/framework/utils.py | 88 +++++++++++++++++++++++++++++------------
> > 2 files changed, 113 insertions(+), 47 deletions(-)
> >
> > diff --git a/dts/framework/logger.py b/dts/framework/logger.py
> > index bb2991e994..d3eb75a4e4 100644
> > --- a/dts/framework/logger.py
> > +++ b/dts/framework/logger.py
> > @@ -3,9 +3,9 @@
> > # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022-2023 University of New Hampshire
> >
> > -"""
> > -DTS logger module with several log level. DTS framework and TestSuite logs
> > -are saved in different log files.
> > +"""DTS logger module.
> > +
> > +DTS framework and TestSuite logs are saved in different log files.
> > """
> >
> > import logging
> > @@ -18,19 +18,21 @@
> > stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
> >
> >
> > -class LoggerDictType(TypedDict):
> > - logger: "DTSLOG"
> > - name: str
> > - node: str
> > -
> > +class DTSLOG(logging.LoggerAdapter):
> > + """DTS logger adapter class for framework and testsuites.
> >
> > -# List for saving all using loggers
> > -Loggers: list[LoggerDictType] = []
> > + The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
> > + variable control the verbosity of output. If enabled, all messages will be emitted to the
> > + console.
> >
> > + The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
> > + variable modify the directory where the logs will be stored.
> >
> > -class DTSLOG(logging.LoggerAdapter):
> > - """
> > - DTS log class for framework and testsuite.
> > + Attributes:
> > + node: The additional identifier. Currently unused.
> > + sh: The handler which emits logs to console.
> > + fh: The handler which emits logs to a file.
> > + verbose_fh: Just as fh, but logs with a different, more verbose, format.
> > """
> >
> > _logger: logging.Logger
> > @@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
> > verbose_fh: logging.FileHandler
> >
> > def __init__(self, logger: logging.Logger, node: str = "suite"):
> > + """Extend the constructor with additional handlers.
> > +
> > + One handler logs to the console, the other one to a file, with either a regular or verbose
> > + format.
> > +
> > + Args:
> > + logger: The logger from which to create the logger adapter.
> > + node: An additional identifier. Currently unused.
> > + """
> > self._logger = logger
> > # 1 means log everything, this will be used by file handlers if their level
> > # is not set
> > @@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
> > super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
> >
> > def logger_exit(self) -> None:
> > - """
> > - Remove stream handler and logfile handler.
> > - """
> > + """Remove the stream handler and the logfile handler."""
> > for handler in (self.sh, self.fh, self.verbose_fh):
> > handler.flush()
> > self._logger.removeHandler(handler)
> >
> >
> > +class _LoggerDictType(TypedDict):
> > + logger: DTSLOG
> > + name: str
> > + node: str
> > +
> > +
> > +# List for saving all loggers in use
> > +_Loggers: list[_LoggerDictType] = []
> > +
> > +
> > def getLogger(name: str, node: str = "suite") -> DTSLOG:
> > + """Get DTS logger adapter identified by name and node.
> > +
> > + An existing logger will be return if one with the exact name and node already exists.
>
> An existing logger will be return*ed*
>
Ack, will fix.
> > + A new one will be created and stored otherwise.
> > +
> > + Args:
> > + name: The name of the logger.
> > + node: An additional identifier for the logger.
> > +
> > + Returns:
> > + A logger uniquely identified by both name and node.
> > """
> > - Get logger handler and if there's no handler for specified Node will create one.
> > - """
> > - global Loggers
> > + global _Loggers
> > # return saved logger
> > - logger: LoggerDictType
> > - for logger in Loggers:
> > + logger: _LoggerDictType
> > + for logger in _Loggers:
> > if logger["name"] == name and logger["node"] == node:
> > return logger["logger"]
> >
> > # return new logger
> > dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
> > - Loggers.append({"logger": dts_logger, "name": name, "node": node})
> > + _Loggers.append({"logger": dts_logger, "name": name, "node": node})
> > return dts_logger
> > diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> > index f0c916471c..5016e3be10 100644
> > --- a/dts/framework/utils.py
> > +++ b/dts/framework/utils.py
> > @@ -3,6 +3,16 @@
> > # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022-2023 University of New Hampshire
> >
> > +"""Various utility classes and functions.
> > +
> > +These are used in multiple modules across the framework. They're here because
> > +they provide some non-specific functionality, greatly simplify imports or just don't
> > +fit elsewhere.
> > +
> > +Attributes:
> > + REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
> > +"""
> > +
> > import atexit
> > import json
> > import os
> > @@ -19,12 +29,20 @@
> >
> >
> > def expand_range(range_str: str) -> list[int]:
> > - """
> > - Process range string into a list of integers. There are two possible formats:
> > - n - a single integer
> > - n-m - a range of integers
> > + """Process `range_str` into a list of integers.
> > +
> > + There are two possible formats of `range_str`:
> > +
> > + * ``n`` - a single integer,
> > + * ``n-m`` - a range of integers.
> >
> > - The returned range includes both n and m. Empty string returns an empty list.
> > + The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
> > +
> > + Args:
> > + range_str: The range to expand.
> > +
> > + Returns:
> > + All the numbers from the range.
> > """
> > expanded_range: list[int] = []
> > if range_str:
> > @@ -39,6 +57,14 @@ def expand_range(range_str: str) -> list[int]:
> >
> >
> > def get_packet_summaries(packets: list[Packet]) -> str:
> > + """Format a string summary from `packets`.
> > +
> > + Args:
> > + packets: The packets to format.
> > +
> > + Returns:
> > + The summary of `packets`.
> > + """
> > if len(packets) == 1:
> > packet_summaries = packets[0].summary()
> > else:
> > @@ -49,6 +75,8 @@ def get_packet_summaries(packets: list[Packet]) -> str:
> >
> >
> > class StrEnum(Enum):
> > + """Enum with members stored as strings."""
> > +
> > @staticmethod
> > def _generate_next_value_(
> > name: str, start: int, count: int, last_values: object
> > @@ -56,22 +84,29 @@ def _generate_next_value_(
> > return name
> >
> > def __str__(self) -> str:
> > + """The string representation is the name of the member."""
> > return self.name
> >
> >
> > class MesonArgs(object):
> > - """
> > - Aggregate the arguments needed to build DPDK:
> > - default_library: Default library type, Meson allows "shared", "static" and "both".
> > - Defaults to None, in which case the argument won't be used.
> > - Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
> > - Do not use -D with them, for example:
> > - meson_args = MesonArgs(enable_kmods=True).
> > - """
> > + """Aggregate the arguments needed to build DPDK."""
> >
> > _default_library: str
> >
> > def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
> > + """Initialize the meson arguments.
> > +
> > + Args:
> > + default_library: The default library type, Meson supports ``shared``, ``static`` and
> > + ``both``. Defaults to :data:`None`, in which case the argument won't be used.
> > + dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
> > + Do not use ``-D`` with them.
> > +
> > + Example:
> > + ::
> > +
> > + meson_args = MesonArgs(enable_kmods=True).
> > + """
> > self._default_library = (
> > f"--default-library={default_library}" if default_library else ""
> > )
> > @@ -83,6 +118,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
> > )
> >
> > def __str__(self) -> str:
> > + """The actual args."""
> > return " ".join(f"{self._default_library} {self._dpdk_args}".split())
> >
> >
> > @@ -104,24 +140,14 @@ class _TarCompressionFormat(StrEnum):
> >
> >
> > class DPDKGitTarball(object):
> > - """Create a compressed tarball of DPDK from the repository.
> > -
> > - The DPDK version is specified with git object git_ref.
> > - The tarball will be compressed with _TarCompressionFormat,
> > - which must be supported by the DTS execution environment.
> > - The resulting tarball will be put into output_dir.
> > + """Compressed tarball of DPDK from the repository.
> >
> > - The class supports the os.PathLike protocol,
> > + The class supports the :class:`os.PathLike` protocol,
> > which is used to get the Path of the tarball::
> >
> > from pathlib import Path
> > tarball = DPDKGitTarball("HEAD", "output")
> > tarball_path = Path(tarball)
> > -
> > - Arguments:
> > - git_ref: A git commit ID, tag ID or tree ID.
> > - output_dir: The directory where to put the resulting tarball.
> > - tar_compression_format: The compression format to use.
> > """
> >
> > _git_ref: str
> > @@ -136,6 +162,17 @@ def __init__(
> > output_dir: str,
> > tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
> > ):
> > + """Create the tarball during initialization.
> > +
> > + The DPDK version is specified with `git_ref`. The tarball will be compressed with
> > + `tar_compression_format`, which must be supported by the DTS execution environment.
> > + The resulting tarball will be put into `output_dir`.
> > +
> > + Args:
> > + git_ref: A git commit ID, tag ID or tree ID.
> > + output_dir: The directory in which to put the resulting tarball.
> > + tar_compression_format: The compression format to use.
> > + """
> > self._git_ref = git_ref
> > self._tar_compression_format = tar_compression_format
> >
> > @@ -204,4 +241,5 @@ def _delete_tarball(self) -> None:
> > os.remove(self._tarball_path)
> >
> > def __fspath__(self) -> str:
> > + """The os.PathLike protocol implementation."""
> > return str(self._tarball_path)
>
^ permalink raw reply [flat|nested] 393+ messages in thread
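The `__fspath__` method in the hunk above is what makes the tarball object usable anywhere a filesystem path is expected. A minimal sketch of the protocol with a hypothetical stand-in class (not the real DPDKGitTarball):

```python
import os
from pathlib import Path


class Tarball:
    """Hypothetical stand-in showing the os.PathLike support from the patch."""

    def __init__(self, path: str) -> None:
        self._tarball_path = path

    def __fspath__(self) -> str:
        # The os.PathLike protocol implementation, as in the patch above.
        return str(self._tarball_path)


tarball = Tarball("dpdk-tarball.tar.xz")
# Anything accepting os.PathLike now works with the object directly.
assert os.fspath(tarball) == "dpdk-tarball.tar.xz"
assert Path(tarball).suffix == ".xz"
```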
* [PATCH v7 07/21] dts: dts runner and main docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (5 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 06/21] dts: logger and utils " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-16 21:51 ` Jeremy Spewock
2023-11-20 17:43 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 08/21] dts: test suite " Juraj Linkeš
` (14 subsequent siblings)
21 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
dts/main.py | 8 ++-
2 files changed, 112 insertions(+), 24 deletions(-)
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 4c7fb0c40a..331fed7dc4 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,33 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+r"""Test suite runner module.
+
+A DTS run is split into stages:
+
+ #. Execution stage,
+ #. Build target stage,
+ #. Test suite stage,
+ #. Test case stage.
+
+The module is responsible for running tests on testbeds defined in the test run configuration.
+Each setup or teardown of each stage is recorded in a :class:`~framework.test_result.DTSResult` or
+one of its subclasses. The test case results are also recorded.
+
+If an error occurs, the current stage is aborted, the error is recorded and the run continues in
+the next iteration of the same stage. The return code is the highest `severity` of all
+:class:`~.framework.exception.DTSError`\s.
+
+Example:
+ An error occurs in a build target setup. The current build target is aborted and the run
+ continues with the next build target. If the errored build target was the last one in the given
+ execution, the next execution begins.
+
+Attributes:
+ dts_logger: The logger instance used in this module.
+ result: The top level result used in the module.
+"""
+
import sys
from .config import (
@@ -23,9 +50,38 @@
def run_all() -> None:
- """
- The main process of DTS. Runs all build targets in all executions from the main
- config file.
+ """Run all build targets in all executions from the test run configuration.
+
+ Before running test suites, executions and build targets are first set up.
+ The executions and build targets defined in the test run configuration are iterated over.
+ The executions define which tests to run and where to run them and build targets define
+ the DPDK build setup.
+
+ The test suites are set up for each execution/build target tuple and each scheduled
+ test case within the test suite is set up, executed and torn down. After all test cases
+ have been executed, the test suite is torn down and the next build target will be tested.
+
+ All the nested steps look like this:
+
+ #. Execution setup
+
+ #. Build target setup
+
+ #. Test suite setup
+
+ #. Test case setup
+ #. Test case logic
+ #. Test case teardown
+
+ #. Test suite teardown
+
+ #. Build target teardown
+
+ #. Execution teardown
+
+ The test cases are filtered according to the specification in the test run configuration and
+ the :option:`--test-cases` command line argument or
+ the :envvar:`DTS_TESTCASES` environment variable.
"""
global dts_logger
global result
@@ -87,6 +143,8 @@ def run_all() -> None:
def _check_dts_python_version() -> None:
+ """Check the required Python version - v3.10."""
+
def RED(text: str) -> str:
return f"\u001B[31;1m{str(text)}\u001B[0m"
@@ -111,9 +169,16 @@ def _run_execution(
execution: ExecutionConfiguration,
result: DTSResult,
) -> None:
- """
- Run the given execution. This involves running the execution setup as well as
- running all build targets in the given execution.
+ """Run the given execution.
+
+ This involves running the execution setup as well as running all build targets
+ in the given execution. After that, execution teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: An execution's test run configuration.
+ result: The top level result object.
"""
dts_logger.info(
f"Running execution with SUT '{execution.system_under_test_node.name}'."
@@ -150,8 +215,18 @@ def _run_build_target(
execution: ExecutionConfiguration,
execution_result: ExecutionResult,
) -> None:
- """
- Run the given build target.
+ """Run the given build target.
+
+ This involves running the build target setup as well as running all test suites
+ in the given execution the build target is defined in.
+ After that, build target teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ build_target: A build target's test run configuration.
+ execution: The build target's execution's test run configuration.
+ execution_result: The execution level result object associated with the execution.
"""
dts_logger.info(f"Running build target '{build_target.name}'.")
build_target_result = execution_result.add_build_target(build_target)
@@ -183,10 +258,17 @@ def _run_all_suites(
execution: ExecutionConfiguration,
build_target_result: BuildTargetResult,
) -> None:
- """
- Use the given build_target to run execution's test suites
- with possibly only a subset of test cases.
- If no subset is specified, run all test cases.
+ """Run the execution's (possibly a subset) test suites using the current build_target.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
"""
end_build_target = False
if not execution.skip_smoke_tests:
@@ -215,16 +297,22 @@ def _run_single_suite(
build_target_result: BuildTargetResult,
test_suite_config: TestSuiteConfig,
) -> None:
- """Runs a single test suite.
+ """Run all test suite in a single test suite module.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
Args:
- sut_node: Node to run tests on.
- execution: Execution the test case belongs to.
- build_target_result: Build target configuration test case is run on
- test_suite_config: Test suite configuration
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
+ test_suite_config: Test suite test run configuration specifying the test suite module
+ and possibly a subset of test cases of test suites in that module.
Raises:
- BlockingTestSuiteError: If a test suite that was marked as blocking fails.
+ BlockingTestSuiteError: If a blocking test suite fails.
"""
try:
full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
@@ -248,9 +336,7 @@ def _run_single_suite(
def _exit_dts() -> None:
- """
- Process all errors and exit with the proper exit code.
- """
+ """Process all errors and exit with the proper exit code."""
result.process()
if dts_logger:
diff --git a/dts/main.py b/dts/main.py
index 5d4714b0c3..f703615d11 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -4,9 +4,7 @@
# Copyright(c) 2022 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
-"""
-A test framework for testing DPDK.
-"""
+"""The DTS executable."""
import logging
@@ -17,6 +15,10 @@ def main() -> None:
"""Set DTS settings, then run DTS.
The DTS settings are taken from the command line arguments and the environment variables.
+ The settings object is stored in the module-level variable settings.SETTINGS which the entire
+ framework uses. After importing the module (or the variable), any changes to the variable are
+ not going to be reflected without a re-import. This means that the SETTINGS variable must
+ be modified before the settings module is imported anywhere else in the framework.
"""
settings.SETTINGS = settings.get_settings()
from framework import dts
--
2.34.1
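The nested setup/teardown ordering enumerated in the run_all() docstring can be pictured as three plain loops. All names and data shapes below are illustrative only, not the actual DTS configuration objects:

```python
def run_all(executions: list[dict]) -> list[str]:
    """Sketch of the stages from run_all(); each level wraps the next in setup/teardown."""
    log = []
    for execution in executions:                          # Execution stage
        log.append(f"setup execution {execution['name']}")
        for build_target in execution["build_targets"]:   # Build target stage
            log.append(f"setup build target {build_target}")
            for suite in execution["test_suites"]:        # Test suite stage
                # Per-case setup/logic/teardown would nest one level deeper here.
                log.append(f"run suite {suite}")
            log.append(f"teardown build target {build_target}")
        log.append(f"teardown execution {execution['name']}")
    return log


log = run_all(
    [{"name": "exec0", "build_targets": ["gcc"], "test_suites": ["hello_world"]}]
)
assert log == [
    "setup execution exec0",
    "setup build target gcc",
    "run suite hello_world",
    "teardown build target gcc",
    "teardown execution exec0",
]
```

An error in any stage would abort only that iteration and continue with the next one, per the module docstring.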
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 07/21] dts: dts runner and main docstring update
2023-11-15 13:09 ` [PATCH v7 07/21] dts: dts runner and main " Juraj Linkeš
@ 2023-11-16 21:51 ` Jeremy Spewock
2023-11-20 16:13 ` Juraj Linkeš
2023-11-20 17:43 ` Yoan Picchi
1 sibling, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-11-16 21:51 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
On Wed, Nov 15, 2023 at 8:11 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
> dts/main.py | 8 ++-
> 2 files changed, 112 insertions(+), 24 deletions(-)
>
> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> index 4c7fb0c40a..331fed7dc4 100644
> --- a/dts/framework/dts.py
> +++ b/dts/framework/dts.py
> @@ -3,6 +3,33 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022-2023 University of New Hampshire
>
> +r"""Test suite runner module.
> +
> +A DTS run is split into stages:
> +
> + #. Execution stage,
> + #. Build target stage,
> + #. Test suite stage,
> + #. Test case stage.
> +
> +The module is responsible for running tests on testbeds defined in the
> test run configuration.
> +Each setup or teardown of each stage is recorded in a
> :class:`~framework.test_result.DTSResult` or
> +one of its subclasses. The test case results are also recorded.
> +
> +If an error occurs, the current stage is aborted, the error is recorded
> and the run continues in
> +the next iteration of the same stage. The return code is the highest
> `severity` of all
> +:class:`~.framework.exception.DTSError`\s.
> +
> +Example:
> + An error occurs in a build target setup. The current build target is
> aborted and the run
> + continues with the next build target. If the errored build target was
> the last one in the given
> + execution, the next execution begins.
> +
> +Attributes:
> + dts_logger: The logger instance used in this module.
> + result: The top level result used in the module.
> +"""
> +
> import sys
>
> from .config import (
> @@ -23,9 +50,38 @@
>
>
> def run_all() -> None:
> - """
> - The main process of DTS. Runs all build targets in all executions
> from the main
> - config file.
> + """Run all build targets in all executions from the test run
> configuration.
> +
> + Before running test suites, executions and build targets are first
> set up.
> + The executions and build targets defined in the test run
> configuration are iterated over.
> + The executions define which tests to run and where to run them and
> build targets define
> + the DPDK build setup.
> +
> > + The test suites are set up for each execution/build target tuple and
> each scheduled
> + test case within the test suite is set up, executed and torn down.
> After all test cases
> + have been executed, the test suite is torn down and the next build
> target will be tested.
> +
> + All the nested steps look like this:
> +
> + #. Execution setup
> +
> + #. Build target setup
> +
> + #. Test suite setup
> +
> + #. Test case setup
> + #. Test case logic
> + #. Test case teardown
> +
> + #. Test suite teardown
> +
> + #. Build target teardown
> +
> + #. Execution teardown
> +
> + The test cases are filtered according to the specification in the
> test run configuration and
> + the :option:`--test-cases` command line argument or
> + the :envvar:`DTS_TESTCASES` environment variable.
> """
> global dts_logger
> global result
> @@ -87,6 +143,8 @@ def run_all() -> None:
>
>
> def _check_dts_python_version() -> None:
> + """Check the required Python version - v3.10."""
> +
> def RED(text: str) -> str:
> return f"\u001B[31;1m{str(text)}\u001B[0m"
>
> @@ -111,9 +169,16 @@ def _run_execution(
> execution: ExecutionConfiguration,
> result: DTSResult,
> ) -> None:
> - """
> - Run the given execution. This involves running the execution setup as
> well as
> - running all build targets in the given execution.
> + """Run the given execution.
> +
> + This involves running the execution setup as well as running all
> build targets
> + in the given execution. After that, execution teardown is run.
> +
> + Args:
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + execution: An execution's test run configuration.
> + result: The top level result object.
> """
> dts_logger.info(
> f"Running execution with SUT '{
> execution.system_under_test_node.name}'."
> @@ -150,8 +215,18 @@ def _run_build_target(
> execution: ExecutionConfiguration,
> execution_result: ExecutionResult,
> ) -> None:
> - """
> - Run the given build target.
> + """Run the given build target.
> +
> + This involves running the build target setup as well as running all
> test suites
> + in the given execution the build target is defined in.
> + After that, build target teardown is run.
> +
> + Args:
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + build_target: A build target's test run configuration.
> + execution: The build target's execution's test run configuration.
> + execution_result: The execution level result object associated
> with the execution.
> """
> dts_logger.info(f"Running build target '{build_target.name}'.")
> build_target_result = execution_result.add_build_target(build_target)
> @@ -183,10 +258,17 @@ def _run_all_suites(
> execution: ExecutionConfiguration,
> build_target_result: BuildTargetResult,
> ) -> None:
> - """
> - Use the given build_target to run execution's test suites
> - with possibly only a subset of test cases.
> - If no subset is specified, run all test cases.
> + """Run the execution's (possibly a subset) test suites using the
> current build_target.
> +
> + The function assumes the build target we're testing has already been
> built on the SUT node.
> + The current build target thus corresponds to the current DPDK build
> present on the SUT node.
> +
> + Args:
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + execution: The execution's test run configuration associated with
> the current build target.
> + build_target_result: The build target level result object
> associated
> + with the current build target.
> """
>
Is it worth mentioning in this method or the _run_build_target method that
when a blocking suite fails, no more suites will be run on that build
target?
> end_build_target = False
> if not execution.skip_smoke_tests:
> @@ -215,16 +297,22 @@ def _run_single_suite(
> build_target_result: BuildTargetResult,
> test_suite_config: TestSuiteConfig,
> ) -> None:
> - """Runs a single test suite.
> + """Run all test suite in a single test suite module.
> +
> + The function assumes the build target we're testing has already been
> built on the SUT node.
> + The current build target thus corresponds to the current DPDK build
> present on the SUT node.
>
> Args:
> - sut_node: Node to run tests on.
> - execution: Execution the test case belongs to.
> - build_target_result: Build target configuration test case is run
> on
> - test_suite_config: Test suite configuration
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + execution: The execution's test run configuration associated with
> the current build target.
> + build_target_result: The build target level result object
> associated
> + with the current build target.
> + test_suite_config: Test suite test run configuration specifying
> the test suite module
> + and possibly a subset of test cases of test suites in that
> module.
>
> Raises:
> - BlockingTestSuiteError: If a test suite that was marked as
> blocking fails.
> + BlockingTestSuiteError: If a blocking test suite fails.
> """
> try:
> full_suite_path =
> f"tests.TestSuite_{test_suite_config.test_suite}"
> @@ -248,9 +336,7 @@ def _run_single_suite(
>
>
> def _exit_dts() -> None:
> - """
> - Process all errors and exit with the proper exit code.
> - """
> + """Process all errors and exit with the proper exit code."""
> result.process()
>
> if dts_logger:
> diff --git a/dts/main.py b/dts/main.py
> index 5d4714b0c3..f703615d11 100755
> --- a/dts/main.py
> +++ b/dts/main.py
> @@ -4,9 +4,7 @@
> # Copyright(c) 2022 PANTHEON.tech s.r.o.
> # Copyright(c) 2022 University of New Hampshire
>
> -"""
> -A test framework for testing DPDK.
> -"""
> +"""The DTS executable."""
>
> import logging
>
> @@ -17,6 +15,10 @@ def main() -> None:
> """Set DTS settings, then run DTS.
>
> The DTS settings are taken from the command line arguments and the
> environment variables.
> + The settings object is stored in the module-level variable
> settings.SETTINGS which the entire
> + framework uses. After importing the module (or the variable), any
> changes to the variable are
> + not going to be reflected without a re-import. This means that the
> SETTINGS variable must
> + be modified before the settings module is imported anywhere else in
> the framework.
> """
> settings.SETTINGS = settings.get_settings()
> from framework import dts
> --
> 2.34.1
>
>
^ permalink raw reply [flat|nested] 393+ messages in thread
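The behaviour Jeremy asks to document above can be sketched in a few lines. `BlockingTestSuiteError` is the exception name from the patch; the rest of the names and the exact control flow are a guess at the intent (a failing blocking suite skips the remaining suites on the current build target):

```python
class BlockingTestSuiteError(Exception):
    """Exception name taken from the patch; usage here is illustrative."""


def run_suites_on_build_target(suites: list[tuple[str, bool, bool]]) -> list[str]:
    """Run (name, blocking, passes) suites; stop the build target on a blocking failure."""
    executed = []
    for name, blocking, passes in suites:
        executed.append(name)
        if not passes and blocking:
            break  # remaining suites are skipped for this build target
    return executed


executed = run_suites_on_build_target(
    [("smoke_tests", True, True), ("blocking_suite", True, False), ("later_suite", False, True)]
)
assert executed == ["smoke_tests", "blocking_suite"]
```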
* Re: [PATCH v7 07/21] dts: dts runner and main docstring update
2023-11-16 21:51 ` Jeremy Spewock
@ 2023-11-20 16:13 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-20 16:13 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
On Thu, Nov 16, 2023 at 10:51 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
>
>
> On Wed, Nov 15, 2023 at 8:11 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> Format according to the Google format and PEP257, with slight
>> deviations.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
>> dts/main.py | 8 ++-
>> 2 files changed, 112 insertions(+), 24 deletions(-)
>>
>> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
>> index 4c7fb0c40a..331fed7dc4 100644
>> --- a/dts/framework/dts.py
>> +++ b/dts/framework/dts.py
>> @@ -3,6 +3,33 @@
>> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>> # Copyright(c) 2022-2023 University of New Hampshire
>>
>> +r"""Test suite runner module.
>> +
>> +A DTS run is split into stages:
>> +
>> + #. Execution stage,
>> + #. Build target stage,
>> + #. Test suite stage,
>> + #. Test case stage.
>> +
>> +The module is responsible for running tests on testbeds defined in the test run configuration.
>> +Each setup or teardown of each stage is recorded in a :class:`~framework.test_result.DTSResult` or
>> +one of its subclasses. The test case results are also recorded.
>> +
>> +If an error occurs, the current stage is aborted, the error is recorded and the run continues in
>> +the next iteration of the same stage. The return code is the highest `severity` of all
>> +:class:`~.framework.exception.DTSError`\s.
>> +
>> +Example:
>> + An error occurs in a build target setup. The current build target is aborted and the run
>> + continues with the next build target. If the errored build target was the last one in the given
>> + execution, the next execution begins.
>> +
>> +Attributes:
>> + dts_logger: The logger instance used in this module.
>> + result: The top level result used in the module.
>> +"""
>> +
>> import sys
>>
>> from .config import (
>> @@ -23,9 +50,38 @@
>>
>>
>> def run_all() -> None:
>> - """
>> - The main process of DTS. Runs all build targets in all executions from the main
>> - config file.
>> + """Run all build targets in all executions from the test run configuration.
>> +
>> + Before running test suites, executions and build targets are first set up.
>> + The executions and build targets defined in the test run configuration are iterated over.
>> + The executions define which tests to run and where to run them and build targets define
>> + the DPDK build setup.
>> +
>> + The test suites are set up for each execution/build target tuple and each scheduled
>> + test case within the test suite is set up, executed and torn down. After all test cases
>> + have been executed, the test suite is torn down and the next build target will be tested.
>> +
>> + All the nested steps look like this:
>> +
>> + #. Execution setup
>> +
>> + #. Build target setup
>> +
>> + #. Test suite setup
>> +
>> + #. Test case setup
>> + #. Test case logic
>> + #. Test case teardown
>> +
>> + #. Test suite teardown
>> +
>> + #. Build target teardown
>> +
>> + #. Execution teardown
>> +
>> + The test cases are filtered according to the specification in the test run configuration and
>> + the :option:`--test-cases` command line argument or
>> + the :envvar:`DTS_TESTCASES` environment variable.
>> """
>> global dts_logger
>> global result
>> @@ -87,6 +143,8 @@ def run_all() -> None:
>>
>>
>> def _check_dts_python_version() -> None:
>> + """Check the required Python version - v3.10."""
>> +
>> def RED(text: str) -> str:
>> return f"\u001B[31;1m{str(text)}\u001B[0m"
>>
>> @@ -111,9 +169,16 @@ def _run_execution(
>> execution: ExecutionConfiguration,
>> result: DTSResult,
>> ) -> None:
>> - """
>> - Run the given execution. This involves running the execution setup as well as
>> - running all build targets in the given execution.
>> + """Run the given execution.
>> +
>> + This involves running the execution setup as well as running all build targets
>> + in the given execution. After that, execution teardown is run.
>> +
>> + Args:
>> + sut_node: The execution's SUT node.
>> + tg_node: The execution's TG node.
>> + execution: An execution's test run configuration.
>> + result: The top level result object.
>> """
>> dts_logger.info(
>> f"Running execution with SUT '{execution.system_under_test_node.name}'."
>> @@ -150,8 +215,18 @@ def _run_build_target(
>> execution: ExecutionConfiguration,
>> execution_result: ExecutionResult,
>> ) -> None:
>> - """
>> - Run the given build target.
>> + """Run the given build target.
>> +
>> + This involves running the build target setup as well as running all test suites
>> + in the given execution the build target is defined in.
>> + After that, build target teardown is run.
>> +
>> + Args:
>> + sut_node: The execution's SUT node.
>> + tg_node: The execution's TG node.
>> + build_target: A build target's test run configuration.
>> + execution: The build target's execution's test run configuration.
>> + execution_result: The execution level result object associated with the execution.
>> """
>> dts_logger.info(f"Running build target '{build_target.name}'.")
>> build_target_result = execution_result.add_build_target(build_target)
>> @@ -183,10 +258,17 @@ def _run_all_suites(
>> execution: ExecutionConfiguration,
>> build_target_result: BuildTargetResult,
>> ) -> None:
>> - """
>> - Use the given build_target to run execution's test suites
>> - with possibly only a subset of test cases.
>> - If no subset is specified, run all test cases.
>> + """Run the execution's (possibly a subset) test suites using the current build_target.
>> +
>> + The function assumes the build target we're testing has already been built on the SUT node.
>> + The current build target thus corresponds to the current DPDK build present on the SUT node.
>> +
>> + Args:
>> + sut_node: The execution's SUT node.
>> + tg_node: The execution's TG node.
>> + execution: The execution's test run configuration associated with the current build target.
>> + build_target_result: The build target level result object associated
>> + with the current build target.
>> """
>
>
> Is it worth mentioning in this method or the _run_build_target method that when a blocking suite fails, no more suites will be run on that build target?
>
Absolutely, I'll add that. Thanks for the catch.
>>
>> end_build_target = False
>> if not execution.skip_smoke_tests:
>> @@ -215,16 +297,22 @@ def _run_single_suite(
>> build_target_result: BuildTargetResult,
>> test_suite_config: TestSuiteConfig,
>> ) -> None:
>> - """Runs a single test suite.
>> + """Run all test suite in a single test suite module.
>> +
>> + The function assumes the build target we're testing has already been built on the SUT node.
>> + The current build target thus corresponds to the current DPDK build present on the SUT node.
>>
>> Args:
>> - sut_node: Node to run tests on.
>> - execution: Execution the test case belongs to.
>> - build_target_result: Build target configuration test case is run on
>> - test_suite_config: Test suite configuration
>> + sut_node: The execution's SUT node.
>> + tg_node: The execution's TG node.
>> + execution: The execution's test run configuration associated with the current build target.
>> + build_target_result: The build target level result object associated
>> + with the current build target.
>> + test_suite_config: Test suite test run configuration specifying the test suite module
>> + and possibly a subset of test cases of test suites in that module.
>>
>> Raises:
>> - BlockingTestSuiteError: If a test suite that was marked as blocking fails.
>> + BlockingTestSuiteError: If a blocking test suite fails.
>> """
>> try:
>> full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
>> @@ -248,9 +336,7 @@ def _run_single_suite(
>>
>>
>> def _exit_dts() -> None:
>> - """
>> - Process all errors and exit with the proper exit code.
>> - """
>> + """Process all errors and exit with the proper exit code."""
>> result.process()
>>
>> if dts_logger:
>> diff --git a/dts/main.py b/dts/main.py
>> index 5d4714b0c3..f703615d11 100755
>> --- a/dts/main.py
>> +++ b/dts/main.py
>> @@ -4,9 +4,7 @@
>> # Copyright(c) 2022 PANTHEON.tech s.r.o.
>> # Copyright(c) 2022 University of New Hampshire
>>
>> -"""
>> -A test framework for testing DPDK.
>> -"""
>> +"""The DTS executable."""
>>
>> import logging
>>
>> @@ -17,6 +15,10 @@ def main() -> None:
>> """Set DTS settings, then run DTS.
>>
>> The DTS settings are taken from the command line arguments and the environment variables.
>> + The settings object is stored in the module-level variable settings.SETTINGS which the entire
>> + framework uses. After importing the module (or the variable), any changes to the variable are
>> + not going to be reflected without a re-import. This means that the SETTINGS variable must
>> + be modified before the settings module is imported anywhere else in the framework.
>> """
>> settings.SETTINGS = settings.get_settings()
>> from framework import dts
>> --
>> 2.34.1
>>
^ permalink raw reply [flat|nested] 393+ messages in thread
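The import-order caveat in the main() docstring quoted above comes down to how `from module import NAME` snapshots a binding at import time. A minimal reproduction using a made-up in-memory module (the module name and values are invented for the demo):

```python
import sys
import types

# Build an in-memory module mimicking the role of dts/framework/settings.py.
demo = types.ModuleType("demo_settings")
demo.SETTINGS = "defaults"
sys.modules["demo_settings"] = demo

# `from ... import SETTINGS` copies the current binding into this namespace...
from demo_settings import SETTINGS

# ...so a later reassignment of the module attribute is not seen by that copy.
demo.SETTINGS = "parsed-from-cli"

import demo_settings
assert SETTINGS == "defaults"                       # stale snapshot
assert demo_settings.SETTINGS == "parsed-from-cli"  # attribute lookup sees the update
```

This is why the patch assigns `settings.SETTINGS` before any other framework module is imported: modules that access it via attribute lookup see the parsed value, while an early `from settings import SETTINGS` would not.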
* Re: [PATCH v7 07/21] dts: dts runner and main docstring update
2023-11-15 13:09 ` [PATCH v7 07/21] dts: dts runner and main " Juraj Linkeš
2023-11-16 21:51 ` Jeremy Spewock
@ 2023-11-20 17:43 ` Yoan Picchi
2023-11-21 9:10 ` Juraj Linkeš
1 sibling, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-20 17:43 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
> dts/main.py | 8 ++-
> 2 files changed, 112 insertions(+), 24 deletions(-)
>
> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> index 4c7fb0c40a..331fed7dc4 100644
> --- a/dts/framework/dts.py
> +++ b/dts/framework/dts.py
> @@ -3,6 +3,33 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022-2023 University of New Hampshire
>
> +r"""Test suite runner module.
Is the r before the docstring intended?
> +
> +A DTS run is split into stages:
> +
> + #. Execution stage,
> + #. Build target stage,
> + #. Test suite stage,
> + #. Test case stage.
> +
> +The module is responsible for running tests on testbeds defined in the test run configuration.
> +Each setup or teardown of each stage is recorded in a :class:`~framework.test_result.DTSResult` or
> +one of its subclasses. The test case results are also recorded.
> +
> +If an error occurs, the current stage is aborted, the error is recorded and the run continues in
> +the next iteration of the same stage. The return code is the highest `severity` of all
> +:class:`~.framework.exception.DTSError`\s.
Is the . before the classname intended? Considering the previous one
doesn't have one. (I've not yet built the doc to check if it affects the
rendered doc)
> +
> +Example:
> + An error occurs in a build target setup. The current build target is aborted and the run
> + continues with the next build target. If the errored build target was the last one in the given
> + execution, the next execution begins.
> +
> +Attributes:
> + dts_logger: The logger instance used in this module.
> + result: The top level result used in the module.
> +"""
> +
> import sys
>
> from .config import (
> @@ -23,9 +50,38 @@
>
>
> def run_all() -> None:
> - """
> - The main process of DTS. Runs all build targets in all executions from the main
> - config file.
> + """Run all build targets in all executions from the test run configuration.
> +
> + Before running test suites, executions and build targets are first set up.
> + The executions and build targets defined in the test run configuration are iterated over.
> + The executions define which tests to run and where to run them and build targets define
> + the DPDK build setup.
> +
> + The tests suites are set up for each execution/build target tuple and each scheduled
> + test case within the test suite is set up, executed and torn down. After all test cases
> + have been executed, the test suite is torn down and the next build target will be tested.
> +
> + All the nested steps look like this:
> +
> + #. Execution setup
> +
> + #. Build target setup
> +
> + #. Test suite setup
> +
> + #. Test case setup
> + #. Test case logic
> + #. Test case teardown
> +
> + #. Test suite teardown
> +
> + #. Build target teardown
> +
> + #. Execution teardown
> +
> + The test cases are filtered according to the specification in the test run configuration and
> + the :option:`--test-cases` command line argument or
> + the :envvar:`DTS_TESTCASES` environment variable.
> """
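The nested setup/teardown steps listed in the docstring can be sketched as nested loops (schematic only; the names and data structure are illustrative, not the actual runner code):

```python
def run_all_sketch(executions: list[dict]) -> list[str]:
    """Record the order of setup/teardown steps for a toy configuration."""
    trace = []
    for execution in executions:
        trace.append("execution setup")
        for build_target in execution["build_targets"]:
            trace.append("build target setup")
            for suite in build_target["suites"]:
                trace.append("test suite setup")
                for case in suite["cases"]:
                    trace.append(f"test case {case}")
                trace.append("test suite teardown")
            trace.append("build target teardown")
        trace.append("execution teardown")
    return trace


demo = [{"build_targets": [{"suites": [{"cases": ["test_hello_world"]}]}]}]
print(run_all_sketch(demo))
```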
> global dts_logger
> global result
> @@ -87,6 +143,8 @@ def run_all() -> None:
>
>
> def _check_dts_python_version() -> None:
> + """Check the required Python version - v3.10."""
> +
> def RED(text: str) -> str:
> return f"\u001B[31;1m{str(text)}\u001B[0m"
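The quoted ``RED`` helper wraps text in ANSI escape codes (bold red, then attribute reset). A minimal sketch of a version guard built on it might look like this (the real ``_check_dts_python_version`` may differ in detail):

```python
import sys


def RED(text: str) -> str:
    # \u001B[31;1m switches to bold red; \u001B[0m resets attributes.
    return f"\u001B[31;1m{text}\u001B[0m"


def check_python_version(required: tuple[int, int] = (3, 10)) -> None:
    # Warn on stderr when the interpreter is older than the required version.
    if sys.version_info < required:
        print(RED(f"Python >= {required[0]}.{required[1]} is required."), file=sys.stderr)


check_python_version()
```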
>
> @@ -111,9 +169,16 @@ def _run_execution(
> execution: ExecutionConfiguration,
> result: DTSResult,
> ) -> None:
> - """
> - Run the given execution. This involves running the execution setup as well as
> - running all build targets in the given execution.
> + """Run the given execution.
> +
> + This involves running the execution setup as well as running all build targets
> + in the given execution. After that, execution teardown is run.
> +
> + Args:
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + execution: An execution's test run configuration.
> + result: The top level result object.
> """
> dts_logger.info(
> f"Running execution with SUT '{execution.system_under_test_node.name}'."
> @@ -150,8 +215,18 @@ def _run_build_target(
> execution: ExecutionConfiguration,
> execution_result: ExecutionResult,
> ) -> None:
> - """
> - Run the given build target.
> + """Run the given build target.
> +
> + This involves running the build target setup as well as running all test suites
> + in the given execution the build target is defined in.
> + After that, build target teardown is run.
> +
> + Args:
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + build_target: A build target's test run configuration.
> + execution: The build target's execution's test run configuration.
> + execution_result: The execution level result object associated with the execution.
> """
> dts_logger.info(f"Running build target '{build_target.name}'.")
> build_target_result = execution_result.add_build_target(build_target)
> @@ -183,10 +258,17 @@ def _run_all_suites(
> execution: ExecutionConfiguration,
> build_target_result: BuildTargetResult,
> ) -> None:
> - """
> - Use the given build_target to run execution's test suites
> - with possibly only a subset of test cases.
> - If no subset is specified, run all test cases.
> + """Run the execution's (possibly a subset) test suites using the current build_target.
> +
> + The function assumes the build target we're testing has already been built on the SUT node.
> + The current build target thus corresponds to the current DPDK build present on the SUT node.
> +
> + Args:
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + execution: The execution's test run configuration associated with the current build target.
> + build_target_result: The build target level result object associated
> + with the current build target.
> """
> end_build_target = False
> if not execution.skip_smoke_tests:
> @@ -215,16 +297,22 @@ def _run_single_suite(
> build_target_result: BuildTargetResult,
> test_suite_config: TestSuiteConfig,
> ) -> None:
> - """Runs a single test suite.
> + """Run all test suite in a single test suite module.
> +
> + The function assumes the build target we're testing has already been built on the SUT node.
> + The current build target thus corresponds to the current DPDK build present on the SUT node.
>
> Args:
> - sut_node: Node to run tests on.
> - execution: Execution the test case belongs to.
> - build_target_result: Build target configuration test case is run on
> - test_suite_config: Test suite configuration
> + sut_node: The execution's SUT node.
> + tg_node: The execution's TG node.
> + execution: The execution's test run configuration associated with the current build target.
> + build_target_result: The build target level result object associated
> + with the current build target.
> + test_suite_config: Test suite test run configuration specifying the test suite module
> + and possibly a subset of test cases of test suites in that module.
>
> Raises:
> - BlockingTestSuiteError: If a test suite that was marked as blocking fails.
> + BlockingTestSuiteError: If a blocking test suite fails.
> """
> try:
> full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
> @@ -248,9 +336,7 @@ def _run_single_suite(
>
>
> def _exit_dts() -> None:
> - """
> - Process all errors and exit with the proper exit code.
> - """
> + """Process all errors and exit with the proper exit code."""
> result.process()
>
> if dts_logger:
> diff --git a/dts/main.py b/dts/main.py
> index 5d4714b0c3..f703615d11 100755
> --- a/dts/main.py
> +++ b/dts/main.py
> @@ -4,9 +4,7 @@
> # Copyright(c) 2022 PANTHEON.tech s.r.o.
> # Copyright(c) 2022 University of New Hampshire
>
> -"""
> -A test framework for testing DPDK.
> -"""
> +"""The DTS executable."""
>
> import logging
>
> @@ -17,6 +15,10 @@ def main() -> None:
> """Set DTS settings, then run DTS.
>
> The DTS settings are taken from the command line arguments and the environment variables.
> + The settings object is stored in the module-level variable settings.SETTINGS which the entire
> + framework uses. After importing the module (or the variable), any changes to the variable are
> + not going to be reflected without a re-import. This means that the SETTINGS variable must
> + be modified before the settings module is imported anywhere else in the framework.
> """
> settings.SETTINGS = settings.get_settings()
> from framework import dts
Nit: copyright notice update in main
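The import-order caveat in the `main()` docstring above — a from-import copies the value once, so later changes to `settings.SETTINGS` aren't seen — can be demonstrated with a self-contained sketch (the module name is made up for the demo):

```python
import sys
import types

# Build a stand-in "settings" module with a module-level variable.
demo_settings = types.ModuleType("demo_settings")
demo_settings.SETTINGS = "default"
sys.modules["demo_settings"] = demo_settings

# A from-import copies the *current* value into this namespace...
from demo_settings import SETTINGS

# ...so rebinding the module attribute afterwards is not reflected in the copy.
demo_settings.SETTINGS = "from-command-line"

print(SETTINGS)                # "default" - the stale copy
print(demo_settings.SETTINGS)  # "from-command-line" - via the module object
```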
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 07/21] dts: dts runner and main docstring update
2023-11-20 17:43 ` Yoan Picchi
@ 2023-11-21 9:10 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-21 9:10 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Mon, Nov 20, 2023 at 6:43 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
> > dts/main.py | 8 ++-
> > 2 files changed, 112 insertions(+), 24 deletions(-)
> >
> > diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> > index 4c7fb0c40a..331fed7dc4 100644
> > --- a/dts/framework/dts.py
> > +++ b/dts/framework/dts.py
> > @@ -3,6 +3,33 @@
> > # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022-2023 University of New Hampshire
> >
> > +r"""Test suite runner module.
>
> Is the r before the docstring intended?
>
Yes, this is because of :class:`~.framework.exception.DTSError`\s. Any
alphabetical character immediately after the closing backticks must be
escaped for Sphinx to interpret the role correctly, and the way to do
that is to make the string raw (with an r before the string).
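A tiny illustration of the difference (the docstring text here is hypothetical):

```python
def collect_errors():
    r"""Collect all :class:`~.exception.DTSError`\s raised during the run."""


# The r prefix guarantees the backslash survives into __doc__, so Sphinx
# sees an escaped character after the closing backticks instead of Python
# treating \s as a (now deprecated) string escape sequence.
print(collect_errors.__doc__)
```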
> > +
> > +A DTS run is split into stages:
> > +
> > + #. Execution stage,
> > + #. Build target stage,
> > + #. Test suite stage,
> > + #. Test case stage.
> > +
> > +The module is responsible for running tests on testbeds defined in the test run configuration.
> > +Each setup or teardown of each stage is recorded in a :class:`~framework.test_result.DTSResult` or
> > +one of its subclasses. The test case results are also recorded.
> > +
> > +If an error occurs, the current stage is aborted, the error is recorded and the run continues in
> > +the next iteration of the same stage. The return code is the highest `severity` of all
> > +:class:`~.framework.exception.DTSError`\s.
>
> Is the . before the classname intended, considering the previous one
> doesn't have one? (I've not yet built the doc to check whether it
> affects the rendered output.)
>
Good catch. Not only is the dot suspect; I also looked at all
references starting with the framework dir, and the ones that refer to
modules in the same directory don't have to specify the full path if
they start with a dot, such as:
~framework.test_result.DTSResult -> ~.test_result.DTSResult
~.framework.exception.DTSError -> ~.exception.DTSError
test_result and exception are in the same dir as dts.py, so the forms
above work. I'll make these changes in other files as well.
> > +
> > +Example:
> > + An error occurs in a build target setup. The current build target is aborted and the run
> > + continues with the next build target. If the errored build target was the last one in the given
> > + execution, the next execution begins.
> > +
> > +Attributes:
> > + dts_logger: The logger instance used in this module.
> > + result: The top level result used in the module.
> > +"""
> > +
> > import sys
> >
> > from .config import (
> > @@ -23,9 +50,38 @@
> >
> >
> > def run_all() -> None:
> > - """
> > - The main process of DTS. Runs all build targets in all executions from the main
> > - config file.
> > + """Run all build targets in all executions from the test run configuration.
> > +
> > + Before running test suites, executions and build targets are first set up.
> > + The executions and build targets defined in the test run configuration are iterated over.
> > + The executions define which tests to run and where to run them and build targets define
> > + the DPDK build setup.
> > +
> > + The tests suites are set up for each execution/build target tuple and each scheduled
> > + test case within the test suite is set up, executed and torn down. After all test cases
> > + have been executed, the test suite is torn down and the next build target will be tested.
> > +
> > + All the nested steps look like this:
> > +
> > + #. Execution setup
> > +
> > + #. Build target setup
> > +
> > + #. Test suite setup
> > +
> > + #. Test case setup
> > + #. Test case logic
> > + #. Test case teardown
> > +
> > + #. Test suite teardown
> > +
> > + #. Build target teardown
> > +
> > + #. Execution teardown
> > +
> > + The test cases are filtered according to the specification in the test run configuration and
> > + the :option:`--test-cases` command line argument or
> > + the :envvar:`DTS_TESTCASES` environment variable.
> > """
> > global dts_logger
> > global result
> > @@ -87,6 +143,8 @@ def run_all() -> None:
> >
> >
> > def _check_dts_python_version() -> None:
> > + """Check the required Python version - v3.10."""
> > +
> > def RED(text: str) -> str:
> > return f"\u001B[31;1m{str(text)}\u001B[0m"
> >
> > @@ -111,9 +169,16 @@ def _run_execution(
> > execution: ExecutionConfiguration,
> > result: DTSResult,
> > ) -> None:
> > - """
> > - Run the given execution. This involves running the execution setup as well as
> > - running all build targets in the given execution.
> > + """Run the given execution.
> > +
> > + This involves running the execution setup as well as running all build targets
> > + in the given execution. After that, execution teardown is run.
> > +
> > + Args:
> > + sut_node: The execution's SUT node.
> > + tg_node: The execution's TG node.
> > + execution: An execution's test run configuration.
> > + result: The top level result object.
> > """
> > dts_logger.info(
> > f"Running execution with SUT '{execution.system_under_test_node.name}'."
> > @@ -150,8 +215,18 @@ def _run_build_target(
> > execution: ExecutionConfiguration,
> > execution_result: ExecutionResult,
> > ) -> None:
> > - """
> > - Run the given build target.
> > + """Run the given build target.
> > +
> > + This involves running the build target setup as well as running all test suites
> > + in the given execution the build target is defined in.
> > + After that, build target teardown is run.
> > +
> > + Args:
> > + sut_node: The execution's SUT node.
> > + tg_node: The execution's TG node.
> > + build_target: A build target's test run configuration.
> > + execution: The build target's execution's test run configuration.
> > + execution_result: The execution level result object associated with the execution.
> > """
> > dts_logger.info(f"Running build target '{build_target.name}'.")
> > build_target_result = execution_result.add_build_target(build_target)
> > @@ -183,10 +258,17 @@ def _run_all_suites(
> > execution: ExecutionConfiguration,
> > build_target_result: BuildTargetResult,
> > ) -> None:
> > - """
> > - Use the given build_target to run execution's test suites
> > - with possibly only a subset of test cases.
> > - If no subset is specified, run all test cases.
> > + """Run the execution's (possibly a subset) test suites using the current build_target.
> > +
> > + The function assumes the build target we're testing has already been built on the SUT node.
> > + The current build target thus corresponds to the current DPDK build present on the SUT node.
> > +
> > + Args:
> > + sut_node: The execution's SUT node.
> > + tg_node: The execution's TG node.
> > + execution: The execution's test run configuration associated with the current build target.
> > + build_target_result: The build target level result object associated
> > + with the current build target.
> > """
> > end_build_target = False
> > if not execution.skip_smoke_tests:
> > @@ -215,16 +297,22 @@ def _run_single_suite(
> > build_target_result: BuildTargetResult,
> > test_suite_config: TestSuiteConfig,
> > ) -> None:
> > - """Runs a single test suite.
> > + """Run all test suite in a single test suite module.
> > +
> > + The function assumes the build target we're testing has already been built on the SUT node.
> > + The current build target thus corresponds to the current DPDK build present on the SUT node.
> >
> > Args:
> > - sut_node: Node to run tests on.
> > - execution: Execution the test case belongs to.
> > - build_target_result: Build target configuration test case is run on
> > - test_suite_config: Test suite configuration
> > + sut_node: The execution's SUT node.
> > + tg_node: The execution's TG node.
> > + execution: The execution's test run configuration associated with the current build target.
> > + build_target_result: The build target level result object associated
> > + with the current build target.
> > + test_suite_config: Test suite test run configuration specifying the test suite module
> > + and possibly a subset of test cases of test suites in that module.
> >
> > Raises:
> > - BlockingTestSuiteError: If a test suite that was marked as blocking fails.
> > + BlockingTestSuiteError: If a blocking test suite fails.
> > """
> > try:
> > full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
> > @@ -248,9 +336,7 @@ def _run_single_suite(
> >
> >
> > def _exit_dts() -> None:
> > - """
> > - Process all errors and exit with the proper exit code.
> > - """
> > + """Process all errors and exit with the proper exit code."""
> > result.process()
> >
> > if dts_logger:
> > diff --git a/dts/main.py b/dts/main.py
> > index 5d4714b0c3..f703615d11 100755
> > --- a/dts/main.py
> > +++ b/dts/main.py
> > @@ -4,9 +4,7 @@
> > # Copyright(c) 2022 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022 University of New Hampshire
> >
> > -"""
> > -A test framework for testing DPDK.
> > -"""
> > +"""The DTS executable."""
> >
> > import logging
> >
> > @@ -17,6 +15,10 @@ def main() -> None:
> > """Set DTS settings, then run DTS.
> >
> > The DTS settings are taken from the command line arguments and the environment variables.
> > + The settings object is stored in the module-level variable settings.SETTINGS which the entire
> > + framework uses. After importing the module (or the variable), any changes to the variable are
> > + not going to be reflected without a re-import. This means that the SETTINGS variable must
> > + be modified before the settings module is imported anywhere else in the framework.
> > """
> > settings.SETTINGS = settings.get_settings()
> > from framework import dts
> Nit: copyright notice update in main
Nice catch again. Thanks.
* [PATCH v7 08/21] dts: test suite docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (6 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 07/21] dts: dts runner and main " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-16 22:16 ` Jeremy Spewock
2023-11-15 13:09 ` [PATCH v7 09/21] dts: test result " Juraj Linkeš
` (13 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_suite.py | 223 +++++++++++++++++++++++++++---------
1 file changed, 168 insertions(+), 55 deletions(-)
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index d53553bf34..9e5251ffc6 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -2,8 +2,19 @@
# Copyright(c) 2010-2014 Intel Corporation
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Base class for creating DTS test cases.
+"""Features common to all test suites.
+
+The module defines the :class:`TestSuite` class which doesn't contain any test cases, and as such
+must be extended by subclasses which add test cases. The :class:`TestSuite` contains the basics
+needed by subclasses:
+
+ * Test suite and test case execution flow,
+ * Testbed (SUT, TG) configuration,
+ * Packet sending and verification,
+ * Test case verification.
+
+The module also defines a function, :func:`get_test_suites`,
+for gathering test suites from a Python module.
"""
import importlib
@@ -31,25 +42,44 @@
class TestSuite(object):
- """
- The base TestSuite class provides methods for handling basic flow of a test suite:
- * test case filtering and collection
- * test suite setup/cleanup
- * test setup/cleanup
- * test case execution
- * error handling and results storage
- Test cases are implemented by derived classes. Test cases are all methods
- starting with test_, further divided into performance test cases
- (starting with test_perf_) and functional test cases (all other test cases).
- By default, all test cases will be executed. A list of testcase str names
- may be specified in conf.yaml or on the command line
- to filter which test cases to run.
- The methods named [set_up|tear_down]_[suite|test_case] should be overridden
- in derived classes if the appropriate suite/test case fixtures are needed.
+ """The base class with methods for handling the basic flow of a test suite.
+
+ * Test case filtering and collection,
+ * Test suite setup/cleanup,
+ * Test setup/cleanup,
+ * Test case execution,
+ * Error handling and results storage.
+
+ Test cases are implemented by subclasses. Test cases are all methods starting with ``test_``,
+ further divided into performance test cases (starting with ``test_perf_``)
+ and functional test cases (all other test cases).
+
+ By default, all test cases will be executed. A list of testcase names may be specified
+ in the YAML test run configuration file and in the :option:`--test-cases` command line argument
+ or in the :envvar:`DTS_TESTCASES` environment variable to filter which test cases to run.
+ The union of both lists will be used. Any unknown test cases from the latter lists
+ will be silently ignored.
+
+ If the :option:`--re-run` command line argument or the :envvar:`DTS_RERUN` environment variable
+ is set, in case of a test case failure, the test case will be executed again until it passes
+ or it fails that many times in addition of the first failure.
+
+ The methods named ``[set_up|tear_down]_[suite|test_case]`` should be overridden in subclasses
+ if the appropriate test suite/test case fixtures are needed.
+
+ The test suite is aware of the testbed (the SUT and TG) it's running on. From this, it can
+ properly choose the IP addresses and other configuration that must be tailored to the testbed.
+
+ Attributes:
+ sut_node: The SUT node where the test suite is running.
+ tg_node: The TG node where the test suite is running.
+ is_blocking: Whether the test suite is blocking. A failure of a blocking test suite
+ will block the execution of all subsequent test suites in the current build target.
"""
sut_node: SutNode
- is_blocking = False
+ tg_node: TGNode
+ is_blocking: bool = False
_logger: DTSLOG
_test_cases_to_run: list[str]
_func: bool
@@ -72,6 +102,19 @@ def __init__(
func: bool,
build_target_result: BuildTargetResult,
):
+ """Initialize the test suite testbed information and basic configuration.
+
+ Process what test cases to run, create the associated :class:`TestSuiteResult`,
+ find links between ports and set up default IP addresses to be used when configuring them.
+
+ Args:
+ sut_node: The SUT node where the test suite will run.
+ tg_node: The TG node where the test suite will run.
+ test_cases: The list of test cases to execute.
+ If empty, all test cases will be executed.
+ func: Whether to run functional tests.
+ build_target_result: The build target result this test suite is run in.
+ """
self.sut_node = sut_node
self.tg_node = tg_node
self._logger = getLogger(self.__class__.__name__)
@@ -95,6 +138,7 @@ def __init__(
self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
def _process_links(self) -> None:
+ """Construct links between SUT and TG ports."""
for sut_port in self.sut_node.ports:
for tg_port in self.tg_node.ports:
if (sut_port.identifier, sut_port.peer) == (
@@ -106,27 +150,42 @@ def _process_links(self) -> None:
)
def set_up_suite(self) -> None:
- """
- Set up test fixtures common to all test cases; this is done before
- any test case is run.
+ """Set up test fixtures common to all test cases.
+
+ This is done before any test case has been run.
"""
def tear_down_suite(self) -> None:
- """
- Tear down the previously created test fixtures common to all test cases.
+ """Tear down the previously created test fixtures common to all test cases.
+
+ This is done after all test have been run.
"""
def set_up_test_case(self) -> None:
- """
- Set up test fixtures before each test case.
+ """Set up test fixtures before each test case.
+
+ This is done before *each* test case.
"""
def tear_down_test_case(self) -> None:
- """
- Tear down the previously created test fixtures after each test case.
+ """Tear down the previously created test fixtures after each test case.
+
+ This is done after *each* test case.
"""
def configure_testbed_ipv4(self, restore: bool = False) -> None:
+ """Configure IPv4 addresses on all testbed ports.
+
+ The configured ports are:
+
+ * SUT ingress port,
+ * SUT egress port,
+ * TG ingress port,
+ * TG egress port.
+
+ Args:
+ restore: If :data:`True`, will remove the configuration instead.
+ """
delete = True if restore else False
enable = False if restore else True
self._configure_ipv4_forwarding(enable)
@@ -153,11 +212,13 @@ def _configure_ipv4_forwarding(self, enable: bool) -> None:
def send_packet_and_capture(
self, packet: Packet, duration: float = 1
) -> list[Packet]:
- """
- Send a packet through the appropriate interface and
- receive on the appropriate interface.
- Modify the packet with l3/l2 addresses corresponding
- to the testbed and desired traffic.
+ """Send and receive `packet` using the associated TG.
+
+ Send `packet` through the appropriate interface and receive on the appropriate interface.
+ Modify the packet with l3/l2 addresses corresponding to the testbed and desired traffic.
+
+ Returns:
+ A list of received packets.
"""
packet = self._adjust_addresses(packet)
return self.tg_node.send_packet_and_capture(
@@ -165,13 +226,25 @@ def send_packet_and_capture(
)
def get_expected_packet(self, packet: Packet) -> Packet:
+ """Inject the proper L2/L3 addresses into `packet`.
+
+ Args:
+ packet: The packet to modify.
+
+ Returns:
+ `packet` with injected L2/L3 addresses.
+ """
return self._adjust_addresses(packet, expected=True)
def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
- """
+ """L2 and L3 address additions in both directions.
+
Assumptions:
- Two links between SUT and TG, one link is TG -> SUT,
- the other SUT -> TG.
+ Two links between SUT and TG, one link is TG -> SUT, the other SUT -> TG.
+
+ Args:
+ packet: The packet to modify.
+ expected: If :data:`True`, the direction is SUT -> TG, otherwise the direction is TG -> SUT.
"""
if expected:
# The packet enters the TG from SUT
@@ -197,6 +270,19 @@ def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
return Ether(packet.build())
def verify(self, condition: bool, failure_description: str) -> None:
+ """Verify `condition` and handle failures.
+
+ When `condition` is :data:`False`, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ condition: The condition to check.
+ failure_description: A short description of the failure
+ that will be stored in the raised exception.
+
+ Raises:
+ TestCaseVerifyError: `condition` is :data:`False`.
+ """
if not condition:
self._fail_test_case_verify(failure_description)
@@ -216,6 +302,19 @@ def _fail_test_case_verify(self, failure_description: str) -> None:
def verify_packets(
self, expected_packet: Packet, received_packets: list[Packet]
) -> None:
+ """Verify that `expected_packet` has been received.
+
+ Go through `received_packets` and check that `expected_packet` is among them.
+ If not, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ expected_packet: The packet we're expecting to receive.
+ received_packets: The packets where we're looking for `expected_packet`.
+
+ Raises:
+ TestCaseVerifyError: `expected_packet` is not among `received_packets`.
+ """
for received_packet in received_packets:
if self._compare_packets(expected_packet, received_packet):
break
@@ -303,10 +402,14 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
return True
def run(self) -> None:
- """
- Setup, execute and teardown the whole suite.
- Suite execution consists of running all test cases scheduled to be executed.
- A test cast run consists of setup, execution and teardown of said test case.
+ """Set up, execute and tear down the whole suite.
+
+ Test suite execution consists of running all test cases scheduled to be executed.
+ A test case run consists of setup, execution and teardown of said test case.
+
+ Record the setup and the teardown and handle failures.
+
+ The list of scheduled test cases is constructed when creating the :class:`TestSuite` object.
"""
test_suite_name = self.__class__.__name__
@@ -338,9 +441,7 @@ def run(self) -> None:
raise BlockingTestSuiteError(test_suite_name)
def _execute_test_suite(self) -> None:
- """
- Execute all test cases scheduled to be executed in this suite.
- """
+ """Execute all test cases scheduled to be executed in this suite."""
if self._func:
for test_case_method in self._get_functional_test_cases():
test_case_name = test_case_method.__name__
@@ -357,14 +458,18 @@ def _execute_test_suite(self) -> None:
self._run_test_case(test_case_method, test_case_result)
def _get_functional_test_cases(self) -> list[MethodType]:
- """
- Get all functional test cases.
+ """Get all functional test cases defined in this TestSuite.
+
+ Returns:
+ The list of functional test cases of this TestSuite.
"""
return self._get_test_cases(r"test_(?!perf_)")
def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
- """
- Return a list of test cases matching test_case_regex.
+ """Return a list of test cases matching test_case_regex.
+
+ Returns:
+ The list of test cases matching test_case_regex of this TestSuite.
"""
self._logger.debug(f"Searching for test cases in {self.__class__.__name__}.")
filtered_test_cases = []
@@ -378,9 +483,7 @@ def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
return filtered_test_cases
def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool:
- """
- Check whether the test case should be executed.
- """
+ """Check whether the test case should be scheduled to be executed."""
match = bool(re.match(test_case_regex, test_case_name))
if self._test_cases_to_run:
return match and test_case_name in self._test_cases_to_run
@@ -390,9 +493,9 @@ def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool
def _run_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Setup, execute and teardown a test case in this suite.
- Exceptions are caught and recorded in logs and results.
+ """Setup, execute and teardown a test case in this suite.
+
+ Record the result of the setup and the teardown and handle failures.
"""
test_case_name = test_case_method.__name__
@@ -427,9 +530,7 @@ def _run_test_case(
def _execute_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Execute one test case and handle failures.
- """
+ """Execute one test case, record the result and handle failures."""
test_case_name = test_case_method.__name__
try:
self._logger.info(f"Starting test case execution: {test_case_name}")
@@ -452,6 +553,18 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
+ r"""Find all :class:`TestSuite`\s in a Python module.
+
+ Args:
+ testsuite_module_path: The path to the Python module.
+
+ Returns:
+ The list of :class:`TestSuite`\s found within the Python module.
+
+ Raises:
+ ConfigurationError: The test suite module was not found.
+ """
+
def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
--
2.34.1
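The test-case filtering described in the :class:`TestSuite` docstring in the patch above — the union of the configured and command-line lists, with unknown names silently ignored, and the ``test_`` / ``test_perf_`` split — can be sketched like this (helper names are illustrative, not the DTS API):

```python
import re


def functional_cases(method_names: list[str]) -> list[str]:
    # Negative lookahead: keep test_* methods that are not test_perf_*.
    return [name for name in method_names if re.match(r"test_(?!perf_)", name)]


def scheduled_cases(cases: list[str], config_list: list[str], cli_list: list[str]) -> list[str]:
    # The union of both filter lists is used; unknown names in the lists
    # are silently ignored. An empty union means "run everything".
    wanted = set(config_list) | set(cli_list)
    if not wanted:
        return list(cases)
    return [case for case in cases if case in wanted]


names = ["test_basic", "test_perf_throughput", "helper", "test_mtu"]
print(functional_cases(names))  # ['test_basic', 'test_mtu']
print(scheduled_cases(functional_cases(names), ["test_basic"], ["test_unknown"]))
```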
* Re: [PATCH v7 08/21] dts: test suite docstring update
2023-11-15 13:09 ` [PATCH v7 08/21] dts: test suite " Juraj Linkeš
@ 2023-11-16 22:16 ` Jeremy Spewock
2023-11-20 16:25 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-11-16 22:16 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
On Wed, Nov 15, 2023 at 8:12 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/test_suite.py | 223 +++++++++++++++++++++++++++---------
> 1 file changed, 168 insertions(+), 55 deletions(-)
>
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index d53553bf34..9e5251ffc6 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -2,8 +2,19 @@
> # Copyright(c) 2010-2014 Intel Corporation
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> -"""
> -Base class for creating DTS test cases.
> +"""Features common to all test suites.
> +
> +The module defines the :class:`TestSuite` class which doesn't contain any
> test cases, and as such
> +must be extended by subclasses which add test cases. The
> :class:`TestSuite` contains the basics
> +needed by subclasses:
> +
> + * Test suite and test case execution flow,
> + * Testbed (SUT, TG) configuration,
> + * Packet sending and verification,
> + * Test case verification.
> +
> +The module also defines a function, :func:`get_test_suites`,
> +for gathering test suites from a Python module.
> """
>
> import importlib
> @@ -31,25 +42,44 @@
>
>
> class TestSuite(object):
> - """
> - The base TestSuite class provides methods for handling basic flow of
> a test suite:
> - * test case filtering and collection
> - * test suite setup/cleanup
> - * test setup/cleanup
> - * test case execution
> - * error handling and results storage
> - Test cases are implemented by derived classes. Test cases are all
> methods
> - starting with test_, further divided into performance test cases
> - (starting with test_perf_) and functional test cases (all other test
> cases).
> - By default, all test cases will be executed. A list of testcase str
> names
> - may be specified in conf.yaml or on the command line
> - to filter which test cases to run.
> - The methods named [set_up|tear_down]_[suite|test_case] should be
> overridden
> - in derived classes if the appropriate suite/test case fixtures are
> needed.
> + """The base class with methods for handling the basic flow of a test
> suite.
> +
> + * Test case filtering and collection,
> + * Test suite setup/cleanup,
> + * Test setup/cleanup,
> + * Test case execution,
> + * Error handling and results storage.
> +
> + Test cases are implemented by subclasses. Test cases are all methods
> starting with ``test_``,
> + further divided into performance test cases (starting with
> ``test_perf_``)
> + and functional test cases (all other test cases).
> +
> + By default, all test cases will be executed. A list of testcase names
> may be specified
> + in the YAML test run configuration file and in the
> :option:`--test-cases` command line argument
> + or in the :envvar:`DTS_TESTCASES` environment variable to filter
> which test cases to run.
> + The union of both lists will be used. Any unknown test cases from the
> latter lists
> + will be silently ignored.
> +
> + If the :option:`--re-run` command line argument or the
> :envvar:`DTS_RERUN` environment variable
> + is set, in case of a test case failure, the test case will be
> executed again until it passes
> or it fails that many times in addition to the first failure.
> +
> + The methods named ``[set_up|tear_down]_[suite|test_case]`` should be
> overridden in subclasses
> + if the appropriate test suite/test case fixtures are needed.
> +
> + The test suite is aware of the testbed (the SUT and TG) it's running
> on. From this, it can
> + properly choose the IP addresses and other configuration that must be
> tailored to the testbed.
> +
> + Attributes:
> + sut_node: The SUT node where the test suite is running.
> + tg_node: The TG node where the test suite is running.
> + is_blocking: Whether the test suite is blocking. A failure of a
> blocking test suite
> + will block the execution of all subsequent test suites in the
> current build target.
> """
>
Should this attribute section instead be comments in the form "#:" because
they are class variables instead of instance ones?
>
> sut_node: SutNode
> - is_blocking = False
> + tg_node: TGNode
> + is_blocking: bool = False
> _logger: DTSLOG
> _test_cases_to_run: list[str]
> _func: bool
> @@ -72,6 +102,19 @@ def __init__(
> func: bool,
> build_target_result: BuildTargetResult,
> ):
> + """Initialize the test suite testbed information and basic
> configuration.
> +
> + Process what test cases to run, create the associated
> :class:`TestSuiteResult`,
> + find links between ports and set up default IP addresses to be
> used when configuring them.
> +
> + Args:
> + sut_node: The SUT node where the test suite will run.
> + tg_node: The TG node where the test suite will run.
> + test_cases: The list of test cases to execute.
> + If empty, all test cases will be executed.
> + func: Whether to run functional tests.
> + build_target_result: The build target result this test suite
> is run in.
> + """
> self.sut_node = sut_node
> self.tg_node = tg_node
> self._logger = getLogger(self.__class__.__name__)
> @@ -95,6 +138,7 @@ def __init__(
> self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
>
> def _process_links(self) -> None:
> + """Construct links between SUT and TG ports."""
> for sut_port in self.sut_node.ports:
> for tg_port in self.tg_node.ports:
> if (sut_port.identifier, sut_port.peer) == (
> @@ -106,27 +150,42 @@ def _process_links(self) -> None:
> )
>
> def set_up_suite(self) -> None:
> - """
> - Set up test fixtures common to all test cases; this is done before
> - any test case is run.
> + """Set up test fixtures common to all test cases.
> +
> + This is done before any test case has been run.
> """
>
> def tear_down_suite(self) -> None:
> - """
> - Tear down the previously created test fixtures common to all test
> cases.
> + """Tear down the previously created test fixtures common to all
> test cases.
> +
> + This is done after all tests have been run.
> """
>
> def set_up_test_case(self) -> None:
> - """
> - Set up test fixtures before each test case.
> + """Set up test fixtures before each test case.
> +
> + This is done before *each* test case.
> """
>
> def tear_down_test_case(self) -> None:
> - """
> - Tear down the previously created test fixtures after each test
> case.
> + """Tear down the previously created test fixtures after each test
> case.
> +
> + This is done after *each* test case.
> """
>
> def configure_testbed_ipv4(self, restore: bool = False) -> None:
> + """Configure IPv4 addresses on all testbed ports.
> +
> + The configured ports are:
> +
> + * SUT ingress port,
> + * SUT egress port,
> + * TG ingress port,
> + * TG egress port.
> +
> + Args:
> + restore: If :data:`True`, will remove the configuration
> instead.
> + """
> delete = True if restore else False
> enable = False if restore else True
> self._configure_ipv4_forwarding(enable)
> @@ -153,11 +212,13 @@ def _configure_ipv4_forwarding(self, enable: bool)
> -> None:
> def send_packet_and_capture(
> self, packet: Packet, duration: float = 1
> ) -> list[Packet]:
> - """
> - Send a packet through the appropriate interface and
> - receive on the appropriate interface.
> - Modify the packet with l3/l2 addresses corresponding
> - to the testbed and desired traffic.
> + """Send and receive `packet` using the associated TG.
> +
> + Send `packet` through the appropriate interface and receive on
> the appropriate interface.
> + Modify the packet with l3/l2 addresses corresponding to the
> testbed and desired traffic.
> +
> + Returns:
> + A list of received packets.
> """
> packet = self._adjust_addresses(packet)
> return self.tg_node.send_packet_and_capture(
> @@ -165,13 +226,25 @@ def send_packet_and_capture(
> )
>
> def get_expected_packet(self, packet: Packet) -> Packet:
> + """Inject the proper L2/L3 addresses into `packet`.
> +
> + Args:
> + packet: The packet to modify.
> +
> + Returns:
> + `packet` with injected L2/L3 addresses.
> + """
> return self._adjust_addresses(packet, expected=True)
>
> def _adjust_addresses(self, packet: Packet, expected: bool = False)
> -> Packet:
> - """
> + """L2 and L3 address additions in both directions.
> +
> Assumptions:
> - Two links between SUT and TG, one link is TG -> SUT,
> - the other SUT -> TG.
> + Two links between SUT and TG, one link is TG -> SUT, the
> other SUT -> TG.
> +
> + Args:
> + packet: The packet to modify.
> + expected: If :data:`True`, the direction is SUT -> TG,
> otherwise the direction is TG -> SUT.
> """
> if expected:
> # The packet enters the TG from SUT
> @@ -197,6 +270,19 @@ def _adjust_addresses(self, packet: Packet, expected:
> bool = False) -> Packet:
> return Ether(packet.build())
>
> def verify(self, condition: bool, failure_description: str) -> None:
> + """Verify `condition` and handle failures.
> +
> + When `condition` is :data:`False`, raise an exception and log the
> last 10 commands
> + executed on both the SUT and TG.
> +
> + Args:
> + condition: The condition to check.
> + failure_description: A short description of the failure
> + that will be stored in the raised exception.
> +
> + Raises:
> + TestCaseVerifyError: `condition` is :data:`False`.
> + """
> if not condition:
> self._fail_test_case_verify(failure_description)
>
> @@ -216,6 +302,19 @@ def _fail_test_case_verify(self, failure_description:
> str) -> None:
> def verify_packets(
> self, expected_packet: Packet, received_packets: list[Packet]
> ) -> None:
> + """Verify that `expected_packet` has been received.
> +
> + Go through `received_packets` and check that `expected_packet` is
> among them.
> + If not, raise an exception and log the last 10 commands
> + executed on both the SUT and TG.
> +
> + Args:
> + expected_packet: The packet we're expecting to receive.
> + received_packets: The packets where we're looking for
> `expected_packet`.
> +
> + Raises:
> + TestCaseVerifyError: `expected_packet` is not among
> `received_packets`.
> + """
> for received_packet in received_packets:
> if self._compare_packets(expected_packet, received_packet):
> break
> @@ -303,10 +402,14 @@ def _verify_l3_packet(self, received_packet: IP,
> expected_packet: IP) -> bool:
> return True
>
> def run(self) -> None:
> - """
> - Setup, execute and teardown the whole suite.
> - Suite execution consists of running all test cases scheduled to
> be executed.
> - A test cast run consists of setup, execution and teardown of said
> test case.
> + """Set up, execute and tear down the whole suite.
> +
> + Test suite execution consists of running all test cases scheduled
> to be executed.
> + A test case run consists of setup, execution and teardown of said
> test case.
> +
> + Record the setup and the teardown and handle failures.
> +
> + The list of scheduled test cases is constructed when creating the
> :class:`TestSuite` object.
> """
> test_suite_name = self.__class__.__name__
>
> @@ -338,9 +441,7 @@ def run(self) -> None:
> raise BlockingTestSuiteError(test_suite_name)
>
> def _execute_test_suite(self) -> None:
> - """
> - Execute all test cases scheduled to be executed in this suite.
> - """
> + """Execute all test cases scheduled to be executed in this
> suite."""
> if self._func:
> for test_case_method in self._get_functional_test_cases():
> test_case_name = test_case_method.__name__
> @@ -357,14 +458,18 @@ def _execute_test_suite(self) -> None:
> self._run_test_case(test_case_method,
> test_case_result)
>
> def _get_functional_test_cases(self) -> list[MethodType]:
> - """
> - Get all functional test cases.
> + """Get all functional test cases defined in this TestSuite.
> +
> + Returns:
> + The list of functional test cases of this TestSuite.
> """
> return self._get_test_cases(r"test_(?!perf_)")
>
> def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
> - """
> - Return a list of test cases matching test_case_regex.
> + """Return a list of test cases matching test_case_regex.
> +
> + Returns:
> + The list of test cases matching test_case_regex of this
> TestSuite.
> """
> self._logger.debug(f"Searching for test cases in
> {self.__class__.__name__}.")
> filtered_test_cases = []
> @@ -378,9 +483,7 @@ def _get_test_cases(self, test_case_regex: str) ->
> list[MethodType]:
> return filtered_test_cases
>
> def _should_be_executed(self, test_case_name: str, test_case_regex:
> str) -> bool:
> - """
> - Check whether the test case should be executed.
> - """
> + """Check whether the test case should be scheduled to be
> executed."""
> match = bool(re.match(test_case_regex, test_case_name))
> if self._test_cases_to_run:
> return match and test_case_name in self._test_cases_to_run
> @@ -390,9 +493,9 @@ def _should_be_executed(self, test_case_name: str,
> test_case_regex: str) -> bool
> def _run_test_case(
> self, test_case_method: MethodType, test_case_result:
> TestCaseResult
> ) -> None:
> - """
> - Setup, execute and teardown a test case in this suite.
> - Exceptions are caught and recorded in logs and results.
> + """Setup, execute and teardown a test case in this suite.
> +
> + Record the result of the setup and the teardown and handle
> failures.
> """
> test_case_name = test_case_method.__name__
>
> @@ -427,9 +530,7 @@ def _run_test_case(
> def _execute_test_case(
> self, test_case_method: MethodType, test_case_result:
> TestCaseResult
> ) -> None:
> - """
> - Execute one test case and handle failures.
> - """
> + """Execute one test case, record the result and handle
> failures."""
> test_case_name = test_case_method.__name__
> try:
> self._logger.info(f"Starting test case execution:
> {test_case_name}")
> @@ -452,6 +553,18 @@ def _execute_test_case(
>
>
> def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
> + r"""Find all :class:`TestSuite`\s in a Python module.
> +
> + Args:
> + testsuite_module_path: The path to the Python module.
> +
> + Returns:
> + The list of :class:`TestSuite`\s found within the Python module.
> +
> + Raises:
> + ConfigurationError: The test suite module was not found.
> + """
> +
> def is_test_suite(object: Any) -> bool:
> try:
> if issubclass(object, TestSuite) and object is not TestSuite:
> --
> 2.34.1
>
>
* Re: [PATCH v7 08/21] dts: test suite docstring update
2023-11-16 22:16 ` Jeremy Spewock
@ 2023-11-20 16:25 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-20 16:25 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
On Thu, Nov 16, 2023 at 11:16 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
>
>
> On Wed, Nov 15, 2023 at 8:12 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> Format according to the Google format and PEP257, with slight
>> deviations.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> dts/framework/test_suite.py | 223 +++++++++++++++++++++++++++---------
>> 1 file changed, 168 insertions(+), 55 deletions(-)
>>
>> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
>> index d53553bf34..9e5251ffc6 100644
>> --- a/dts/framework/test_suite.py
>> +++ b/dts/framework/test_suite.py
>> @@ -2,8 +2,19 @@
>> # Copyright(c) 2010-2014 Intel Corporation
>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>>
>> -"""
>> -Base class for creating DTS test cases.
>> +"""Features common to all test suites.
>> +
>> +The module defines the :class:`TestSuite` class which doesn't contain any test cases, and as such
>> +must be extended by subclasses which add test cases. The :class:`TestSuite` contains the basics
>> +needed by subclasses:
>> +
>> + * Test suite and test case execution flow,
>> + * Testbed (SUT, TG) configuration,
>> + * Packet sending and verification,
>> + * Test case verification.
>> +
>> +The module also defines a function, :func:`get_test_suites`,
>> +for gathering test suites from a Python module.
>> """
>>
>> import importlib
>> @@ -31,25 +42,44 @@
>>
>>
>> class TestSuite(object):
>> - """
>> - The base TestSuite class provides methods for handling basic flow of a test suite:
>> - * test case filtering and collection
>> - * test suite setup/cleanup
>> - * test setup/cleanup
>> - * test case execution
>> - * error handling and results storage
>> - Test cases are implemented by derived classes. Test cases are all methods
>> - starting with test_, further divided into performance test cases
>> - (starting with test_perf_) and functional test cases (all other test cases).
>> - By default, all test cases will be executed. A list of testcase str names
>> - may be specified in conf.yaml or on the command line
>> - to filter which test cases to run.
>> - The methods named [set_up|tear_down]_[suite|test_case] should be overridden
>> - in derived classes if the appropriate suite/test case fixtures are needed.
>> + """The base class with methods for handling the basic flow of a test suite.
>> +
>> + * Test case filtering and collection,
>> + * Test suite setup/cleanup,
>> + * Test setup/cleanup,
>> + * Test case execution,
>> + * Error handling and results storage.
>> +
>> + Test cases are implemented by subclasses. Test cases are all methods starting with ``test_``,
>> + further divided into performance test cases (starting with ``test_perf_``)
>> + and functional test cases (all other test cases).
>> +
>> + By default, all test cases will be executed. A list of testcase names may be specified
>> + in the YAML test run configuration file and in the :option:`--test-cases` command line argument
>> + or in the :envvar:`DTS_TESTCASES` environment variable to filter which test cases to run.
>> + The union of both lists will be used. Any unknown test cases from the latter lists
>> + will be silently ignored.
>> +
>> + If the :option:`--re-run` command line argument or the :envvar:`DTS_RERUN` environment variable
>> + is set, in case of a test case failure, the test case will be executed again until it passes
>> + or it fails that many times in addition to the first failure.
>> +
>> + The methods named ``[set_up|tear_down]_[suite|test_case]`` should be overridden in subclasses
>> + if the appropriate test suite/test case fixtures are needed.
>> +
>> + The test suite is aware of the testbed (the SUT and TG) it's running on. From this, it can
>> + properly choose the IP addresses and other configuration that must be tailored to the testbed.
>> +
>> + Attributes:
>> + sut_node: The SUT node where the test suite is running.
>> + tg_node: The TG node where the test suite is running.
>> + is_blocking: Whether the test suite is blocking. A failure of a blocking test suite
>> + will block the execution of all subsequent test suites in the current build target.
>> """
>
>
> Should this attribute section instead be comments in the form "#:" because they are class variables instead of instance ones?
>
Yes and no. The first two are not class variables, but the last one
is, so I'll change is_blocking to ClassVar. Thankfully the resulting
generated docs look just fine: the instance variables are listed
first, then the class variables.
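
The distinction discussed above can be sketched as follows (node types are
simplified to plain strings here for brevity; the real attributes are
``SutNode``/``TGNode`` instances):

```python
from typing import ClassVar


class TestSuite:
    # Instance attributes: annotated only, assigned per-object in __init__.
    sut_node: str
    tg_node: str
    #: Class variable: documented with a ``#:`` comment and marked with
    #: ClassVar so type checkers and Sphinx treat it as class-level state.
    is_blocking: ClassVar[bool] = False

    def __init__(self, sut_node: str, tg_node: str) -> None:
        self.sut_node = sut_node
        self.tg_node = tg_node


class BlockingSuite(TestSuite):
    is_blocking = True
```

Overriding ``is_blocking`` in a subclass changes the class-level value without
ever creating an instance attribute of the same name.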
>>
>>
>> sut_node: SutNode
>> - is_blocking = False
>> + tg_node: TGNode
>> + is_blocking: bool = False
>> _logger: DTSLOG
>> _test_cases_to_run: list[str]
>> _func: bool
>> @@ -72,6 +102,19 @@ def __init__(
>> func: bool,
>> build_target_result: BuildTargetResult,
>> ):
>> + """Initialize the test suite testbed information and basic configuration.
>> +
>> + Process what test cases to run, create the associated :class:`TestSuiteResult`,
>> + find links between ports and set up default IP addresses to be used when configuring them.
>> +
>> + Args:
>> + sut_node: The SUT node where the test suite will run.
>> + tg_node: The TG node where the test suite will run.
>> + test_cases: The list of test cases to execute.
>> + If empty, all test cases will be executed.
>> + func: Whether to run functional tests.
>> + build_target_result: The build target result this test suite is run in.
>> + """
>> self.sut_node = sut_node
>> self.tg_node = tg_node
>> self._logger = getLogger(self.__class__.__name__)
>> @@ -95,6 +138,7 @@ def __init__(
>> self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
>>
>> def _process_links(self) -> None:
>> + """Construct links between SUT and TG ports."""
>> for sut_port in self.sut_node.ports:
>> for tg_port in self.tg_node.ports:
>> if (sut_port.identifier, sut_port.peer) == (
>> @@ -106,27 +150,42 @@ def _process_links(self) -> None:
>> )
>>
>> def set_up_suite(self) -> None:
>> - """
>> - Set up test fixtures common to all test cases; this is done before
>> - any test case is run.
>> + """Set up test fixtures common to all test cases.
>> +
>> + This is done before any test case has been run.
>> """
>>
>> def tear_down_suite(self) -> None:
>> - """
>> - Tear down the previously created test fixtures common to all test cases.
>> + """Tear down the previously created test fixtures common to all test cases.
>> +
>> + This is done after all tests have been run.
>> """
>>
>> def set_up_test_case(self) -> None:
>> - """
>> - Set up test fixtures before each test case.
>> + """Set up test fixtures before each test case.
>> +
>> + This is done before *each* test case.
>> """
>>
>> def tear_down_test_case(self) -> None:
>> - """
>> - Tear down the previously created test fixtures after each test case.
>> + """Tear down the previously created test fixtures after each test case.
>> +
>> + This is done after *each* test case.
>> """
>>
>> def configure_testbed_ipv4(self, restore: bool = False) -> None:
>> + """Configure IPv4 addresses on all testbed ports.
>> +
>> + The configured ports are:
>> +
>> + * SUT ingress port,
>> + * SUT egress port,
>> + * TG ingress port,
>> + * TG egress port.
>> +
>> + Args:
>> + restore: If :data:`True`, will remove the configuration instead.
>> + """
>> delete = True if restore else False
>> enable = False if restore else True
>> self._configure_ipv4_forwarding(enable)
>> @@ -153,11 +212,13 @@ def _configure_ipv4_forwarding(self, enable: bool) -> None:
>> def send_packet_and_capture(
>> self, packet: Packet, duration: float = 1
>> ) -> list[Packet]:
>> - """
>> - Send a packet through the appropriate interface and
>> - receive on the appropriate interface.
>> - Modify the packet with l3/l2 addresses corresponding
>> - to the testbed and desired traffic.
>> + """Send and receive `packet` using the associated TG.
>> +
>> + Send `packet` through the appropriate interface and receive on the appropriate interface.
>> + Modify the packet with l3/l2 addresses corresponding to the testbed and desired traffic.
>> +
>> + Returns:
>> + A list of received packets.
>> """
>> packet = self._adjust_addresses(packet)
>> return self.tg_node.send_packet_and_capture(
>> @@ -165,13 +226,25 @@ def send_packet_and_capture(
>> )
>>
>> def get_expected_packet(self, packet: Packet) -> Packet:
>> + """Inject the proper L2/L3 addresses into `packet`.
>> +
>> + Args:
>> + packet: The packet to modify.
>> +
>> + Returns:
>> + `packet` with injected L2/L3 addresses.
>> + """
>> return self._adjust_addresses(packet, expected=True)
>>
>> def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
>> - """
>> + """L2 and L3 address additions in both directions.
>> +
>> Assumptions:
>> - Two links between SUT and TG, one link is TG -> SUT,
>> - the other SUT -> TG.
>> + Two links between SUT and TG, one link is TG -> SUT, the other SUT -> TG.
>> +
>> + Args:
>> + packet: The packet to modify.
>> + expected: If :data:`True`, the direction is SUT -> TG, otherwise the direction is TG -> SUT.
>> """
>> if expected:
>> # The packet enters the TG from SUT
>> @@ -197,6 +270,19 @@ def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
>> return Ether(packet.build())
>>
>> def verify(self, condition: bool, failure_description: str) -> None:
>> + """Verify `condition` and handle failures.
>> +
>> + When `condition` is :data:`False`, raise an exception and log the last 10 commands
>> + executed on both the SUT and TG.
>> +
>> + Args:
>> + condition: The condition to check.
>> + failure_description: A short description of the failure
>> + that will be stored in the raised exception.
>> +
>> + Raises:
>> + TestCaseVerifyError: `condition` is :data:`False`.
>> + """
>> if not condition:
>> self._fail_test_case_verify(failure_description)
>>
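
A minimal, self-contained sketch of the verify pattern documented above; the
exception name comes from the docstring's Raises section, while the command
logging noted there is reduced to a comment:

```python
class TestCaseVerifyError(Exception):
    """Raised when a test case's verify() condition is False."""


def verify(condition: bool, failure_description: str) -> None:
    """Raise TestCaseVerifyError carrying `failure_description` on failure."""
    if not condition:
        # The real method also logs the last 10 commands executed
        # on both the SUT and TG before raising.
        raise TestCaseVerifyError(failure_description)
```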
>> @@ -216,6 +302,19 @@ def _fail_test_case_verify(self, failure_description: str) -> None:
>> def verify_packets(
>> self, expected_packet: Packet, received_packets: list[Packet]
>> ) -> None:
>> + """Verify that `expected_packet` has been received.
>> +
>> + Go through `received_packets` and check that `expected_packet` is among them.
>> + If not, raise an exception and log the last 10 commands
>> + executed on both the SUT and TG.
>> +
>> + Args:
>> + expected_packet: The packet we're expecting to receive.
>> + received_packets: The packets where we're looking for `expected_packet`.
>> +
>> + Raises:
>> + TestCaseVerifyError: `expected_packet` is not among `received_packets`.
>> + """
>> for received_packet in received_packets:
>> if self._compare_packets(expected_packet, received_packet):
>> break
>> @@ -303,10 +402,14 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
>> return True
>>
>> def run(self) -> None:
>> - """
>> - Setup, execute and teardown the whole suite.
>> - Suite execution consists of running all test cases scheduled to be executed.
>> - A test cast run consists of setup, execution and teardown of said test case.
>> + """Set up, execute and tear down the whole suite.
>> +
>> + Test suite execution consists of running all test cases scheduled to be executed.
>> + A test case run consists of setup, execution and teardown of said test case.
>> +
>> + Record the setup and the teardown and handle failures.
>> +
>> + The list of scheduled test cases is constructed when creating the :class:`TestSuite` object.
>> """
>> test_suite_name = self.__class__.__name__
>>
>> @@ -338,9 +441,7 @@ def run(self) -> None:
>> raise BlockingTestSuiteError(test_suite_name)
>>
>> def _execute_test_suite(self) -> None:
>> - """
>> - Execute all test cases scheduled to be executed in this suite.
>> - """
>> + """Execute all test cases scheduled to be executed in this suite."""
>> if self._func:
>> for test_case_method in self._get_functional_test_cases():
>> test_case_name = test_case_method.__name__
>> @@ -357,14 +458,18 @@ def _execute_test_suite(self) -> None:
>> self._run_test_case(test_case_method, test_case_result)
>>
>> def _get_functional_test_cases(self) -> list[MethodType]:
>> - """
>> - Get all functional test cases.
>> + """Get all functional test cases defined in this TestSuite.
>> +
>> + Returns:
>> + The list of functional test cases of this TestSuite.
>> """
>> return self._get_test_cases(r"test_(?!perf_)")
>>
>> def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
>> - """
>> - Return a list of test cases matching test_case_regex.
>> + """Return a list of test cases matching test_case_regex.
>> +
>> + Returns:
>> + The list of test cases matching test_case_regex of this TestSuite.
>> """
>> self._logger.debug(f"Searching for test cases in {self.__class__.__name__}.")
>> filtered_test_cases = []
>> @@ -378,9 +483,7 @@ def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
>> return filtered_test_cases
>>
>> def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool:
>> - """
>> - Check whether the test case should be executed.
>> - """
>> + """Check whether the test case should be scheduled to be executed."""
>> match = bool(re.match(test_case_regex, test_case_name))
>> if self._test_cases_to_run:
>> return match and test_case_name in self._test_cases_to_run
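
The filtering performed by ``_get_test_cases`` and ``_should_be_executed`` can
be reduced to a runnable sketch (the ``filter_test_cases`` helper is
hypothetical; the two regexes match the patch):

```python
import re

# The split described above: functional test cases start with ``test_``
# but not ``test_perf_``; performance test cases start with ``test_perf_``.
FUNC_RE = r"test_(?!perf_)"
PERF_RE = r"test_perf_"


def filter_test_cases(
    names: list[str], test_case_regex: str, test_cases_to_run: list[str]
) -> list[str]:
    """Keep names matching the regex; a non-empty allow-list narrows them further."""
    matched = [name for name in names if re.match(test_case_regex, name)]
    if test_cases_to_run:
        return [name for name in matched if name in test_cases_to_run]
    return matched
```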
>> @@ -390,9 +493,9 @@ def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool
>> def _run_test_case(
>> self, test_case_method: MethodType, test_case_result: TestCaseResult
>> ) -> None:
>> - """
>> - Setup, execute and teardown a test case in this suite.
>> - Exceptions are caught and recorded in logs and results.
>> + """Setup, execute and teardown a test case in this suite.
>> +
>> + Record the result of the setup and the teardown and handle failures.
>> """
>> test_case_name = test_case_method.__name__
>>
>> @@ -427,9 +530,7 @@ def _run_test_case(
>> def _execute_test_case(
>> self, test_case_method: MethodType, test_case_result: TestCaseResult
>> ) -> None:
>> - """
>> - Execute one test case and handle failures.
>> - """
>> + """Execute one test case, record the result and handle failures."""
>> test_case_name = test_case_method.__name__
>> try:
>> self._logger.info(f"Starting test case execution: {test_case_name}")
>> @@ -452,6 +553,18 @@ def _execute_test_case(
>>
>>
>> def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
>> + r"""Find all :class:`TestSuite`\s in a Python module.
>> +
>> + Args:
>> + testsuite_module_path: The path to the Python module.
>> +
>> + Returns:
>> + The list of :class:`TestSuite`\s found within the Python module.
>> +
>> + Raises:
>> + ConfigurationError: The test suite module was not found.
>> + """
>> +
>> def is_test_suite(object: Any) -> bool:
>> try:
>> if issubclass(object, TestSuite) and object is not TestSuite:
>> --
>> 2.34.1
>>
* [PATCH v7 09/21] dts: test result docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (7 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 08/21] dts: test suite " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-16 22:47 ` Jeremy Spewock
2023-11-15 13:09 ` [PATCH v7 10/21] dts: config " Juraj Linkeš
` (12 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_result.py | 292 ++++++++++++++++++++++++++++-------
1 file changed, 234 insertions(+), 58 deletions(-)
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 603e18872c..05e210f6e7 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -2,8 +2,25 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-Generic result container and reporters
+r"""Record and process DTS results.
+
+The results are recorded in a hierarchical manner:
+
+ * :class:`DTSResult` contains
+ * :class:`ExecutionResult` contains
+ * :class:`BuildTargetResult` contains
+ * :class:`TestSuiteResult` contains
+ * :class:`TestCaseResult`
+
+Each result may contain multiple lower level results, e.g. there are multiple
+:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
+The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
+which also defines some common behaviors in its methods.
+
+Each result class has its own idiosyncrasies which they implement in overridden methods.
+
+The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+variable modify the directory where the files with results will be stored.
"""
import os.path
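
The containment hierarchy listed in the module docstring can be sketched as a
recursive structure (the ``add_child`` helper is hypothetical; the real classes
expose dedicated, typed methods for each level):

```python
class BaseResult:
    """Common behavior: each result stores its child results recursively."""

    def __init__(self) -> None:
        self._inner_results: list["BaseResult"] = []

    def add_child(self, child: "BaseResult") -> "BaseResult":
        self._inner_results.append(child)
        return child


class DTSResult(BaseResult):
    pass


class ExecutionResult(BaseResult):
    pass


class BuildTargetResult(BaseResult):
    pass


class TestSuiteResult(BaseResult):
    pass


class TestCaseResult(BaseResult):
    pass
```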
@@ -26,26 +43,34 @@
class Result(Enum):
- """
- An Enum defining the possible states that
- a setup, a teardown or a test case may end up in.
- """
+ """The possible states that a setup, a teardown or a test case may end up in."""
+ #:
PASS = auto()
+ #:
FAIL = auto()
+ #:
ERROR = auto()
+ #:
SKIP = auto()
def __bool__(self) -> bool:
+ """Only PASS is True."""
return self is self.PASS
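
Extracted as a self-contained snippet, the enum above behaves like this:

```python
from enum import Enum, auto


class Result(Enum):
    """The possible states that a setup, a teardown or a test case may end up in."""

    PASS = auto()
    FAIL = auto()
    ERROR = auto()
    SKIP = auto()

    def __bool__(self) -> bool:
        # Only PASS is truthy, so `if result:` reads as "if it passed".
        return self is Result.PASS
```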
class FixtureResult(object):
- """
- A record that stored the result of a setup or a teardown.
- The default is FAIL because immediately after creating the object
- the setup of the corresponding stage will be executed, which also guarantees
- the execution of teardown.
+ """A record that stores the result of a setup or a teardown.
+
+ FAIL is a sensible default since it prevents false positives
+ (which could happen if the default was PASS).
+
+ Preventing false positives or other false results is preferable since a failure
+ is most likely to be investigated (the other false results may not be investigated at all).
+
+ Attributes:
+ result: The associated result.
+ error: The error in case of a failure.
"""
result: Result
@@ -56,21 +81,32 @@ def __init__(
result: Result = Result.FAIL,
error: Exception | None = None,
):
+ """Initialize the constructor with the fixture result and store a possible error.
+
+ Args:
+ result: The result to store.
+ error: The error which happened when a failure occurred.
+ """
self.result = result
self.error = error
def __bool__(self) -> bool:
+ """A wrapper around the stored :class:`Result`."""
return bool(self.result)
class Statistics(dict):
- """
- A helper class used to store the number of test cases by its result
- along a few other basic information.
- Using a dict provides a convenient way to format the data.
+ """How many test cases ended in which result state along some other basic information.
+
+ Subclassing :class:`dict` provides a convenient way to format the data.
"""
def __init__(self, dpdk_version: str | None):
+ """Extend the constructor with relevant keys.
+
+ Args:
+ dpdk_version: The version of tested DPDK.
+ """
super(Statistics, self).__init__()
for result in Result:
self[result.name] = 0
@@ -78,8 +114,17 @@ def __init__(self, dpdk_version: str | None):
self["DPDK VERSION"] = dpdk_version
def __iadd__(self, other: Result) -> "Statistics":
- """
- Add a Result to the final count.
+ """Add a Result to the final count.
+
+ Example:
+ stats: Statistics = Statistics(None) # empty Statistics
+ stats += Result.PASS # add a Result to `stats`
+
+ Args:
+ other: The Result to add to this statistics object.
+
+ Returns:
+ The modified statistics object.
"""
self[other.name] += 1
self["PASS RATE"] = (
@@ -90,9 +135,7 @@ def __iadd__(self, other: Result) -> "Statistics":
return self
def __str__(self) -> str:
- """
- Provide a string representation of the data.
- """
+ """Each line contains the formatted key = value pair."""
stats_str = ""
for key, value in self.items():
stats_str += f"{key:<12} = {value}\n"
@@ -102,10 +145,16 @@ def __str__(self) -> str:
class BaseResult(object):
- """
- The Base class for all results. Stores the results of
- the setup and teardown portions of the corresponding stage
- and a list of results from each inner stage in _inner_results.
+ """Common data and behavior of DTS results.
+
+ Stores the results of the setup and teardown portions of the corresponding stage.
+ The hierarchical nature of DTS results is captured recursively in an internal list.
+ A stage is each level in this particular hierarchy (pre-execution or the top-most level,
+ execution, build target, test suite and test case).
+
+ Attributes:
+ setup_result: The result of the setup of the particular stage.
+ teardown_result: The results of the teardown of the particular stage.
"""
setup_result: FixtureResult
@@ -113,15 +162,28 @@ class BaseResult(object):
_inner_results: MutableSequence["BaseResult"]
def __init__(self):
+ """Initialize the constructor."""
self.setup_result = FixtureResult()
self.teardown_result = FixtureResult()
self._inner_results = []
def update_setup(self, result: Result, error: Exception | None = None) -> None:
+ """Store the setup result.
+
+ Args:
+ result: The result of the setup.
+ error: The error that occurred in case of a failure.
+ """
self.setup_result.result = result
self.setup_result.error = error
def update_teardown(self, result: Result, error: Exception | None = None) -> None:
+ """Store the teardown result.
+
+ Args:
+ result: The result of the teardown.
+ error: The error that occurred in case of a failure.
+ """
self.teardown_result.result = result
self.teardown_result.error = error
@@ -141,27 +203,55 @@ def _get_inner_errors(self) -> list[Exception]:
]
def get_errors(self) -> list[Exception]:
+ """Compile errors from the whole result hierarchy.
+
+ Returns:
+ The errors from setup, teardown and all errors found in the whole result hierarchy.
+ """
return self._get_setup_teardown_errors() + self._get_inner_errors()
def add_stats(self, statistics: Statistics) -> None:
+ """Collate stats from the whole result hierarchy.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be collated.
+ """
for inner_result in self._inner_results:
inner_result.add_stats(statistics)
class TestCaseResult(BaseResult, FixtureResult):
- """
- The test case specific result.
- Stores the result of the actual test case.
- Also stores the test case name.
+ r"""The test case specific result.
+
+ Stores the result of the actual test case. This is done by adding an extra superclass
+ in :class:`FixtureResult`. The setup and teardown results are :class:`FixtureResult`\s and
+ the class is itself a record of the test case.
+
+ Attributes:
+ test_case_name: The test case name.
"""
test_case_name: str
def __init__(self, test_case_name: str):
+ """Extend the constructor with `test_case_name`.
+
+ Args:
+ test_case_name: The test case's name.
+ """
super(TestCaseResult, self).__init__()
self.test_case_name = test_case_name
def update(self, result: Result, error: Exception | None = None) -> None:
+ """Update the test case result.
+
+ This updates the result of the test case itself and doesn't affect
+ the results of the setup and teardown steps in any way.
+
+ Args:
+ result: The result of the test case.
+ error: The error that occurred in case of a failure.
+ """
self.result = result
self.error = error
@@ -171,38 +261,66 @@ def _get_inner_errors(self) -> list[Exception]:
return []
def add_stats(self, statistics: Statistics) -> None:
+ r"""Add the test case result to statistics.
+
+ The base method goes through the hierarchy recursively and this method is here to stop
+ the recursion, as the :class:`TestCaseResult`\s are the leaves of the hierarchy tree.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be added.
+ """
statistics += self.result
def __bool__(self) -> bool:
+ """The test case passed only if setup, teardown and the test case itself passed."""
return (
bool(self.setup_result) and bool(self.teardown_result) and bool(self.result)
)
class TestSuiteResult(BaseResult):
- """
- The test suite specific result.
- The _inner_results list stores results of test cases in a given test suite.
- Also stores the test suite name.
+ """The test suite specific result.
+
+ The internal list stores the results of all test cases in a given test suite.
+
+ Attributes:
+ suite_name: The test suite name.
"""
suite_name: str
def __init__(self, suite_name: str):
+ """Extend the constructor with `suite_name`.
+
+ Args:
+ suite_name: The test suite's name.
+ """
super(TestSuiteResult, self).__init__()
self.suite_name = suite_name
def add_test_case(self, test_case_name: str) -> TestCaseResult:
+ """Add and return the inner result (test case).
+
+ Returns:
+ The test case's result.
+ """
test_case_result = TestCaseResult(test_case_name)
self._inner_results.append(test_case_result)
return test_case_result
class BuildTargetResult(BaseResult):
- """
- The build target specific result.
- The _inner_results list stores results of test suites in a given build target.
- Also stores build target specifics, such as compiler used to build DPDK.
+ """The build target specific result.
+
+ The internal list stores the results of all test suites in a given build target.
+
+ Attributes:
+ arch: The DPDK build target architecture.
+ os: The DPDK build target operating system.
+ cpu: The DPDK build target CPU.
+ compiler: The DPDK build target compiler.
+ compiler_version: The DPDK build target compiler version.
+ dpdk_version: The built DPDK version.
"""
arch: Architecture
@@ -213,6 +331,11 @@ class BuildTargetResult(BaseResult):
dpdk_version: str | None
def __init__(self, build_target: BuildTargetConfiguration):
+ """Extend the constructor with the `build_target`'s build target config.
+
+ Args:
+ build_target: The build target's test run configuration.
+ """
super(BuildTargetResult, self).__init__()
self.arch = build_target.arch
self.os = build_target.os
@@ -222,20 +345,35 @@ def __init__(self, build_target: BuildTargetConfiguration):
self.dpdk_version = None
def add_build_target_info(self, versions: BuildTargetInfo) -> None:
+ """Add information about the build target gathered at runtime.
+
+ Args:
+ versions: The additional information.
+ """
self.compiler_version = versions.compiler_version
self.dpdk_version = versions.dpdk_version
def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
+ """Add and return the inner result (test suite).
+
+ Returns:
+ The test suite's result.
+ """
test_suite_result = TestSuiteResult(test_suite_name)
self._inner_results.append(test_suite_result)
return test_suite_result
class ExecutionResult(BaseResult):
- """
- The execution specific result.
- The _inner_results list stores results of build targets in a given execution.
- Also stores the SUT node configuration.
+ """The execution specific result.
+
+ The internal list stores the results of all build targets in a given execution.
+
+ Attributes:
+ sut_node: The SUT node used in the execution.
+ sut_os_name: The operating system of the SUT node.
+ sut_os_version: The operating system version of the SUT node.
+ sut_kernel_version: The operating system kernel version of the SUT node.
"""
sut_node: NodeConfiguration
@@ -244,36 +382,55 @@ class ExecutionResult(BaseResult):
sut_kernel_version: str
def __init__(self, sut_node: NodeConfiguration):
+ """Extend the constructor with the `sut_node`'s config.
+
+ Args:
+ sut_node: The SUT node's test run configuration used in the execution.
+ """
super(ExecutionResult, self).__init__()
self.sut_node = sut_node
def add_build_target(
self, build_target: BuildTargetConfiguration
) -> BuildTargetResult:
+ """Add and return the inner result (build target).
+
+ Args:
+ build_target: The build target's test run configuration.
+
+ Returns:
+ The build target's result.
+ """
build_target_result = BuildTargetResult(build_target)
self._inner_results.append(build_target_result)
return build_target_result
def add_sut_info(self, sut_info: NodeInfo) -> None:
+ """Add SUT information gathered at runtime.
+
+ Args:
+ sut_info: The additional SUT node information.
+ """
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
class DTSResult(BaseResult):
- """
- Stores environment information and test results from a DTS run, which are:
- * Execution level information, such as SUT and TG hardware.
- * Build target level information, such as compiler, target OS and cpu.
- * Test suite results.
- * All errors that are caught and recorded during DTS execution.
+ """Stores environment information and test results from a DTS run.
- The information is stored in nested objects.
+ * Execution level information, such as testbed and the test suite list,
+ * Build target level information, such as compiler, target OS and cpu,
+ * Test suite and test case results,
+ * All errors that are caught and recorded during DTS execution.
- The class is capable of computing the return code used to exit DTS with
- from the stored error.
+ The information is stored hierarchically. This is the first level of the hierarchy
+ and as such is where the data from the whole hierarchy is collated or processed.
- It also provides a brief statistical summary of passed/failed test cases.
+ The internal list stores the results of all executions.
+
+ Attributes:
+ dpdk_version: The DPDK version to record.
"""
dpdk_version: str | None
@@ -284,6 +441,11 @@ class DTSResult(BaseResult):
_stats_filename: str
def __init__(self, logger: DTSLOG):
+ """Extend the constructor with top-level specifics.
+
+ Args:
+ logger: The logger instance the whole result will use.
+ """
super(DTSResult, self).__init__()
self.dpdk_version = None
self._logger = logger
@@ -293,21 +455,33 @@ def __init__(self, logger: DTSLOG):
self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
+ """Add and return the inner result (execution).
+
+ Args:
+ sut_node: The SUT node's test run configuration.
+
+ Returns:
+ The execution's result.
+ """
execution_result = ExecutionResult(sut_node)
self._inner_results.append(execution_result)
return execution_result
def add_error(self, error: Exception) -> None:
+ """Record an error that occurred outside any execution.
+
+ Args:
+ error: The exception to record.
+ """
self._errors.append(error)
def process(self) -> None:
- """
- Process the data after a DTS run.
- The data is added to nested objects during runtime and this parent object
- is not updated at that time. This requires us to process the nested data
- after it's all been gathered.
+ """Process the data after a whole DTS run.
+
+ The data is added to inner objects during runtime and this object is not updated
+ at that time. This requires us to process the inner data after it's all been gathered.
- The processing gathers all errors and the result statistics of test cases.
+ The processing gathers all errors and the statistics of test case results.
"""
self._errors += self.get_errors()
if self._errors and self._logger:
@@ -321,8 +495,10 @@ def process(self) -> None:
stats_file.write(str(self._stats_result))
def get_return_code(self) -> int:
- """
- Go through all stored Exceptions and return the highest error code found.
+ """Go through all stored Exceptions and return the final DTS error code.
+
+ Returns:
+ The highest error code found.
"""
for error in self._errors:
error_return_code = ErrorSeverity.GENERIC_ERR
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 09/21] dts: test result docstring update
2023-11-15 13:09 ` [PATCH v7 09/21] dts: test result " Juraj Linkeš
@ 2023-11-16 22:47 ` Jeremy Spewock
2023-11-20 16:33 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-11-16 22:47 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
[-- Attachment #1: Type: text/plain, Size: 21137 bytes --]
The only comments I had on this were a few places where I think the
attribute sections should be class variables instead. I tried to mark all
of the places I saw it; it could be that, because of the way these classes
are subclassed, they are documented differently, but I'm unsure.
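For reference, a minimal, illustrative sketch of the two styles in question (the class and field names here are made up, not from the patch): the Google-style ``Attributes:`` section rendered by Sphinx's napoleon extension versus ``#:`` comments that Sphinx autodoc attaches directly to class variables:

```python
class Example:
    """Demo of the two documentation styles.

    Attributes:
        counted: Documented via the ``Attributes:`` section in the class
            docstring, which the napoleon extension renders.
    """

    #: Documented via a ``#:`` comment, which Sphinx autodoc attaches
    #: directly to the ``flagged`` class variable.
    flagged: int = 0

    counted: int = 0


# The annotations are identical either way; only the way Sphinx renders
# the documentation differs.
names = sorted(Example.__annotations__)
print(names)
```

Either style shows up in the generated API docs; the ``#:`` form keeps the description next to the variable, while the ``Attributes:`` section keeps everything in the docstring.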
On Wed, Nov 15, 2023 at 8:12 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/test_result.py | 292 ++++++++++++++++++++++++++++-------
> 1 file changed, 234 insertions(+), 58 deletions(-)
>
> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> index 603e18872c..05e210f6e7 100644
> --- a/dts/framework/test_result.py
> +++ b/dts/framework/test_result.py
> @@ -2,8 +2,25 @@
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2023 University of New Hampshire
>
> -"""
> -Generic result container and reporters
> +r"""Record and process DTS results.
> +
> +The results are recorded in a hierarchical manner:
> +
> + * :class:`DTSResult` contains
> + * :class:`ExecutionResult` contains
> + * :class:`BuildTargetResult` contains
> + * :class:`TestSuiteResult` contains
> + * :class:`TestCaseResult`
> +
> +Each result may contain multiple lower level results, e.g. there are
> multiple
> +:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
> +The results have common parts, such as setup and teardown results,
> captured in :class:`BaseResult`,
> +which also defines some common behaviors in its methods.
> +
> +Each result class has its own idiosyncrasies, which it implements in
> overridden methods.
> +
> +The :option:`--output` command line argument and the
> :envvar:`DTS_OUTPUT_DIR` environment
> +variable modify the directory where the files with results will be stored.
> """
>
> import os.path
> @@ -26,26 +43,34 @@
>
>
> class Result(Enum):
> - """
> - An Enum defining the possible states that
> - a setup, a teardown or a test case may end up in.
> - """
> + """The possible states that a setup, a teardown or a test case may
> end up in."""
>
> + #:
> PASS = auto()
> + #:
> FAIL = auto()
> + #:
> ERROR = auto()
> + #:
> SKIP = auto()
>
> def __bool__(self) -> bool:
> + """Only PASS is True."""
> return self is self.PASS
>
>
> class FixtureResult(object):
> - """
> - A record that stored the result of a setup or a teardown.
> - The default is FAIL because immediately after creating the object
> - the setup of the corresponding stage will be executed, which also
> guarantees
> - the execution of teardown.
> + """A record that stores the result of a setup or a teardown.
> +
> + FAIL is a sensible default since it prevents false positives
> + (which could happen if the default was PASS).
> +
> + Preventing false positives or other false results is preferable since
> a failure
> + is most likely to be investigated (the other false results may not
> be investigated at all).
> +
> + Attributes:
> + result: The associated result.
> + error: The error in case of a failure.
> """
>
I think the items in the attributes section should instead be "#:" because
they are class variables.
>
> result: Result
> @@ -56,21 +81,32 @@ def __init__(
> result: Result = Result.FAIL,
> error: Exception | None = None,
> ):
> + """Initialize the constructor with the fixture result and store a
> possible error.
> +
> + Args:
> + result: The result to store.
> + error: The error which happened when a failure occurred.
> + """
> self.result = result
> self.error = error
>
> def __bool__(self) -> bool:
> + """A wrapper around the stored :class:`Result`."""
> return bool(self.result)
>
>
> class Statistics(dict):
> - """
> - A helper class used to store the number of test cases by its result
> - along a few other basic information.
> - Using a dict provides a convenient way to format the data.
> + """How many test cases ended in which result state along some other
> basic information.
> +
> + Subclassing :class:`dict` provides a convenient way to format the
> data.
> """
>
> def __init__(self, dpdk_version: str | None):
> + """Extend the constructor with relevant keys.
> +
> + Args:
> + dpdk_version: The version of tested DPDK.
> + """
>
Should we maybe mark the "PASS RATE" and the "DPDK VERSION" as instance
variables of the class?
> super(Statistics, self).__init__()
> for result in Result:
> self[result.name] = 0
> @@ -78,8 +114,17 @@ def __init__(self, dpdk_version: str | None):
> self["DPDK VERSION"] = dpdk_version
>
> def __iadd__(self, other: Result) -> "Statistics":
> - """
> - Add a Result to the final count.
> + """Add a Result to the final count.
> +
> + Example:
> + stats: Statistics = Statistics(None) # empty Statistics
> + stats += Result.PASS # add a Result to `stats`
> +
> + Args:
> + other: The Result to add to this statistics object.
> +
> + Returns:
> + The modified statistics object.
> """
> self[other.name] += 1
> self["PASS RATE"] = (
> @@ -90,9 +135,7 @@ def __iadd__(self, other: Result) -> "Statistics":
> return self
>
> def __str__(self) -> str:
> - """
> - Provide a string representation of the data.
> - """
> + """Each line contains the formatted key = value pair."""
> stats_str = ""
> for key, value in self.items():
> stats_str += f"{key:<12} = {value}\n"
> @@ -102,10 +145,16 @@ def __str__(self) -> str:
>
>
> class BaseResult(object):
> - """
> - The Base class for all results. Stores the results of
> - the setup and teardown portions of the corresponding stage
> - and a list of results from each inner stage in _inner_results.
> + """Common data and behavior of DTS results.
> +
> + Stores the results of the setup and teardown portions of the
> corresponding stage.
> + The hierarchical nature of DTS results is captured recursively in an
> internal list.
> + A stage is each level in this particular hierarchy (pre-execution or
> the top-most level,
> + execution, build target, test suite and test case).
> +
> + Attributes:
> + setup_result: The result of the setup of the particular stage.
> + teardown_result: The results of the teardown of the particular
> stage.
> """
>
I think this might be another case of the attributes should be marked as
class variables instead of instance variables.
>
> setup_result: FixtureResult
> @@ -113,15 +162,28 @@ class BaseResult(object):
> _inner_results: MutableSequence["BaseResult"]
>
> def __init__(self):
> + """Initialize the constructor."""
> self.setup_result = FixtureResult()
> self.teardown_result = FixtureResult()
> self._inner_results = []
>
> def update_setup(self, result: Result, error: Exception | None =
> None) -> None:
> + """Store the setup result.
> +
> + Args:
> + result: The result of the setup.
> + error: The error that occurred in case of a failure.
> + """
> self.setup_result.result = result
> self.setup_result.error = error
>
> def update_teardown(self, result: Result, error: Exception | None =
> None) -> None:
> + """Store the teardown result.
> +
> + Args:
> + result: The result of the teardown.
> + error: The error that occurred in case of a failure.
> + """
> self.teardown_result.result = result
> self.teardown_result.error = error
>
> @@ -141,27 +203,55 @@ def _get_inner_errors(self) -> list[Exception]:
> ]
>
> def get_errors(self) -> list[Exception]:
> + """Compile errors from the whole result hierarchy.
> +
> + Returns:
> + The errors from setup, teardown and all errors found in the
> whole result hierarchy.
> + """
> return self._get_setup_teardown_errors() +
> self._get_inner_errors()
>
> def add_stats(self, statistics: Statistics) -> None:
> + """Collate stats from the whole result hierarchy.
> +
> + Args:
> + statistics: The :class:`Statistics` object where the stats
> will be collated.
> + """
> for inner_result in self._inner_results:
> inner_result.add_stats(statistics)
>
>
> class TestCaseResult(BaseResult, FixtureResult):
> - """
> - The test case specific result.
> - Stores the result of the actual test case.
> - Also stores the test case name.
> + r"""The test case specific result.
> +
> + Stores the result of the actual test case. This is done by adding an
> extra superclass
> + in :class:`FixtureResult`. The setup and teardown results are
> :class:`FixtureResult`\s and
> + the class is itself a record of the test case.
> +
> + Attributes:
> + test_case_name: The test case name.
> """
>
>
Another spot where I think this should have a class variable comment.
> test_case_name: str
>
> def __init__(self, test_case_name: str):
> + """Extend the constructor with `test_case_name`.
> +
> + Args:
> + test_case_name: The test case's name.
> + """
> super(TestCaseResult, self).__init__()
> self.test_case_name = test_case_name
>
> def update(self, result: Result, error: Exception | None = None) ->
> None:
> + """Update the test case result.
> +
> + This updates the result of the test case itself and doesn't affect
> + the results of the setup and teardown steps in any way.
> +
> + Args:
> + result: The result of the test case.
> + error: The error that occurred in case of a failure.
> + """
> self.result = result
> self.error = error
>
> @@ -171,38 +261,66 @@ def _get_inner_errors(self) -> list[Exception]:
> return []
>
> def add_stats(self, statistics: Statistics) -> None:
> + r"""Add the test case result to statistics.
> +
> + The base method goes through the hierarchy recursively and this
> method is here to stop
> + the recursion, as the :class:`TestCaseResult`\s are the leaves of
> the hierarchy tree.
> +
> + Args:
> + statistics: The :class:`Statistics` object where the stats
> will be added.
> + """
> statistics += self.result
>
> def __bool__(self) -> bool:
> + """The test case passed only if setup, teardown and the test case
> itself passed."""
> return (
> bool(self.setup_result) and bool(self.teardown_result) and
> bool(self.result)
> )
>
>
> class TestSuiteResult(BaseResult):
> - """
> - The test suite specific result.
> - The _inner_results list stores results of test cases in a given test
> suite.
> - Also stores the test suite name.
> + """The test suite specific result.
> +
> + The internal list stores the results of all test cases in a given
> test suite.
> +
> + Attributes:
> + suite_name: The test suite name.
> """
>
>
I think this should also be a class variable.
> suite_name: str
>
> def __init__(self, suite_name: str):
> + """Extend the constructor with `suite_name`.
> +
> + Args:
> + suite_name: The test suite's name.
> + """
> super(TestSuiteResult, self).__init__()
> self.suite_name = suite_name
>
> def add_test_case(self, test_case_name: str) -> TestCaseResult:
> + """Add and return the inner result (test case).
> +
> + Returns:
> + The test case's result.
> + """
> test_case_result = TestCaseResult(test_case_name)
> self._inner_results.append(test_case_result)
> return test_case_result
>
>
> class BuildTargetResult(BaseResult):
> - """
> - The build target specific result.
> - The _inner_results list stores results of test suites in a given
> build target.
> - Also stores build target specifics, such as compiler used to build
> DPDK.
> + """The build target specific result.
> +
> + The internal list stores the results of all test suites in a given
> build target.
> +
> + Attributes:
> + arch: The DPDK build target architecture.
> + os: The DPDK build target operating system.
> + cpu: The DPDK build target CPU.
> + compiler: The DPDK build target compiler.
> + compiler_version: The DPDK build target compiler version.
> + dpdk_version: The built DPDK version.
> """
>
I think this should be broken into class variables as well.
>
> arch: Architecture
> @@ -213,6 +331,11 @@ class BuildTargetResult(BaseResult):
> dpdk_version: str | None
>
> def __init__(self, build_target: BuildTargetConfiguration):
> + """Extend the constructor with the `build_target`'s build target
> config.
> +
> + Args:
> + build_target: The build target's test run configuration.
> + """
> super(BuildTargetResult, self).__init__()
> self.arch = build_target.arch
> self.os = build_target.os
> @@ -222,20 +345,35 @@ def __init__(self, build_target:
> BuildTargetConfiguration):
> self.dpdk_version = None
>
> def add_build_target_info(self, versions: BuildTargetInfo) -> None:
> + """Add information about the build target gathered at runtime.
> +
> + Args:
> + versions: The additional information.
> + """
> self.compiler_version = versions.compiler_version
> self.dpdk_version = versions.dpdk_version
>
> def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
> + """Add and return the inner result (test suite).
> +
> + Returns:
> + The test suite's result.
> + """
> test_suite_result = TestSuiteResult(test_suite_name)
> self._inner_results.append(test_suite_result)
> return test_suite_result
>
>
> class ExecutionResult(BaseResult):
> - """
> - The execution specific result.
> - The _inner_results list stores results of build targets in a given
> execution.
> - Also stores the SUT node configuration.
> + """The execution specific result.
> +
> + The internal list stores the results of all build targets in a given
> execution.
> +
> + Attributes:
> + sut_node: The SUT node used in the execution.
> + sut_os_name: The operating system of the SUT node.
> + sut_os_version: The operating system version of the SUT node.
> + sut_kernel_version: The operating system kernel version of the
> SUT node.
> """
>
>
I think these should be class variables as well.
> sut_node: NodeConfiguration
> @@ -244,36 +382,55 @@ class ExecutionResult(BaseResult):
> sut_kernel_version: str
>
> def __init__(self, sut_node: NodeConfiguration):
> + """Extend the constructor with the `sut_node`'s config.
> +
> + Args:
> + sut_node: The SUT node's test run configuration used in the
> execution.
> + """
> super(ExecutionResult, self).__init__()
> self.sut_node = sut_node
>
> def add_build_target(
> self, build_target: BuildTargetConfiguration
> ) -> BuildTargetResult:
> + """Add and return the inner result (build target).
> +
> + Args:
> + build_target: The build target's test run configuration.
> +
> + Returns:
> + The build target's result.
> + """
> build_target_result = BuildTargetResult(build_target)
> self._inner_results.append(build_target_result)
> return build_target_result
>
> def add_sut_info(self, sut_info: NodeInfo) -> None:
> + """Add SUT information gathered at runtime.
> +
> + Args:
> + sut_info: The additional SUT node information.
> + """
> self.sut_os_name = sut_info.os_name
> self.sut_os_version = sut_info.os_version
> self.sut_kernel_version = sut_info.kernel_version
>
>
> class DTSResult(BaseResult):
> - """
> - Stores environment information and test results from a DTS run, which
> are:
> - * Execution level information, such as SUT and TG hardware.
> - * Build target level information, such as compiler, target OS and cpu.
> - * Test suite results.
> - * All errors that are caught and recorded during DTS execution.
> + """Stores environment information and test results from a DTS run.
>
> - The information is stored in nested objects.
> + * Execution level information, such as testbed and the test suite
> list,
> + * Build target level information, such as compiler, target OS and
> cpu,
> + * Test suite and test case results,
> + * All errors that are caught and recorded during DTS execution.
>
> - The class is capable of computing the return code used to exit DTS
> with
> - from the stored error.
> + The information is stored hierarchically. This is the first level of
> the hierarchy
> + and as such is where the data from the whole hierarchy is collated or
> processed.
>
> - It also provides a brief statistical summary of passed/failed test
> cases.
> + The internal list stores the results of all executions.
> +
> + Attributes:
> + dpdk_version: The DPDK version to record.
> """
>
>
I think this should be a class variable as well.
> dpdk_version: str | None
> @@ -284,6 +441,11 @@ class DTSResult(BaseResult):
> _stats_filename: str
>
> def __init__(self, logger: DTSLOG):
> + """Extend the constructor with top-level specifics.
> +
> + Args:
> + logger: The logger instance the whole result will use.
> + """
> super(DTSResult, self).__init__()
> self.dpdk_version = None
> self._logger = logger
> @@ -293,21 +455,33 @@ def __init__(self, logger: DTSLOG):
> self._stats_filename = os.path.join(SETTINGS.output_dir,
> "statistics.txt")
>
> def add_execution(self, sut_node: NodeConfiguration) ->
> ExecutionResult:
> + """Add and return the inner result (execution).
> +
> + Args:
> + sut_node: The SUT node's test run configuration.
> +
> + Returns:
> + The execution's result.
> + """
> execution_result = ExecutionResult(sut_node)
> self._inner_results.append(execution_result)
> return execution_result
>
> def add_error(self, error: Exception) -> None:
> + """Record an error that occurred outside any execution.
> +
> + Args:
> + error: The exception to record.
> + """
> self._errors.append(error)
>
> def process(self) -> None:
> - """
> - Process the data after a DTS run.
> - The data is added to nested objects during runtime and this
> parent object
> - is not updated at that time. This requires us to process the
> nested data
> - after it's all been gathered.
> + """Process the data after a whole DTS run.
> +
> + The data is added to inner objects during runtime and this object
> is not updated
> + at that time. This requires us to process the inner data after
> it's all been gathered.
>
> - The processing gathers all errors and the result statistics of
> test cases.
> + The processing gathers all errors and the statistics of test case
> results.
> """
> self._errors += self.get_errors()
> if self._errors and self._logger:
> @@ -321,8 +495,10 @@ def process(self) -> None:
> stats_file.write(str(self._stats_result))
>
> def get_return_code(self) -> int:
> - """
> - Go through all stored Exceptions and return the highest error
> code found.
> + """Go through all stored Exceptions and return the final DTS
> error code.
> +
> + Returns:
> + The highest error code found.
> """
> for error in self._errors:
> error_return_code = ErrorSeverity.GENERIC_ERR
> --
> 2.34.1
>
>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 09/21] dts: test result docstring update
2023-11-16 22:47 ` Jeremy Spewock
@ 2023-11-20 16:33 ` Juraj Linkeš
2023-11-30 21:20 ` Jeremy Spewock
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-20 16:33 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
On Thu, Nov 16, 2023 at 11:47 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
> The only comments I had on this were a few places where I think the attribute sections should be class variables instead. I tried to mark all of the places I saw it; it could be that, because of the way these classes are subclassed, they are documented differently, but I'm unsure.
>
> On Wed, Nov 15, 2023 at 8:12 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> Format according to the Google format and PEP257, with slight
>> deviations.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> dts/framework/test_result.py | 292 ++++++++++++++++++++++++++++-------
>> 1 file changed, 234 insertions(+), 58 deletions(-)
>>
>> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
>> index 603e18872c..05e210f6e7 100644
>> --- a/dts/framework/test_result.py
>> +++ b/dts/framework/test_result.py
>> @@ -2,8 +2,25 @@
>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>> # Copyright(c) 2023 University of New Hampshire
>>
>> -"""
>> -Generic result container and reporters
>> +r"""Record and process DTS results.
>> +
>> +The results are recorded in a hierarchical manner:
>> +
>> + * :class:`DTSResult` contains
>> + * :class:`ExecutionResult` contains
>> + * :class:`BuildTargetResult` contains
>> + * :class:`TestSuiteResult` contains
>> + * :class:`TestCaseResult`
>> +
>> +Each result may contain multiple lower level results, e.g. there are multiple
>> +:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
>> +The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
>> +which also defines some common behaviors in its methods.
>> +
>> +Each result class has its own idiosyncrasies which they implement in overridden methods.
>> +
>> +The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
>> +variable modify the directory where the files with results will be stored.
>> """
>>
>> import os.path
>> @@ -26,26 +43,34 @@
>>
>>
>> class Result(Enum):
>> - """
>> - An Enum defining the possible states that
>> - a setup, a teardown or a test case may end up in.
>> - """
>> + """The possible states that a setup, a teardown or a test case may end up in."""
>>
>> + #:
>> PASS = auto()
>> + #:
>> FAIL = auto()
>> + #:
>> ERROR = auto()
>> + #:
>> SKIP = auto()
>>
>> def __bool__(self) -> bool:
>> + """Only PASS is True."""
>> return self is self.PASS
>>
>>
>> class FixtureResult(object):
>> - """
>> - A record that stored the result of a setup or a teardown.
>> - The default is FAIL because immediately after creating the object
>> - the setup of the corresponding stage will be executed, which also guarantees
>> - the execution of teardown.
>> + """A record that stores the result of a setup or a teardown.
>> +
>> + FAIL is a sensible default since it prevents false positives
>> + (which could happen if the default was PASS).
>> +
>> + Preventing false positives or other false results is preferable since a failure
>> + is most likely to be investigated (the other false results may not be investigated at all).
>> +
>> + Attributes:
>> + result: The associated result.
>> + error: The error in case of a failure.
>> """
>
>
> I think the items in the attributes section should instead be "#:" because they are class variables.
>
Making these class variables would make the value the same for all
instances, of which there are plenty. Why do you think these should be
class variables?
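
To make the shared-state concern concrete, here is a standalone sketch (a hypothetical `ResultRecord` class, not the patch code) showing why a class-level assignment would be the wrong fit for these attributes:

```python
class ResultRecord:
    # Assigning at class level creates ONE shared object for every instance.
    shared_tags: list[str] = []

    # A bare annotation only declares the attribute's type; each instance
    # assigns its own value in __init__.
    name: str

    def __init__(self, name: str) -> None:
        self.name = name


a = ResultRecord("setup")
b = ResultRecord("teardown")
a.shared_tags.append("oops")
# The mutation through `a` is visible through `b`: the list is shared.
assert b.shared_tags == ["oops"]
# Instance attributes stay independent.
assert (a.name, b.name) == ("setup", "teardown")
```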
>>
>>
>> result: Result
>> @@ -56,21 +81,32 @@ def __init__(
>> result: Result = Result.FAIL,
>> error: Exception | None = None,
>> ):
>> + """Initialize the constructor with the fixture result and store a possible error.
>> +
>> + Args:
>> + result: The result to store.
>> + error: The error which happened when a failure occurred.
>> + """
>> self.result = result
>> self.error = error
>>
>> def __bool__(self) -> bool:
>> + """A wrapper around the stored :class:`Result`."""
>> return bool(self.result)
>>
>>
>> class Statistics(dict):
>> - """
>> - A helper class used to store the number of test cases by its result
>> - along a few other basic information.
>> - Using a dict provides a convenient way to format the data.
>> + """How many test cases ended in which result state along some other basic information.
>> +
>> + Subclassing :class:`dict` provides a convenient way to format the data.
>> """
>>
>> def __init__(self, dpdk_version: str | None):
>> + """Extend the constructor with relevant keys.
>> +
>> + Args:
>> + dpdk_version: The version of tested DPDK.
>> + """
>
>
> Should we maybe mark the "PASS RATE" and the "DPDK VERSION" as instance variables of the class?
>
This is a dict, so these won't work as instance variables, but it
makes sense to document these keys, so I'll add that.
>>
>> super(Statistics, self).__init__()
>> for result in Result:
>> self[result.name] = 0
>> @@ -78,8 +114,17 @@ def __init__(self, dpdk_version: str | None):
>> self["DPDK VERSION"] = dpdk_version
>>
>> def __iadd__(self, other: Result) -> "Statistics":
>> - """
>> - Add a Result to the final count.
>> + """Add a Result to the final count.
>> +
>> + Example:
>> stats: Statistics = Statistics(dpdk_version=None) # empty Statistics
>> + stats += Result.PASS # add a Result to `stats`
>> +
>> + Args:
>> + other: The Result to add to this statistics object.
>> +
>> + Returns:
>> + The modified statistics object.
>> """
>> self[other.name] += 1
>> self["PASS RATE"] = (
>> @@ -90,9 +135,7 @@ def __iadd__(self, other: Result) -> "Statistics":
>> return self
>>
>> def __str__(self) -> str:
>> - """
>> - Provide a string representation of the data.
>> - """
>> + """Each line contains the formatted key = value pair."""
>> stats_str = ""
>> for key, value in self.items():
>> stats_str += f"{key:<12} = {value}\n"
>> @@ -102,10 +145,16 @@ def __str__(self) -> str:
>>
>>
>> class BaseResult(object):
>> - """
>> - The Base class for all results. Stores the results of
>> - the setup and teardown portions of the corresponding stage
>> - and a list of results from each inner stage in _inner_results.
>> + """Common data and behavior of DTS results.
>> +
>> + Stores the results of the setup and teardown portions of the corresponding stage.
>> + The hierarchical nature of DTS results is captured recursively in an internal list.
>> + A stage is each level in this particular hierarchy (pre-execution or the top-most level,
>> + execution, build target, test suite and test case).
>> +
>> + Attributes:
>> + setup_result: The result of the setup of the particular stage.
>> + teardown_result: The results of the teardown of the particular stage.
>> """
>
>
> I think this might be another case of the attributes should be marked as class variables instead of instance variables.
>
This is the same as in FixtureResult. For example, there could be
multiple build targets with different results.
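
Concretely, with annotation-only attributes nothing is stored on the class itself, so each result object keeps its own state (a standalone sketch with a hypothetical `FixtureResultLike` class, not the patch code):

```python
class FixtureResultLike:
    result: str  # annotation only: no value is stored on the class itself

    def __init__(self, result: str = "FAIL") -> None:
        self.result = result  # per-instance state


setup = FixtureResultLike("PASS")
teardown = FixtureResultLike()  # stays FAIL until updated
assert setup.result == "PASS" and teardown.result == "FAIL"
# The name only exists in __annotations__, not in the class namespace,
# which matches documenting it in an Attributes: section (instance attribute).
assert "result" not in vars(FixtureResultLike)
assert "result" in FixtureResultLike.__annotations__
```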
>>
>>
>> setup_result: FixtureResult
>> @@ -113,15 +162,28 @@ class BaseResult(object):
>> _inner_results: MutableSequence["BaseResult"]
>>
>> def __init__(self):
>> + """Initialize the constructor."""
>> self.setup_result = FixtureResult()
>> self.teardown_result = FixtureResult()
>> self._inner_results = []
>>
>> def update_setup(self, result: Result, error: Exception | None = None) -> None:
>> + """Store the setup result.
>> +
>> + Args:
>> + result: The result of the setup.
>> + error: The error that occurred in case of a failure.
>> + """
>> self.setup_result.result = result
>> self.setup_result.error = error
>>
>> def update_teardown(self, result: Result, error: Exception | None = None) -> None:
>> + """Store the teardown result.
>> +
>> + Args:
>> + result: The result of the teardown.
>> + error: The error that occurred in case of a failure.
>> + """
>> self.teardown_result.result = result
>> self.teardown_result.error = error
>>
>> @@ -141,27 +203,55 @@ def _get_inner_errors(self) -> list[Exception]:
>> ]
>>
>> def get_errors(self) -> list[Exception]:
>> + """Compile errors from the whole result hierarchy.
>> +
>> + Returns:
>> + The errors from setup, teardown and all errors found in the whole result hierarchy.
>> + """
>> return self._get_setup_teardown_errors() + self._get_inner_errors()
>>
>> def add_stats(self, statistics: Statistics) -> None:
>> + """Collate stats from the whole result hierarchy.
>> +
>> + Args:
>> + statistics: The :class:`Statistics` object where the stats will be collated.
>> + """
>> for inner_result in self._inner_results:
>> inner_result.add_stats(statistics)
>>
>>
>> class TestCaseResult(BaseResult, FixtureResult):
>> - """
>> - The test case specific result.
>> - Stores the result of the actual test case.
>> - Also stores the test case name.
>> + r"""The test case specific result.
>> +
>> + Stores the result of the actual test case. This is done by adding an extra superclass
>> + in :class:`FixtureResult`. The setup and teardown results are :class:`FixtureResult`\s and
>> + the class is itself a record of the test case.
>> +
>> + Attributes:
>> + test_case_name: The test case name.
>> """
>>
>
> Another spot where I think this should have a class variable comment.
>
>>
>> test_case_name: str
>>
>> def __init__(self, test_case_name: str):
>> + """Extend the constructor with `test_case_name`.
>> +
>> + Args:
>> + test_case_name: The test case's name.
>> + """
>> super(TestCaseResult, self).__init__()
>> self.test_case_name = test_case_name
>>
>> def update(self, result: Result, error: Exception | None = None) -> None:
>> + """Update the test case result.
>> +
>> + This updates the result of the test case itself and doesn't affect
>> + the results of the setup and teardown steps in any way.
>> +
>> + Args:
>> + result: The result of the test case.
>> + error: The error that occurred in case of a failure.
>> + """
>> self.result = result
>> self.error = error
>>
>> @@ -171,38 +261,66 @@ def _get_inner_errors(self) -> list[Exception]:
>> return []
>>
>> def add_stats(self, statistics: Statistics) -> None:
>> + r"""Add the test case result to statistics.
>> +
>> + The base method goes through the hierarchy recursively and this method is here to stop
>> + the recursion, as the :class:`TestCaseResult`\s are the leaves of the hierarchy tree.
>> +
>> + Args:
>> + statistics: The :class:`Statistics` object where the stats will be added.
>> + """
>> statistics += self.result
>>
>> def __bool__(self) -> bool:
>> + """The test case passed only if setup, teardown and the test case itself passed."""
>> return (
>> bool(self.setup_result) and bool(self.teardown_result) and bool(self.result)
>> )
>>
>>
>> class TestSuiteResult(BaseResult):
>> - """
>> - The test suite specific result.
>> - The _inner_results list stores results of test cases in a given test suite.
>> - Also stores the test suite name.
>> + """The test suite specific result.
>> +
>> + The internal list stores the results of all test cases in a given test suite.
>> +
>> + Attributes:
>> + suite_name: The test suite name.
>> """
>>
>
> I think this should also be a class variable.
>
>
>>
>> suite_name: str
>>
>> def __init__(self, suite_name: str):
>> + """Extend the constructor with `suite_name`.
>> +
>> + Args:
>> + suite_name: The test suite's name.
>> + """
>> super(TestSuiteResult, self).__init__()
>> self.suite_name = suite_name
>>
>> def add_test_case(self, test_case_name: str) -> TestCaseResult:
>> + """Add and return the inner result (test case).
>> +
>> + Returns:
>> + The test case's result.
>> + """
>> test_case_result = TestCaseResult(test_case_name)
>> self._inner_results.append(test_case_result)
>> return test_case_result
>>
>>
>> class BuildTargetResult(BaseResult):
>> - """
>> - The build target specific result.
>> - The _inner_results list stores results of test suites in a given build target.
>> - Also stores build target specifics, such as compiler used to build DPDK.
>> + """The build target specific result.
>> +
>> + The internal list stores the results of all test suites in a given build target.
>> +
>> + Attributes:
>> + arch: The DPDK build target architecture.
>> + os: The DPDK build target operating system.
>> + cpu: The DPDK build target CPU.
>> + compiler: The DPDK build target compiler.
>> + compiler_version: The DPDK build target compiler version.
>> + dpdk_version: The built DPDK version.
>> """
>
>
> I think this should be broken into class variables as well.
>
>>
>>
>> arch: Architecture
>> @@ -213,6 +331,11 @@ class BuildTargetResult(BaseResult):
>> dpdk_version: str | None
>>
>> def __init__(self, build_target: BuildTargetConfiguration):
>> + """Extend the constructor with the `build_target`'s build target config.
>> +
>> + Args:
>> + build_target: The build target's test run configuration.
>> + """
>> super(BuildTargetResult, self).__init__()
>> self.arch = build_target.arch
>> self.os = build_target.os
>> @@ -222,20 +345,35 @@ def __init__(self, build_target: BuildTargetConfiguration):
>> self.dpdk_version = None
>>
>> def add_build_target_info(self, versions: BuildTargetInfo) -> None:
>> + """Add information about the build target gathered at runtime.
>> +
>> + Args:
>> + versions: The additional information.
>> + """
>> self.compiler_version = versions.compiler_version
>> self.dpdk_version = versions.dpdk_version
>>
>> def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
>> + """Add and return the inner result (test suite).
>> +
>> + Returns:
>> + The test suite's result.
>> + """
>> test_suite_result = TestSuiteResult(test_suite_name)
>> self._inner_results.append(test_suite_result)
>> return test_suite_result
>>
>>
>> class ExecutionResult(BaseResult):
>> - """
>> - The execution specific result.
>> - The _inner_results list stores results of build targets in a given execution.
>> - Also stores the SUT node configuration.
>> + """The execution specific result.
>> +
>> + The internal list stores the results of all build targets in a given execution.
>> +
>> + Attributes:
>> + sut_node: The SUT node used in the execution.
>> + sut_os_name: The operating system of the SUT node.
>> + sut_os_version: The operating system version of the SUT node.
>> + sut_kernel_version: The operating system kernel version of the SUT node.
>> """
>>
>
> I think these should be class variables as well.
>
>>
>> sut_node: NodeConfiguration
>> @@ -244,36 +382,55 @@ class ExecutionResult(BaseResult):
>> sut_kernel_version: str
>>
>> def __init__(self, sut_node: NodeConfiguration):
>> + """Extend the constructor with the `sut_node`'s config.
>> +
>> + Args:
>> + sut_node: The SUT node's test run configuration used in the execution.
>> + """
>> super(ExecutionResult, self).__init__()
>> self.sut_node = sut_node
>>
>> def add_build_target(
>> self, build_target: BuildTargetConfiguration
>> ) -> BuildTargetResult:
>> + """Add and return the inner result (build target).
>> +
>> + Args:
>> + build_target: The build target's test run configuration.
>> +
>> + Returns:
>> + The build target's result.
>> + """
>> build_target_result = BuildTargetResult(build_target)
>> self._inner_results.append(build_target_result)
>> return build_target_result
>>
>> def add_sut_info(self, sut_info: NodeInfo) -> None:
>> + """Add SUT information gathered at runtime.
>> +
>> + Args:
>> + sut_info: The additional SUT node information.
>> + """
>> self.sut_os_name = sut_info.os_name
>> self.sut_os_version = sut_info.os_version
>> self.sut_kernel_version = sut_info.kernel_version
>>
>>
>> class DTSResult(BaseResult):
>> - """
>> - Stores environment information and test results from a DTS run, which are:
>> - * Execution level information, such as SUT and TG hardware.
>> - * Build target level information, such as compiler, target OS and cpu.
>> - * Test suite results.
>> - * All errors that are caught and recorded during DTS execution.
>> + """Stores environment information and test results from a DTS run.
>>
>> - The information is stored in nested objects.
>> + * Execution level information, such as testbed and the test suite list,
>> + * Build target level information, such as compiler, target OS and cpu,
>> + * Test suite and test case results,
>> + * All errors that are caught and recorded during DTS execution.
>>
>> - The class is capable of computing the return code used to exit DTS with
>> - from the stored error.
>> + The information is stored hierarchically. This is the first level of the hierarchy
>> + and as such is where the data from the whole hierarchy is collated or processed.
>>
>> - It also provides a brief statistical summary of passed/failed test cases.
>> + The internal list stores the results of all executions.
>> +
>> + Attributes:
>> + dpdk_version: The DPDK version to record.
>> """
>>
>
> I think this should be a class variable as well.
>
This is the only place where making this a class variable would work,
but I don't see a reason for it. An instance variable works just as
well.
>>
>> dpdk_version: str | None
>> @@ -284,6 +441,11 @@ class DTSResult(BaseResult):
>> _stats_filename: str
>>
>> def __init__(self, logger: DTSLOG):
>> + """Extend the constructor with top-level specifics.
>> +
>> + Args:
>> + logger: The logger instance the whole result will use.
>> + """
>> super(DTSResult, self).__init__()
>> self.dpdk_version = None
>> self._logger = logger
>> @@ -293,21 +455,33 @@ def __init__(self, logger: DTSLOG):
>> self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
>>
>> def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
>> + """Add and return the inner result (execution).
>> +
>> + Args:
>> + sut_node: The SUT node's test run configuration.
>> +
>> + Returns:
>> + The execution's result.
>> + """
>> execution_result = ExecutionResult(sut_node)
>> self._inner_results.append(execution_result)
>> return execution_result
>>
>> def add_error(self, error: Exception) -> None:
>> + """Record an error that occurred outside any execution.
>> +
>> + Args:
>> + error: The exception to record.
>> + """
>> self._errors.append(error)
>>
>> def process(self) -> None:
>> - """
>> - Process the data after a DTS run.
>> - The data is added to nested objects during runtime and this parent object
>> - is not updated at that time. This requires us to process the nested data
>> - after it's all been gathered.
>> + """Process the data after a whole DTS run.
>> +
>> + The data is added to inner objects during runtime and this object is not updated
>> + at that time. This requires us to process the inner data after it's all been gathered.
>>
>> - The processing gathers all errors and the result statistics of test cases.
>> + The processing gathers all errors and the statistics of test case results.
>> """
>> self._errors += self.get_errors()
>> if self._errors and self._logger:
>> @@ -321,8 +495,10 @@ def process(self) -> None:
>> stats_file.write(str(self._stats_result))
>>
>> def get_return_code(self) -> int:
>> - """
>> - Go through all stored Exceptions and return the highest error code found.
>> + """Go through all stored Exceptions and return the final DTS error code.
>> +
>> + Returns:
>> + The highest error code found.
>> """
>> for error in self._errors:
>> error_return_code = ErrorSeverity.GENERIC_ERR
>> --
>> 2.34.1
>>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 09/21] dts: test result docstring update
2023-11-20 16:33 ` Juraj Linkeš
@ 2023-11-30 21:20 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2023-11-30 21:20 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek, yoan.picchi, dev
On Mon, Nov 20, 2023 at 11:33 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
> On Thu, Nov 16, 2023 at 11:47 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
> >
> > The only comments I had on this were a few places where I think attribute sections should be class variables instead. I tried to mark all of the places I saw it and it could be a difference where because of the way they are subclassed they might do it differently but I'm unsure.
> >
> > On Wed, Nov 15, 2023 at 8:12 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
> >>
> >> Format according to the Google format and PEP257, with slight
> >> deviations.
> >>
> >> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> >> ---
> >> dts/framework/test_result.py | 292 ++++++++++++++++++++++++++++-------
> >> 1 file changed, 234 insertions(+), 58 deletions(-)
> >>
> >> diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
> >> index 603e18872c..05e210f6e7 100644
> >> --- a/dts/framework/test_result.py
> >> +++ b/dts/framework/test_result.py
> >> @@ -2,8 +2,25 @@
> >> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >> # Copyright(c) 2023 University of New Hampshire
> >>
> >> -"""
> >> -Generic result container and reporters
> >> +r"""Record and process DTS results.
> >> +
> >> +The results are recorded in a hierarchical manner:
> >> +
> >> + * :class:`DTSResult` contains
> >> + * :class:`ExecutionResult` contains
> >> + * :class:`BuildTargetResult` contains
> >> + * :class:`TestSuiteResult` contains
> >> + * :class:`TestCaseResult`
> >> +
> >> +Each result may contain multiple lower level results, e.g. there are multiple
> >> +:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
> >> +The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
> >> +which also defines some common behaviors in its methods.
> >> +
> >> +Each result class has its own idiosyncrasies which they implement in overridden methods.
> >> +
> >> +The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
> >> +variable modify the directory where the files with results will be stored.
> >> """
> >>
> >> import os.path
> >> @@ -26,26 +43,34 @@
> >>
> >>
> >> class Result(Enum):
> >> - """
> >> - An Enum defining the possible states that
> >> - a setup, a teardown or a test case may end up in.
> >> - """
> >> + """The possible states that a setup, a teardown or a test case may
> end up in."""
> >>
> >> + #:
> >> PASS = auto()
> >> + #:
> >> FAIL = auto()
> >> + #:
> >> ERROR = auto()
> >> + #:
> >> SKIP = auto()
> >>
> >> def __bool__(self) -> bool:
> >> + """Only PASS is True."""
> >> return self is self.PASS
> >>
> >>
> >> class FixtureResult(object):
> >> - """
> >> - A record that stored the result of a setup or a teardown.
> >> - The default is FAIL because immediately after creating the object
> >> - the setup of the corresponding stage will be executed, which also guarantees
> >> - the execution of teardown.
> >> + """A record that stores the result of a setup or a teardown.
> >> +
> >> + FAIL is a sensible default since it prevents false positives
> >> + (which could happen if the default was PASS).
> >> +
> >> + Preventing false positives or other false results is preferable since a failure
> >> + is most likely to be investigated (the other false results may not be investigated at all).
> >> +
> >> + Attributes:
> >> + result: The associated result.
> >> + error: The error in case of a failure.
> >> """
> >
> >
> > I think the items in the attributes section should instead be "#:" because they are class variables.
> >
>
> Making these class variables would make the value the same for all
> instances, of which there are plenty. Why do you think these should be
> class variables?
>
That explanation makes more sense. I guess I was thinking of class
variables as anything we statically define as part of the class (i.e., like
we say the class will always have a `result` and an `error` attribute), but
I could have just been mistaken. Using the definition of instance variables
as they can differ between instances I agree makes this comment and the
other ones you touched on obsolete.
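
Stepping back from the class-variable question, the recursive error collation this result hierarchy relies on can be sketched as follows (simplified stand-ins, not the actual framework classes):

```python
class BaseResult:
    """Minimal sketch of a node in a result hierarchy."""

    def __init__(self) -> None:
        self._inner_results: list["BaseResult"] = []
        self._errors: list[Exception] = []

    def add_inner(self, inner: "BaseResult") -> "BaseResult":
        self._inner_results.append(inner)
        return inner

    def get_errors(self) -> list[Exception]:
        # Own errors first, then recurse into the whole subtree.
        errors = list(self._errors)
        for inner in self._inner_results:
            errors += inner.get_errors()
        return errors


root = BaseResult()
suite = root.add_inner(BaseResult())
case = suite.add_inner(BaseResult())
# Record an error on the leaf; in the real framework this happens at runtime.
case._errors.append(ValueError("test case failed"))
# The top-level result sees errors from anywhere in the hierarchy.
assert len(root.get_errors()) == 1
```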
>
> >>
> >>
> >> result: Result
> >> @@ -56,21 +81,32 @@ def __init__(
> >> result: Result = Result.FAIL,
> >> error: Exception | None = None,
> >> ):
> >> + """Initialize the constructor with the fixture result and
> store a possible error.
> >> +
> >> + Args:
> >> + result: The result to store.
> >> + error: The error which happened when a failure occurred.
> >> + """
> >> self.result = result
> >> self.error = error
> >>
> >> def __bool__(self) -> bool:
> >> + """A wrapper around the stored :class:`Result`."""
> >> return bool(self.result)
> >>
> >>
> >> class Statistics(dict):
> >> - """
> >> - A helper class used to store the number of test cases by its result
> >> - along a few other basic information.
> >> - Using a dict provides a convenient way to format the data.
> >> + """How many test cases ended in which result state along some
> other basic information.
> >> +
> >> + Subclassing :class:`dict` provides a convenient way to format the
> data.
> >> """
> >>
> >> def __init__(self, dpdk_version: str | None):
> >> + """Extend the constructor with relevant keys.
> >> +
> >> + Args:
> >> + dpdk_version: The version of tested DPDK.
> >> + """
> >
> >
> > Should we maybe mark the "PASS RATE" and the "DPDK VERSION" as instance
> variables of the class?
> >
>
> This is a dict, so these won't work as instance variables, but it
> makes sense to document these keys, so I'll add that.
>
> >>
> >> super(Statistics, self).__init__()
> >> for result in Result:
> >> self[result.name] = 0
> >> @@ -78,8 +114,17 @@ def __init__(self, dpdk_version: str | None):
> >> self["DPDK VERSION"] = dpdk_version
> >>
> >> def __iadd__(self, other: Result) -> "Statistics":
> >> - """
> >> - Add a Result to the final count.
> >> + """Add a Result to the final count.
> >> +
> >> + Example:
> >> + stats: Statistics = Statistics(dpdk_version=None) # empty Statistics
> >> + stats += Result.PASS # add a Result to `stats`
> >> +
> >> + Args:
> >> + other: The Result to add to this statistics object.
> >> +
> >> + Returns:
> >> + The modified statistics object.
> >> """
> >> self[other.name] += 1
> >> self["PASS RATE"] = (
> >> @@ -90,9 +135,7 @@ def __iadd__(self, other: Result) -> "Statistics":
> >> return self
> >>
> >> def __str__(self) -> str:
> >> - """
> >> - Provide a string representation of the data.
> >> - """
> >> + """Each line contains the formatted key = value pair."""
> >> stats_str = ""
> >> for key, value in self.items():
> >> stats_str += f"{key:<12} = {value}\n"
> >> @@ -102,10 +145,16 @@ def __str__(self) -> str:
> >>
> >>
> >> class BaseResult(object):
> >> - """
> >> - The Base class for all results. Stores the results of
> >> - the setup and teardown portions of the corresponding stage
> >> - and a list of results from each inner stage in _inner_results.
> >> + """Common data and behavior of DTS results.
> >> +
> >> + Stores the results of the setup and teardown portions of the
> corresponding stage.
> >> + The hierarchical nature of DTS results is captured recursively in
> an internal list.
> >> + A stage is each level in this particular hierarchy (pre-execution
> or the top-most level,
> >> + execution, build target, test suite and test case.)
> >> +
> >> + Attributes:
> >> + setup_result: The result of the setup of the particular stage.
> >> + teardown_result: The results of the teardown of the particular
> stage.
> >> """
> >
> >
> > I think this might be another case of the attributes should be marked as
> class variables instead of instance variables.
> >
>
> This is the same as in FixtureResult. For example, there could be
> multiple build targets with different results.
>
> >>
> >>
> >> setup_result: FixtureResult
> >> @@ -113,15 +162,28 @@ class BaseResult(object):
> >> _inner_results: MutableSequence["BaseResult"]
> >>
> >> def __init__(self):
> >> + """Initialize the constructor."""
> >> self.setup_result = FixtureResult()
> >> self.teardown_result = FixtureResult()
> >> self._inner_results = []
> >>
> >> def update_setup(self, result: Result, error: Exception | None =
> None) -> None:
> >> + """Store the setup result.
> >> +
> >> + Args:
> >> + result: The result of the setup.
> >> + error: The error that occurred in case of a failure.
> >> + """
> >> self.setup_result.result = result
> >> self.setup_result.error = error
> >>
> >> def update_teardown(self, result: Result, error: Exception | None
> = None) -> None:
> >> + """Store the teardown result.
> >> +
> >> + Args:
> >> + result: The result of the teardown.
> >> + error: The error that occurred in case of a failure.
> >> + """
> >> self.teardown_result.result = result
> >> self.teardown_result.error = error
> >>
> >> @@ -141,27 +203,55 @@ def _get_inner_errors(self) -> list[Exception]:
> >> ]
> >>
> >> def get_errors(self) -> list[Exception]:
> >> + """Compile errors from the whole result hierarchy.
> >> +
> >> + Returns:
> >> + The errors from setup, teardown and all errors found in
> the whole result hierarchy.
> >> + """
> >> return self._get_setup_teardown_errors() +
> self._get_inner_errors()
> >>
> >> def add_stats(self, statistics: Statistics) -> None:
> >> + """Collate stats from the whole result hierarchy.
> >> +
> >> + Args:
> >> + statistics: The :class:`Statistics` object where the stats
> will be collated.
> >> + """
> >> for inner_result in self._inner_results:
> >> inner_result.add_stats(statistics)
> >>
> >>
> >> class TestCaseResult(BaseResult, FixtureResult):
> >> - """
> >> - The test case specific result.
> >> - Stores the result of the actual test case.
> >> - Also stores the test case name.
> >> + r"""The test case specific result.
> >> +
> >> + Stores the result of the actual test case. This is done by adding
> an extra superclass
> >> + in :class:`FixtureResult`. The setup and teardown results are
> :class:`FixtureResult`\s and
> >> + the class is itself a record of the test case.
> >> +
> >> + Attributes:
> >> + test_case_name: The test case name.
> >> """
> >>
> >
> > Another spot where I think this should have a class variable comment.
> >
> >>
> >> test_case_name: str
> >>
> >> def __init__(self, test_case_name: str):
> >> + """Extend the constructor with `test_case_name`.
> >> +
> >> + Args:
> >> + test_case_name: The test case's name.
> >> + """
> >> super(TestCaseResult, self).__init__()
> >> self.test_case_name = test_case_name
> >>
> >> def update(self, result: Result, error: Exception | None = None)
> -> None:
> >> + """Update the test case result.
> >> +
> >> + This updates the result of the test case itself and doesn't
> affect
> >> + the results of the setup and teardown steps in any way.
> >> +
> >> + Args:
> >> + result: The result of the test case.
> >> + error: The error that occurred in case of a failure.
> >> + """
> >> self.result = result
> >> self.error = error
> >>
> >> @@ -171,38 +261,66 @@ def _get_inner_errors(self) -> list[Exception]:
> >> return []
> >>
> >> def add_stats(self, statistics: Statistics) -> None:
> >> + r"""Add the test case result to statistics.
> >> +
> >> + The base method goes through the hierarchy recursively and
> this method is here to stop
> >> + the recursion, as the :class:`TestCaseResult`\s are the leaves
> of the hierarchy tree.
> >> +
> >> + Args:
> >> + statistics: The :class:`Statistics` object where the stats
> will be added.
> >> + """
> >> statistics += self.result
> >>
> >> def __bool__(self) -> bool:
> >> + """The test case passed only if setup, teardown and the test
> case itself passed."""
> >> return (
> >> bool(self.setup_result) and bool(self.teardown_result) and
> bool(self.result)
> >> )
> >>
> >>
> >> class TestSuiteResult(BaseResult):
> >> - """
> >> - The test suite specific result.
> >> - The _inner_results list stores results of test cases in a given
> test suite.
> >> - Also stores the test suite name.
> >> + """The test suite specific result.
> >> +
> >> + The internal list stores the results of all test cases in a given
> test suite.
> >> +
> >> + Attributes:
> >> + suite_name: The test suite name.
> >> """
> >>
> >
> > I think this should also be a class variable.
> >
> >
> >>
> >> suite_name: str
> >>
> >> def __init__(self, suite_name: str):
> >> + """Extend the constructor with `suite_name`.
> >> +
> >> + Args:
> >> + suite_name: The test suite's name.
> >> + """
> >> super(TestSuiteResult, self).__init__()
> >> self.suite_name = suite_name
> >>
> >> def add_test_case(self, test_case_name: str) -> TestCaseResult:
> >> + """Add and return the inner result (test case).
> >> +
> >> + Returns:
> >> + The test case's result.
> >> + """
> >> test_case_result = TestCaseResult(test_case_name)
> >> self._inner_results.append(test_case_result)
> >> return test_case_result
> >>
> >>
> >> class BuildTargetResult(BaseResult):
> >> - """
> >> - The build target specific result.
> >> - The _inner_results list stores results of test suites in a given build target.
> >> - Also stores build target specifics, such as compiler used to build DPDK.
> >> + """The build target specific result.
> >> +
> >> + The internal list stores the results of all test suites in a given build target.
> >> +
> >> + Attributes:
> >> + arch: The DPDK build target architecture.
> >> + os: The DPDK build target operating system.
> >> + cpu: The DPDK build target CPU.
> >> + compiler: The DPDK build target compiler.
> >> + compiler_version: The DPDK build target compiler version.
> >> + dpdk_version: The built DPDK version.
> >> """
> >
> >
> > I think this should be broken into class variables as well.
> >
> >>
> >>
> >> arch: Architecture
> >> @@ -213,6 +331,11 @@ class BuildTargetResult(BaseResult):
> >> dpdk_version: str | None
> >>
> >> def __init__(self, build_target: BuildTargetConfiguration):
> >> + """Extend the constructor with the `build_target`'s build
> target config.
> >> +
> >> + Args:
> >> + build_target: The build target's test run configuration.
> >> + """
> >> super(BuildTargetResult, self).__init__()
> >> self.arch = build_target.arch
> >> self.os = build_target.os
> >> @@ -222,20 +345,35 @@ def __init__(self, build_target: BuildTargetConfiguration):
> >> self.dpdk_version = None
> >>
> >> def add_build_target_info(self, versions: BuildTargetInfo) -> None:
> >> + """Add information about the build target gathered at runtime.
> >> +
> >> + Args:
> >> + versions: The additional information.
> >> + """
> >> self.compiler_version = versions.compiler_version
> >> self.dpdk_version = versions.dpdk_version
> >>
> >> def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
> >> + """Add and return the inner result (test suite).
> >> +
> >> + Returns:
> >> + The test suite's result.
> >> + """
> >> test_suite_result = TestSuiteResult(test_suite_name)
> >> self._inner_results.append(test_suite_result)
> >> return test_suite_result
> >>
> >>
> >> class ExecutionResult(BaseResult):
> >> - """
> >> - The execution specific result.
> >> - The _inner_results list stores results of build targets in a given execution.
> >> - Also stores the SUT node configuration.
> >> + """The execution specific result.
> >> +
> >> + The internal list stores the results of all build targets in a given execution.
> >> +
> >> + Attributes:
> >> + sut_node: The SUT node used in the execution.
> >> + sut_os_name: The operating system of the SUT node.
> >> + sut_os_version: The operating system version of the SUT node.
> >> + sut_kernel_version: The operating system kernel version of the SUT node.
> >> """
> >>
> >
> > I think these should be class variables as well.
> >
> >>
> >> sut_node: NodeConfiguration
> >> @@ -244,36 +382,55 @@ class ExecutionResult(BaseResult):
> >> sut_kernel_version: str
> >>
> >> def __init__(self, sut_node: NodeConfiguration):
> >> + """Extend the constructor with the `sut_node`'s config.
> >> +
> >> + Args:
> >> + sut_node: The SUT node's test run configuration used in the execution.
> >> + """
> >> super(ExecutionResult, self).__init__()
> >> self.sut_node = sut_node
> >>
> >> def add_build_target(
> >> self, build_target: BuildTargetConfiguration
> >> ) -> BuildTargetResult:
> >> + """Add and return the inner result (build target).
> >> +
> >> + Args:
> >> + build_target: The build target's test run configuration.
> >> +
> >> + Returns:
> >> + The build target's result.
> >> + """
> >> build_target_result = BuildTargetResult(build_target)
> >> self._inner_results.append(build_target_result)
> >> return build_target_result
> >>
> >> def add_sut_info(self, sut_info: NodeInfo) -> None:
> >> + """Add SUT information gathered at runtime.
> >> +
> >> + Args:
> >> + sut_info: The additional SUT node information.
> >> + """
> >> self.sut_os_name = sut_info.os_name
> >> self.sut_os_version = sut_info.os_version
> >> self.sut_kernel_version = sut_info.kernel_version
> >>
> >>
> >> class DTSResult(BaseResult):
> >> - """
> >> - Stores environment information and test results from a DTS run, which are:
> >> - * Execution level information, such as SUT and TG hardware.
> >> - * Build target level information, such as compiler, target OS and cpu.
> >> - * Test suite results.
> >> - * All errors that are caught and recorded during DTS execution.
> >> + """Stores environment information and test results from a DTS run.
> >>
> >> - The information is stored in nested objects.
> >> + * Execution level information, such as testbed and the test suite list,
> >> + * Build target level information, such as compiler, target OS and cpu,
> >> + * Test suite and test case results,
> >> + * All errors that are caught and recorded during DTS execution.
> >>
> >> - The class is capable of computing the return code used to exit DTS with
> >> - from the stored error.
> >> + The information is stored hierarchically. This is the first level of the hierarchy
> >> + and as such is where the data form the whole hierarchy is collated or processed.
> >>
> >> - It also provides a brief statistical summary of passed/failed test cases.
> >> + The internal list stores the results of all executions.
> >> +
> >> + Attributes:
> >> + dpdk_version: The DPDK version to record.
> >> """
> >>
> >
> > I think this should be a class variable as well.
> >
>
> This is the only place where making this a class variable would work,
> but I don't see a reason for it. An instance variable works just as
> well.
>
> >>
> >> dpdk_version: str | None
> >> @@ -284,6 +441,11 @@ class DTSResult(BaseResult):
> >> _stats_filename: str
> >>
> >> def __init__(self, logger: DTSLOG):
> >> + """Extend the constructor with top-level specifics.
> >> +
> >> + Args:
> >> + logger: The logger instance the whole result will use.
> >> + """
> >> super(DTSResult, self).__init__()
> >> self.dpdk_version = None
> >> self._logger = logger
> >> @@ -293,21 +455,33 @@ def __init__(self, logger: DTSLOG):
> >> self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
> >>
> >> def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
> >> + """Add and return the inner result (execution).
> >> +
> >> + Args:
> >> + sut_node: The SUT node's test run configuration.
> >> +
> >> + Returns:
> >> + The execution's result.
> >> + """
> >> execution_result = ExecutionResult(sut_node)
> >> self._inner_results.append(execution_result)
> >> return execution_result
> >>
> >> def add_error(self, error: Exception) -> None:
> >> + """Record an error that occurred outside any execution.
> >> +
> >> + Args:
> >> + error: The exception to record.
> >> + """
> >> self._errors.append(error)
> >>
> >> def process(self) -> None:
> >> - """
> >> - Process the data after a DTS run.
> >> - The data is added to nested objects during runtime and this parent object
> >> - is not updated at that time. This requires us to process the nested data
> >> - after it's all been gathered.
> >> + """Process the data after a whole DTS run.
> >> +
> >> + The data is added to inner objects during runtime and this object is not updated
> >> + at that time. This requires us to process the inner data after it's all been gathered.
> >>
> >> - The processing gathers all errors and the result statistics of test cases.
> >> + The processing gathers all errors and the statistics of test case results.
> >> """
> >> self._errors += self.get_errors()
> >> if self._errors and self._logger:
> >> @@ -321,8 +495,10 @@ def process(self) -> None:
> >> stats_file.write(str(self._stats_result))
> >>
> >> def get_return_code(self) -> int:
> >> - """
> >> - Go through all stored Exceptions and return the highest error code found.
> >> + """Go through all stored Exceptions and return the final DTS error code.
> >> +
> >> + Returns:
> >> + The highest error code found.
> >> """
> >> for error in self._errors:
> >> error_return_code = ErrorSeverity.GENERIC_ERR
> >> --
> >> 2.34.1
> >>
>
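The docstrings discussed above describe a recursive result hierarchy: the base class walks its inner results when collecting statistics, and the leaf class (the test case result) overrides the method to stop the recursion. A minimal, self-contained Python sketch of that pattern follows. The class and method names mirror the patch, but the `Statistics` stand-in and the direct list appends are simplifications for illustration, not the actual DTS implementation:

```python
class Statistics:
    """Simplified stand-in: counts passed/failed results added via +=."""

    def __init__(self) -> None:
        self.passed = 0
        self.failed = 0

    def __iadd__(self, result: bool) -> "Statistics":
        if result:
            self.passed += 1
        else:
            self.failed += 1
        return self


class BaseResult:
    """A node of the result hierarchy; recurses into inner results."""

    def __init__(self) -> None:
        self._inner_results: list["BaseResult"] = []

    def add_stats(self, statistics: Statistics) -> None:
        # The base method goes through the hierarchy recursively.
        for inner_result in self._inner_results:
            inner_result.add_stats(statistics)


class TestCaseResult(BaseResult):
    """A leaf of the hierarchy tree; its override stops the recursion."""

    def __init__(self, name: str, passed: bool) -> None:
        super().__init__()
        self.name = name
        self.result = passed

    def add_stats(self, statistics: Statistics) -> None:
        # Leaves have no inner results, so add the result itself.
        statistics += self.result


suite = BaseResult()
suite._inner_results.append(TestCaseResult("mtu", True))
suite._inner_results.append(TestCaseResult("stats_checks", False))

stats = Statistics()
suite.add_stats(stats)
print(stats.passed, stats.failed)  # 1 1
```

Because only the leaf override touches the statistics object, the intermediate levels (suite, build target, execution) stay free of counting logic and simply forward the traversal.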
* [PATCH v7 10/21] dts: config docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (8 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 09/21] dts: test result " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-21 15:08 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 11/21] dts: remote session " Juraj Linkeš
` (11 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 371 ++++++++++++++++++++++++++-----
dts/framework/config/types.py | 132 +++++++++++
2 files changed, 446 insertions(+), 57 deletions(-)
create mode 100644 dts/framework/config/types.py
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 2044c82611..0aa149a53d 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -3,8 +3,34 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Yaml config parsing methods
+"""Testbed configuration and test suite specification.
+
+This package offers classes that hold real-time information about the testbed, hold test run
+configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
+the YAML test run configuration file
+and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+
+The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
+this package. The allowed keys and types inside this dictionary are defined in
+the :doc:`types <framework.config.types>` module.
+
+The test run configuration has two main sections:
+
+ * The :class:`ExecutionConfiguration` which defines what tests are going to be run
+ and how DPDK will be built. It also references the testbed where these tests and DPDK
+ are going to be run,
+ * The nodes of the testbed are defined in the other section,
+ a :class:`list` of :class:`NodeConfiguration` objects.
+
+The real-time information about testbed is supposed to be gathered at runtime.
+
+The classes defined in this package make heavy use of :mod:`dataclasses`.
+All of them use slots and are frozen:
+
+ * Slots enables some optimizations, by pre-allocating space for the defined
+ attributes in the underlying data structure,
+ * Frozen makes the object immutable. This enables further optimizations,
+ and makes it thread safe should we every want to move in that direction.
"""
import json
@@ -12,11 +38,20 @@
import pathlib
from dataclasses import dataclass
from enum import auto, unique
-from typing import Any, TypedDict, Union
+from typing import Union
import warlock # type: ignore[import]
import yaml
+from framework.config.types import (
+ BuildTargetConfigDict,
+ ConfigurationDict,
+ ExecutionConfigDict,
+ NodeConfigDict,
+ PortConfigDict,
+ TestSuiteConfigDict,
+ TrafficGeneratorConfigDict,
+)
from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -24,55 +59,97 @@
@unique
class Architecture(StrEnum):
+ r"""The supported architectures of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
i686 = auto()
+ #:
x86_64 = auto()
+ #:
x86_32 = auto()
+ #:
arm64 = auto()
+ #:
ppc64le = auto()
@unique
class OS(StrEnum):
+ r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
linux = auto()
+ #:
freebsd = auto()
+ #:
windows = auto()
@unique
class CPUType(StrEnum):
+ r"""The supported CPUs of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
native = auto()
+ #:
armv8a = auto()
+ #:
dpaa2 = auto()
+ #:
thunderx = auto()
+ #:
xgene1 = auto()
@unique
class Compiler(StrEnum):
+ r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
gcc = auto()
+ #:
clang = auto()
+ #:
icc = auto()
+ #:
msvc = auto()
@unique
class TrafficGeneratorType(StrEnum):
+ """The supported traffic generators."""
+
+ #:
SCAPY = auto()
-# Slots enables some optimizations, by pre-allocating space for the defined
-# attributes in the underlying data structure.
-#
-# Frozen makes the object immutable. This enables further optimizations,
-# and makes it thread safe should we every want to move in that direction.
@dataclass(slots=True, frozen=True)
class HugepageConfiguration:
+ r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ amount: The number of hugepages.
+ force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
+ """
+
amount: int
force_first_numa: bool
@dataclass(slots=True, frozen=True)
class PortConfig:
+ r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ node: The :class:`~framework.testbed_model.node.Node` where this port exists.
+ pci: The PCI address of the port.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK.
+ os_driver: The operating system driver name when the operating system controls the port.
+ peer_node: The :class:`~framework.testbed_model.node.Node` of the port
+ connected to this port.
+ peer_pci: The PCI address of the port connected to this port.
+ """
+
node: str
pci: str
os_driver_for_dpdk: str
@@ -81,18 +158,44 @@ class PortConfig:
peer_pci: str
@staticmethod
- def from_dict(node: str, d: dict) -> "PortConfig":
+ def from_dict(node: str, d: PortConfigDict) -> "PortConfig":
+ """A convenience method that creates the object from fewer inputs.
+
+ Args:
+ node: The node where this port exists.
+ d: The configuration dictionary.
+
+ Returns:
+ The port configuration instance.
+ """
return PortConfig(node=node, **d)
@dataclass(slots=True, frozen=True)
class TrafficGeneratorConfig:
+ """The configuration of traffic generators.
+
+ The class will be expanded when more configuration is needed.
+
+ Attributes:
+ traffic_generator_type: The type of the traffic generator.
+ """
+
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
- # This looks useless now, but is designed to allow expansion to traffic
- # generators that require more configuration later.
+ def from_dict(d: TrafficGeneratorConfigDict) -> "ScapyTrafficGeneratorConfig":
+ """A convenience method that produces traffic generator config of the proper type.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The traffic generator configuration instance.
+
+ Raises:
+ ConfigurationError: An unknown traffic generator type was encountered.
+ """
match TrafficGeneratorType(d["type"]):
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGeneratorConfig(
@@ -106,11 +209,31 @@ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
@dataclass(slots=True, frozen=True)
class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+ """Scapy traffic generator specific configuration."""
+
pass
@dataclass(slots=True, frozen=True)
class NodeConfiguration:
+ r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ name: The name of the :class:`~framework.testbed_model.node.Node`.
+ hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
+ Can be an IP or a domain name.
+ user: The name of the user used to connect to
+ the :class:`~framework.testbed_model.node.Node`.
+ password: The password of the user. The use of passwords is heavily discouraged.
+ Please use keys instead.
+ arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
+ os: The operating system of the :class:`~framework.testbed_model.node.Node`.
+ lcores: A comma delimited list of logical cores to use when running DPDK.
+ use_first_core: If :data:`True`, the first logical core won't be used.
+ hugepages: An optional hugepage configuration.
+ ports: The ports that can be used in testing.
+ """
+
name: str
hostname: str
user: str
@@ -123,57 +246,91 @@ class NodeConfiguration:
ports: list[PortConfig]
@staticmethod
- def from_dict(d: dict) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
- hugepage_config = d.get("hugepages")
- if hugepage_config:
- if "force_first_numa" not in hugepage_config:
- hugepage_config["force_first_numa"] = False
- hugepage_config = HugepageConfiguration(**hugepage_config)
-
- common_config = {
- "name": d["name"],
- "hostname": d["hostname"],
- "user": d["user"],
- "password": d.get("password"),
- "arch": Architecture(d["arch"]),
- "os": OS(d["os"]),
- "lcores": d.get("lcores", "1"),
- "use_first_core": d.get("use_first_core", False),
- "hugepages": hugepage_config,
- "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
- }
-
+ def from_dict(
+ d: NodeConfigDict,
+ ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
+ """A convenience method that processes the inputs before creating a specialized instance.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ Either an SUT or TG configuration instance.
+ """
+ hugepage_config = None
+ if "hugepages" in d:
+ hugepage_config_dict = d["hugepages"]
+ if "force_first_numa" not in hugepage_config_dict:
+ hugepage_config_dict["force_first_numa"] = False
+ hugepage_config = HugepageConfiguration(**hugepage_config_dict)
+
+ # The calls here contain duplicated code which is here because Mypy doesn't
+ # properly support dictionary unpacking with TypedDicts
if "traffic_generator" in d:
return TGNodeConfiguration(
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
traffic_generator=TrafficGeneratorConfig.from_dict(
d["traffic_generator"]
),
- **common_config,
)
else:
return SutNodeConfiguration(
- memory_channels=d.get("memory_channels", 1), **common_config
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
+ memory_channels=d.get("memory_channels", 1),
)
@dataclass(slots=True, frozen=True)
class SutNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+
+ Attributes:
+ memory_channels: The number of memory channels to use when running DPDK.
+ """
+
memory_channels: int
@dataclass(slots=True, frozen=True)
class TGNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
+
+ Attributes:
+ traffic_generator: The configuration of the traffic generator present on the TG node.
+ """
+
traffic_generator: ScapyTrafficGeneratorConfig
@dataclass(slots=True, frozen=True)
class NodeInfo:
- """Class to hold important versions within the node.
-
- This class, unlike the NodeConfiguration class, cannot be generated at the start.
- This is because we need to initialize a connection with the node before we can
- collect the information needed in this class. Therefore, it cannot be a part of
- the configuration class above.
+ """Supplemental node information.
+
+ Attributes:
+ os_name: The name of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ os_version: The version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ kernel_version: The kernel version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
"""
os_name: str
@@ -183,6 +340,20 @@ class NodeInfo:
@dataclass(slots=True, frozen=True)
class BuildTargetConfiguration:
+ """DPDK build configuration.
+
+ The configuration used for building DPDK.
+
+ Attributes:
+ arch: The target architecture to build for.
+ os: The target os to build for.
+ cpu: The target CPU to build for.
+ compiler: The compiler executable to use.
+ compiler_wrapper: This string will be put in front of the compiler when
+ executing the build. Useful for adding wrapper commands, such as ``ccache``.
+ name: The name of the compiler.
+ """
+
arch: Architecture
os: OS
cpu: CPUType
@@ -191,7 +362,18 @@ class BuildTargetConfiguration:
name: str
@staticmethod
- def from_dict(d: dict) -> "BuildTargetConfiguration":
+ def from_dict(d: BuildTargetConfigDict) -> "BuildTargetConfiguration":
+ r"""A convenience method that processes the inputs before creating an instance.
+
+ `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
+ `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The build target configuration instance.
+ """
return BuildTargetConfiguration(
arch=Architecture(d["arch"]),
os=OS(d["os"]),
@@ -204,23 +386,29 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
@dataclass(slots=True, frozen=True)
class BuildTargetInfo:
- """Class to hold important versions within the build target.
+ """Various versions and other information about a build target.
- This is very similar to the NodeInfo class, it just instead holds information
- for the build target.
+ Attributes:
+ dpdk_version: The DPDK version that was built.
+ compiler_version: The version of the compiler used to build DPDK.
"""
dpdk_version: str
compiler_version: str
-class TestSuiteConfigDict(TypedDict):
- suite: str
- cases: list[str]
-
-
@dataclass(slots=True, frozen=True)
class TestSuiteConfig:
+ """Test suite configuration.
+
+ Information about a single test suite to be executed.
+
+ Attributes:
+ test_suite: The name of the test suite module without the starting ``TestSuite_``.
+ test_cases: The names of test cases from this test suite to execute.
+ If empty, all test cases will be executed.
+ """
+
test_suite: str
test_cases: list[str]
@@ -228,6 +416,14 @@ class TestSuiteConfig:
def from_dict(
entry: str | TestSuiteConfigDict,
) -> "TestSuiteConfig":
+ """Create an instance from two different types.
+
+ Args:
+ entry: Either a suite name or a dictionary containing the config.
+
+ Returns:
+ The test suite configuration instance.
+ """
if isinstance(entry, str):
return TestSuiteConfig(test_suite=entry, test_cases=[])
elif isinstance(entry, dict):
@@ -238,19 +434,49 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class ExecutionConfiguration:
+ """The configuration of an execution.
+
+ The configuration contains testbed information, what tests to execute
+ and with what DPDK build.
+
+ Attributes:
+ build_targets: A list of DPDK builds to test.
+ perf: Whether to run performance tests.
+ func: Whether to run functional tests.
+ skip_smoke_tests: Whether to skip smoke tests.
+ test_suites: The names of test suites and/or test cases to execute.
+ system_under_test_node: The SUT node to use in this execution.
+ traffic_generator_node: The TG node to use in this execution.
+ vdevs: The names of virtual devices to test.
+ """
+
build_targets: list[BuildTargetConfiguration]
perf: bool
func: bool
+ skip_smoke_tests: bool
test_suites: list[TestSuiteConfig]
system_under_test_node: SutNodeConfiguration
traffic_generator_node: TGNodeConfiguration
vdevs: list[str]
- skip_smoke_tests: bool
@staticmethod
def from_dict(
- d: dict, node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]]
+ d: ExecutionConfigDict,
+ node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]],
) -> "ExecutionConfiguration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The build target and the test suite config is transformed into their respective objects.
+ SUT and TG configuration are taken from `node_map`. The other (:class:`bool`) attributes are
+ just stored.
+
+ Args:
+ d: The configuration dictionary.
+ node_map: A dictionary mapping node names to their config objects.
+
+ Returns:
+ The execution configuration instance.
+ """
build_targets: list[BuildTargetConfiguration] = list(
map(BuildTargetConfiguration.from_dict, d["build_targets"])
)
@@ -291,10 +517,31 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class Configuration:
+ """DTS testbed and test configuration.
+
+ The node configuration is not stored in this object. Rather, all used node configurations
+ are stored inside the execution configuration where the nodes are actually used.
+
+ Attributes:
+ executions: Execution configurations.
+ """
+
executions: list[ExecutionConfiguration]
@staticmethod
- def from_dict(d: dict) -> "Configuration":
+ def from_dict(d: ConfigurationDict) -> "Configuration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ Build target and test suite config is transformed into their respective objects.
+ SUT and TG configuration are taken from `node_map`. The other (:class:`bool`) attributes are
+ just stored.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The whole configuration instance.
+ """
nodes: list[Union[SutNodeConfiguration | TGNodeConfiguration]] = list(
map(NodeConfiguration.from_dict, d["nodes"])
)
@@ -313,9 +560,17 @@ def from_dict(d: dict) -> "Configuration":
def load_config() -> Configuration:
- """
- Loads the configuration file and the configuration file schema,
- validates the configuration file, and creates a configuration object.
+ """Load DTS test run configuration from a file.
+
+ Load the YAML test run configuration file
+ and :download:`the configuration file schema <conf_yaml_schema.json>`,
+ validate the test run configuration file, and create a test run configuration object.
+
+ The YAML test run configuration file is specified in the :option:`--config-file` command line
+ argument or the :envvar:`DTS_CFG_FILE` environment variable.
+
+ Returns:
+ The parsed test run configuration.
"""
with open(SETTINGS.config_file_path, "r") as f:
config_data = yaml.safe_load(f)
@@ -326,6 +581,8 @@ def load_config() -> Configuration:
with open(schema_path, "r") as f:
schema = json.load(f)
- config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
- config_obj: Configuration = Configuration.from_dict(dict(config))
+ config = warlock.model_factory(schema, name="_Config")(config_data)
+ config_obj: Configuration = Configuration.from_dict(
+ dict(config) # type: ignore[arg-type]
+ )
return config_obj
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
new file mode 100644
index 0000000000..1927910d88
--- /dev/null
+++ b/dts/framework/config/types.py
@@ -0,0 +1,132 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+"""Configuration dictionary contents specification.
+
+These type definitions serve as documentation of the configuration dictionary contents.
+
+The definitions use the built-in :class:`~typing.TypedDict` construct.
+"""
+
+from typing import TypedDict
+
+
+class PortConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ pci: str
+ #:
+ os_driver_for_dpdk: str
+ #:
+ os_driver: str
+ #:
+ peer_node: str
+ #:
+ peer_pci: str
+
+
+class TrafficGeneratorConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ type: str
+
+
+class HugepageConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ amount: int
+ #:
+ force_first_numa: bool
+
+
+class NodeConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ hugepages: HugepageConfigurationDict
+ #:
+ name: str
+ #:
+ hostname: str
+ #:
+ user: str
+ #:
+ password: str
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ lcores: str
+ #:
+ use_first_core: bool
+ #:
+ ports: list[PortConfigDict]
+ #:
+ memory_channels: int
+ #:
+ traffic_generator: TrafficGeneratorConfigDict
+
+
+class BuildTargetConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ cpu: str
+ #:
+ compiler: str
+ #:
+ compiler_wrapper: str
+
+
+class TestSuiteConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ suite: str
+ #:
+ cases: list[str]
+
+
+class ExecutionSUTConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ node_name: str
+ #:
+ vdevs: list[str]
+
+
+class ExecutionConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ build_targets: list[BuildTargetConfigDict]
+ #:
+ perf: bool
+ #:
+ func: bool
+ #:
+ skip_smoke_tests: bool
+ #:
+ test_suites: TestSuiteConfigDict
+ #:
+ system_under_test_node: ExecutionSUTConfigDict
+ #:
+ traffic_generator_node: str
+
+
+class ConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ nodes: list[NodeConfigDict]
+ #:
+ executions: list[ExecutionConfigDict]
--
2.34.1
* Re: [PATCH v7 10/21] dts: config docstring update
2023-11-15 13:09 ` [PATCH v7 10/21] dts: config " Juraj Linkeš
@ 2023-11-21 15:08 ` Yoan Picchi
2023-11-22 10:42 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-21 15:08 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/config/__init__.py | 371 ++++++++++++++++++++++++++-----
> dts/framework/config/types.py | 132 +++++++++++
> 2 files changed, 446 insertions(+), 57 deletions(-)
> create mode 100644 dts/framework/config/types.py
>
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index 2044c82611..0aa149a53d 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -3,8 +3,34 @@
> # Copyright(c) 2022-2023 University of New Hampshire
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> -"""
> -Yaml config parsing methods
> +"""Testbed configuration and test suite specification.
> +
> +This package offers classes that hold real-time information about the testbed, hold test run
> +configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
> +the YAML test run configuration file
> +and validates it according to :download:`the schema <conf_yaml_schema.json>`.
> +
> +The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
> +this package. The allowed keys and types inside this dictionary are defined in
> +the :doc:`types <framework.config.types>` module.
> +
> +The test run configuration has two main sections:
> +
> + * The :class:`ExecutionConfiguration` which defines what tests are going to be run
> + and how DPDK will be built. It also references the testbed where these tests and DPDK
> + are going to be run,
> + * The nodes of the testbed are defined in the other section,
> + a :class:`list` of :class:`NodeConfiguration` objects.
> +
> +The real-time information about testbed is supposed to be gathered at runtime.
> +
> +The classes defined in this package make heavy use of :mod:`dataclasses`.
> +All of them use slots and are frozen:
> +
> + * Slots enables some optimizations, by pre-allocating space for the defined
> + attributes in the underlying data structure,
> + * Frozen makes the object immutable. This enables further optimizations,
> + and makes it thread safe should we every want to move in that direction.
every -> ever ?
> """
>
> import json
> @@ -12,11 +38,20 @@
> import pathlib
> from dataclasses import dataclass
> from enum import auto, unique
> -from typing import Any, TypedDict, Union
> +from typing import Union
>
> import warlock # type: ignore[import]
> import yaml
>
> +from framework.config.types import (
> + BuildTargetConfigDict,
> + ConfigurationDict,
> + ExecutionConfigDict,
> + NodeConfigDict,
> + PortConfigDict,
> + TestSuiteConfigDict,
> + TrafficGeneratorConfigDict,
> +)
> from framework.exception import ConfigurationError
> from framework.settings import SETTINGS
> from framework.utils import StrEnum
> @@ -24,55 +59,97 @@
>
> @unique
> class Architecture(StrEnum):
> + r"""The supported architectures of :class:`~framework.testbed_model.node.Node`\s."""
> +
> + #:
> i686 = auto()
> + #:
> x86_64 = auto()
> + #:
> x86_32 = auto()
> + #:
> arm64 = auto()
> + #:
> ppc64le = auto()
>
>
> @unique
> class OS(StrEnum):
> + r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
> +
> + #:
> linux = auto()
> + #:
> freebsd = auto()
> + #:
> windows = auto()
>
>
> @unique
> class CPUType(StrEnum):
> + r"""The supported CPUs of :class:`~framework.testbed_model.node.Node`\s."""
> +
> + #:
> native = auto()
> + #:
> armv8a = auto()
> + #:
> dpaa2 = auto()
> + #:
> thunderx = auto()
> + #:
> xgene1 = auto()
>
>
> @unique
> class Compiler(StrEnum):
> + r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
> +
> + #:
> gcc = auto()
> + #:
> clang = auto()
> + #:
> icc = auto()
> + #:
> msvc = auto()
>
>
> @unique
> class TrafficGeneratorType(StrEnum):
> + """The supported traffic generators."""
> +
> + #:
> SCAPY = auto()
>
>
> -# Slots enables some optimizations, by pre-allocating space for the defined
> -# attributes in the underlying data structure.
> -#
> -# Frozen makes the object immutable. This enables further optimizations,
> -# and makes it thread safe should we every want to move in that direction.
> @dataclass(slots=True, frozen=True)
> class HugepageConfiguration:
> + r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
> +
> + Attributes:
> + amount: The number of hugepages.
> + force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
> + """
> +
> amount: int
> force_first_numa: bool
>
>
> @dataclass(slots=True, frozen=True)
> class PortConfig:
> + r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
> +
> + Attributes:
> + node: The :class:`~framework.testbed_model.node.Node` where this port exists.
> + pci: The PCI address of the port.
> + os_driver_for_dpdk: The operating system driver name for use with DPDK.
> + os_driver: The operating system driver name when the operating system controls the port.
> + peer_node: The :class:`~framework.testbed_model.node.Node` of the port
> + connected to this port.
> + peer_pci: The PCI address of the port connected to this port.
> + """
> +
> node: str
> pci: str
> os_driver_for_dpdk: str
> @@ -81,18 +158,44 @@ class PortConfig:
> peer_pci: str
>
> @staticmethod
> - def from_dict(node: str, d: dict) -> "PortConfig":
> + def from_dict(node: str, d: PortConfigDict) -> "PortConfig":
> + """A convenience method that creates the object from fewer inputs.
> +
> + Args:
> + node: The node where this port exists.
> + d: The configuration dictionary.
> +
> + Returns:
> + The port configuration instance.
> + """
> return PortConfig(node=node, **d)
>
>
> @dataclass(slots=True, frozen=True)
> class TrafficGeneratorConfig:
> + """The configuration of traffic generators.
> +
> + The class will be expanded when more configuration is needed.
> +
> + Attributes:
> + traffic_generator_type: The type of the traffic generator.
> + """
> +
> traffic_generator_type: TrafficGeneratorType
>
> @staticmethod
> - def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
> - # This looks useless now, but is designed to allow expansion to traffic
> - # generators that require more configuration later.
> + def from_dict(d: TrafficGeneratorConfigDict) -> "ScapyTrafficGeneratorConfig":
> + """A convenience method that produces traffic generator config of the proper type.
> +
> + Args:
> + d: The configuration dictionary.
> +
> + Returns:
> + The traffic generator configuration instance.
> +
> + Raises:
> + ConfigurationError: An unknown traffic generator type was encountered.
> + """
> match TrafficGeneratorType(d["type"]):
> case TrafficGeneratorType.SCAPY:
> return ScapyTrafficGeneratorConfig(
> @@ -106,11 +209,31 @@ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
>
> @dataclass(slots=True, frozen=True)
> class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
> + """Scapy traffic generator specific configuration."""
> +
> pass
>
>
> @dataclass(slots=True, frozen=True)
> class NodeConfiguration:
> + r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
> +
> + Attributes:
> + name: The name of the :class:`~framework.testbed_model.node.Node`.
> + hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
> + Can be an IP or a domain name.
> + user: The name of the user used to connect to
> + the :class:`~framework.testbed_model.node.Node`.
> + password: The password of the user. The use of passwords is heavily discouraged.
> + Please use keys instead.
> + arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
> + os: The operating system of the :class:`~framework.testbed_model.node.Node`.
> + lcores: A comma delimited list of logical cores to use when running DPDK.
> + use_first_core: If :data:`True`, the first logical core won't be used.
> + hugepages: An optional hugepage configuration.
> + ports: The ports that can be used in testing.
> + """
> +
> name: str
> hostname: str
> user: str
> @@ -123,57 +246,91 @@ class NodeConfiguration:
> ports: list[PortConfig]
>
> @staticmethod
> - def from_dict(d: dict) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
> - hugepage_config = d.get("hugepages")
> - if hugepage_config:
> - if "force_first_numa" not in hugepage_config:
> - hugepage_config["force_first_numa"] = False
> - hugepage_config = HugepageConfiguration(**hugepage_config)
> -
> - common_config = {
> - "name": d["name"],
> - "hostname": d["hostname"],
> - "user": d["user"],
> - "password": d.get("password"),
> - "arch": Architecture(d["arch"]),
> - "os": OS(d["os"]),
> - "lcores": d.get("lcores", "1"),
> - "use_first_core": d.get("use_first_core", False),
> - "hugepages": hugepage_config,
> - "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> - }
> -
> + def from_dict(
> + d: NodeConfigDict,
> + ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
> + """A convenience method that processes the inputs before creating a specialized instance.
> +
> + Args:
> + d: The configuration dictionary.
> +
> + Returns:
> + Either an SUT or TG configuration instance.
> + """
> + hugepage_config = None
> + if "hugepages" in d:
> + hugepage_config_dict = d["hugepages"]
> + if "force_first_numa" not in hugepage_config_dict:
> + hugepage_config_dict["force_first_numa"] = False
> + hugepage_config = HugepageConfiguration(**hugepage_config_dict)
> +
> + # The calls here contain duplicated code which is here because Mypy doesn't
> + # properly support dictionary unpacking with TypedDicts
> if "traffic_generator" in d:
> return TGNodeConfiguration(
> + name=d["name"],
> + hostname=d["hostname"],
> + user=d["user"],
> + password=d.get("password"),
> + arch=Architecture(d["arch"]),
> + os=OS(d["os"]),
> + lcores=d.get("lcores", "1"),
> + use_first_core=d.get("use_first_core", False),
> + hugepages=hugepage_config,
> + ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> traffic_generator=TrafficGeneratorConfig.from_dict(
> d["traffic_generator"]
> ),
> - **common_config,
> )
> else:
> return SutNodeConfiguration(
> - memory_channels=d.get("memory_channels", 1), **common_config
> + name=d["name"],
> + hostname=d["hostname"],
> + user=d["user"],
> + password=d.get("password"),
> + arch=Architecture(d["arch"]),
> + os=OS(d["os"]),
> + lcores=d.get("lcores", "1"),
> + use_first_core=d.get("use_first_core", False),
> + hugepages=hugepage_config,
> + ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
> + memory_channels=d.get("memory_channels", 1),
> )
>
>
> @dataclass(slots=True, frozen=True)
> class SutNodeConfiguration(NodeConfiguration):
> + """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
> +
> + Attributes:
> + memory_channels: The number of memory channels to use when running DPDK.
> + """
> +
> memory_channels: int
>
>
> @dataclass(slots=True, frozen=True)
> class TGNodeConfiguration(NodeConfiguration):
> + """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
> +
> + Attributes:
> + traffic_generator: The configuration of the traffic generator present on the TG node.
> + """
> +
> traffic_generator: ScapyTrafficGeneratorConfig
>
>
> @dataclass(slots=True, frozen=True)
> class NodeInfo:
> - """Class to hold important versions within the node.
> -
> - This class, unlike the NodeConfiguration class, cannot be generated at the start.
> - This is because we need to initialize a connection with the node before we can
> - collect the information needed in this class. Therefore, it cannot be a part of
> - the configuration class above.
> + """Supplemental node information.
> +
> + Attributes:
> + os_name: The name of the running operating system of
> + the :class:`~framework.testbed_model.node.Node`.
> + os_version: The version of the running operating system of
> + the :class:`~framework.testbed_model.node.Node`.
> + kernel_version: The kernel version of the running operating system of
> + the :class:`~framework.testbed_model.node.Node`.
> """
>
> os_name: str
> @@ -183,6 +340,20 @@ class NodeInfo:
>
> @dataclass(slots=True, frozen=True)
> class BuildTargetConfiguration:
> + """DPDK build configuration.
> +
> + The configuration used for building DPDK.
> +
> + Attributes:
> + arch: The target architecture to build for.
> + os: The target os to build for.
> + cpu: The target CPU to build for.
> + compiler: The compiler executable to use.
> + compiler_wrapper: This string will be put in front of the compiler when
> + executing the build. Useful for adding wrapper commands, such as ``ccache``.
> + name: The name of the compiler.
> + """
> +
> arch: Architecture
> os: OS
> cpu: CPUType
> @@ -191,7 +362,18 @@ class BuildTargetConfiguration:
> name: str
>
> @staticmethod
> - def from_dict(d: dict) -> "BuildTargetConfiguration":
> + def from_dict(d: BuildTargetConfigDict) -> "BuildTargetConfiguration":
> + r"""A convenience method that processes the inputs before creating an instance.
> +
> + `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
> + `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
> +
> + Args:
> + d: The configuration dictionary.
> +
> + Returns:
> + The build target configuration instance.
> + """
> return BuildTargetConfiguration(
> arch=Architecture(d["arch"]),
> os=OS(d["os"]),
> @@ -204,23 +386,29 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
>
> @dataclass(slots=True, frozen=True)
> class BuildTargetInfo:
> - """Class to hold important versions within the build target.
> + """Various versions and other information about a build target.
>
> - This is very similar to the NodeInfo class, it just instead holds information
> - for the build target.
> + Attributes:
> + dpdk_version: The DPDK version that was built.
> + compiler_version: The version of the compiler used to build DPDK.
> """
>
> dpdk_version: str
> compiler_version: str
>
>
> -class TestSuiteConfigDict(TypedDict):
> - suite: str
> - cases: list[str]
> -
> -
> @dataclass(slots=True, frozen=True)
> class TestSuiteConfig:
> + """Test suite configuration.
> +
> + Information about a single test suite to be executed.
> +
> + Attributes:
> + test_suite: The name of the test suite module without the starting ``TestSuite_``.
> + test_cases: The names of test cases from this test suite to execute.
> + If empty, all test cases will be executed.
> + """
> +
> test_suite: str
> test_cases: list[str]
>
> @@ -228,6 +416,14 @@ class TestSuiteConfig:
> def from_dict(
> entry: str | TestSuiteConfigDict,
> ) -> "TestSuiteConfig":
> + """Create an instance from two different types.
> +
> + Args:
> + entry: Either a suite name or a dictionary containing the config.
> +
> + Returns:
> + The test suite configuration instance.
> + """
> if isinstance(entry, str):
> return TestSuiteConfig(test_suite=entry, test_cases=[])
> elif isinstance(entry, dict):
> @@ -238,19 +434,49 @@ def from_dict(
>
> @dataclass(slots=True, frozen=True)
> class ExecutionConfiguration:
> + """The configuration of an execution.
> +
> + The configuration contains testbed information, what tests to execute
> + and with what DPDK build.
> +
> + Attributes:
> + build_targets: A list of DPDK builds to test.
> + perf: Whether to run performance tests.
> + func: Whether to run functional tests.
> + skip_smoke_tests: Whether to skip smoke tests.
> + test_suites: The names of test suites and/or test cases to execute.
> + system_under_test_node: The SUT node to use in this execution.
> + traffic_generator_node: The TG node to use in this execution.
> + vdevs: The names of virtual devices to test.
> + """
> +
> build_targets: list[BuildTargetConfiguration]
> perf: bool
> func: bool
> + skip_smoke_tests: bool
> test_suites: list[TestSuiteConfig]
> system_under_test_node: SutNodeConfiguration
> traffic_generator_node: TGNodeConfiguration
> vdevs: list[str]
> - skip_smoke_tests: bool
>
> @staticmethod
> def from_dict(
> - d: dict, node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]]
> + d: ExecutionConfigDict,
> + node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]],
> ) -> "ExecutionConfiguration":
> + """A convenience method that processes the inputs before creating an instance.
> +
> + The build target and the test suite config is transformed into their respective objects.
is -> are
> + SUT and TG configuration are taken from `node_map`. The other (:class:`bool`) attributes are
configuration*s*
> + just stored.
> +
> + Args:
> + d: The configuration dictionary.
> + node_map: A dictionary mapping node names to their config objects.
> +
> + Returns:
> + The execution configuration instance.
> + """
> build_targets: list[BuildTargetConfiguration] = list(
> map(BuildTargetConfiguration.from_dict, d["build_targets"])
> )
> @@ -291,10 +517,31 @@ def from_dict(
>
> @dataclass(slots=True, frozen=True)
> class Configuration:
> + """DTS testbed and test configuration.
> +
> + The node configuration is not stored in this object. Rather, all used node configurations
> + are stored inside the execution configuration where the nodes are actually used.
> +
> + Attributes:
> + executions: Execution configurations.
> + """
> +
> executions: list[ExecutionConfiguration]
>
> @staticmethod
> - def from_dict(d: dict) -> "Configuration":
> + def from_dict(d: ConfigurationDict) -> "Configuration":
> + """A convenience method that processes the inputs before creating an instance.
> +
> + Build target and test suite config is transformed into their respective objects.
is -> are
> + SUT and TG configuration are taken from `node_map`. The other (:class:`bool`) attributes are
configuration*s*
> + just stored.
> +
> + Args:
> + d: The configuration dictionary.
> +
> + Returns:
> + The whole configuration instance.
> + """
> nodes: list[Union[SutNodeConfiguration | TGNodeConfiguration]] = list(
> map(NodeConfiguration.from_dict, d["nodes"])
> )
> @@ -313,9 +560,17 @@ def from_dict(d: dict) -> "Configuration":
>
>
> def load_config() -> Configuration:
> - """
> - Loads the configuration file and the configuration file schema,
> - validates the configuration file, and creates a configuration object.
> + """Load DTS test run configuration from a file.
> +
> + Load the YAML test run configuration file
> + and :download:`the configuration file schema <conf_yaml_schema.json>`,
> + validate the test run configuration file, and create a test run configuration object.
> +
> + The YAML test run configuration file is specified in the :option:`--config-file` command line
> + argument or the :envvar:`DTS_CFG_FILE` environment variable.
> +
> + Returns:
> + The parsed test run configuration.
> """
> with open(SETTINGS.config_file_path, "r") as f:
> config_data = yaml.safe_load(f)
> @@ -326,6 +581,8 @@ def load_config() -> Configuration:
>
> with open(schema_path, "r") as f:
> schema = json.load(f)
> - config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
> - config_obj: Configuration = Configuration.from_dict(dict(config))
> + config = warlock.model_factory(schema, name="_Config")(config_data)
> + config_obj: Configuration = Configuration.from_dict(
> + dict(config) # type: ignore[arg-type]
> + )
> return config_obj
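For context, a stdlib-only stand-in for the load → validate → objectify flow in `load_config` — real DTS parses YAML and validates with warlock against the JSON schema; `json.loads` and the explicit key check below are simplified substitutes:

```python
import json

# Simplified stand-in: parse a config document, "validate" it, then hand
# the resulting dict to the object layer (Configuration.from_dict in DTS).
raw = '{"executions": [{"perf": false, "func": true}]}'
config_data = json.loads(raw)
if "executions" not in config_data:  # schema validation stand-in
    raise ValueError("invalid test run configuration")
print(len(config_data["executions"]))  # 1
```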
> diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
> new file mode 100644
> index 0000000000..1927910d88
> --- /dev/null
> +++ b/dts/framework/config/types.py
> @@ -0,0 +1,132 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +"""Configuration dictionary contents specification.
> +
> +These type definitions serve as documentation of the configuration dictionary contents.
> +
> +The definitions use the built-in :class:`~typing.TypedDict` construct.
> +"""
> +
> +from typing import TypedDict
> +
> +
> +class PortConfigDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + pci: str
> + #:
> + os_driver_for_dpdk: str
> + #:
> + os_driver: str
> + #:
> + peer_node: str
> + #:
> + peer_pci: str
> +
> +
> +class TrafficGeneratorConfigDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + type: str
> +
> +
> +class HugepageConfigurationDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + amount: int
> + #:
> + force_first_numa: bool
> +
> +
> +class NodeConfigDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + hugepages: HugepageConfigurationDict
> + #:
> + name: str
> + #:
> + hostname: str
> + #:
> + user: str
> + #:
> + password: str
> + #:
> + arch: str
> + #:
> + os: str
> + #:
> + lcores: str
> + #:
> + use_first_core: bool
> + #:
> + ports: list[PortConfigDict]
> + #:
> + memory_channels: int
> + #:
> + traffic_generator: TrafficGeneratorConfigDict
> +
> +
> +class BuildTargetConfigDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + arch: str
> + #:
> + os: str
> + #:
> + cpu: str
> + #:
> + compiler: str
> + #:
> + compiler_wrapper: str
> +
> +
> +class TestSuiteConfigDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + suite: str
> + #:
> + cases: list[str]
> +
> +
> +class ExecutionSUTConfigDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + node_name: str
> + #:
> + vdevs: list[str]
> +
> +
> +class ExecutionConfigDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + build_targets: list[BuildTargetConfigDict]
> + #:
> + perf: bool
> + #:
> + func: bool
> + #:
> + skip_smoke_tests: bool
> + #:
> + test_suites: TestSuiteConfigDict
> + #:
> + system_under_test_node: ExecutionSUTConfigDict
> + #:
> + traffic_generator_node: str
> +
> +
> +class ConfigurationDict(TypedDict):
> + """Allowed keys and values."""
> +
> + #:
> + nodes: list[NodeConfigDict]
> + #:
> + executions: list[ExecutionConfigDict]
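Worth noting about the `types` module as a whole: TypedDicts constrain keys and value types for static checkers only — at runtime the value is an ordinary dict with no validation attached. A small hypothetical example:

```python
from typing import TypedDict


class _PortDict(TypedDict):
    # Hypothetical subset of PortConfigDict.
    pci: str
    os_driver: str


# The annotation documents and type-checks the shape, but the object itself
# is a plain dict; runtime validation still has to come from the JSON schema.
port: _PortDict = {"pci": "0000:00:08.0", "os_driver": "i40e"}
print(type(port) is dict)  # True
```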
* Re: [PATCH v7 10/21] dts: config docstring update
2023-11-21 15:08 ` Yoan Picchi
@ 2023-11-22 10:42 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 10:42 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
Thanks, Yoan, I'll make these changes in v8.
On Tue, Nov 21, 2023 at 4:08 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
> > + `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
> > +
> > + Args:
> > + d: The configuration dictionary.
> > +
> > + Returns:
> > + The build target configuration instance.
> > + """
> > return BuildTargetConfiguration(
> > arch=Architecture(d["arch"]),
> > os=OS(d["os"]),
> > @@ -204,23 +386,29 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
> >
> > @dataclass(slots=True, frozen=True)
> > class BuildTargetInfo:
> > - """Class to hold important versions within the build target.
> > + """Various versions and other information about a build target.
> >
> > - This is very similar to the NodeInfo class, it just instead holds information
> > - for the build target.
> > + Attributes:
> > + dpdk_version: The DPDK version that was built.
> > + compiler_version: The version of the compiler used to build DPDK.
> > """
> >
> > dpdk_version: str
> > compiler_version: str
> >
> >
> > -class TestSuiteConfigDict(TypedDict):
> > - suite: str
> > - cases: list[str]
> > -
> > -
> > @dataclass(slots=True, frozen=True)
> > class TestSuiteConfig:
> > + """Test suite configuration.
> > +
> > + Information about a single test suite to be executed.
> > +
> > + Attributes:
> > + test_suite: The name of the test suite module without the starting ``TestSuite_``.
> > + test_cases: The names of test cases from this test suite to execute.
> > + If empty, all test cases will be executed.
> > + """
> > +
> > test_suite: str
> > test_cases: list[str]
> >
> > @@ -228,6 +416,14 @@ class TestSuiteConfig:
> > def from_dict(
> > entry: str | TestSuiteConfigDict,
> > ) -> "TestSuiteConfig":
> > + """Create an instance from two different types.
> > +
> > + Args:
> > + entry: Either a suite name or a dictionary containing the config.
> > +
> > + Returns:
> > + The test suite configuration instance.
> > + """
> > if isinstance(entry, str):
> > return TestSuiteConfig(test_suite=entry, test_cases=[])
> > elif isinstance(entry, dict):
> > @@ -238,19 +434,49 @@ def from_dict(
> >
> > @dataclass(slots=True, frozen=True)
> > class ExecutionConfiguration:
> > + """The configuration of an execution.
> > +
> > + The configuration contains testbed information, what tests to execute
> > + and with what DPDK build.
> > +
> > + Attributes:
> > + build_targets: A list of DPDK builds to test.
> > + perf: Whether to run performance tests.
> > + func: Whether to run functional tests.
> > + skip_smoke_tests: Whether to skip smoke tests.
> > + test_suites: The names of test suites and/or test cases to execute.
> > + system_under_test_node: The SUT node to use in this execution.
> > + traffic_generator_node: The TG node to use in this execution.
> > + vdevs: The names of virtual devices to test.
> > + """
> > +
> > build_targets: list[BuildTargetConfiguration]
> > perf: bool
> > func: bool
> > + skip_smoke_tests: bool
> > test_suites: list[TestSuiteConfig]
> > system_under_test_node: SutNodeConfiguration
> > traffic_generator_node: TGNodeConfiguration
> > vdevs: list[str]
> > - skip_smoke_tests: bool
> >
> > @staticmethod
> > def from_dict(
> > - d: dict, node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]]
> > + d: ExecutionConfigDict,
> > + node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]],
> > ) -> "ExecutionConfiguration":
> > + """A convenience method that processes the inputs before creating an instance.
> > +
> > + The build target and the test suite config is transformed into their respective objects.
>
> is -> are
>
> > + SUT and TG configuration are taken from `node_map`. The other (:class:`bool`) attributes are
>
> configuration*s*
>
> > + just stored.
> > +
> > + Args:
> > + d: The configuration dictionary.
> > + node_map: A dictionary mapping node names to their config objects.
> > +
> > + Returns:
> > + The execution configuration instance.
> > + """
> > build_targets: list[BuildTargetConfiguration] = list(
> > map(BuildTargetConfiguration.from_dict, d["build_targets"])
> > )
> > @@ -291,10 +517,31 @@ def from_dict(
> >
> > @dataclass(slots=True, frozen=True)
> > class Configuration:
> > + """DTS testbed and test configuration.
> > +
> > + The node configuration is not stored in this object. Rather, all used node configurations
> > + are stored inside the execution configuration where the nodes are actually used.
> > +
> > + Attributes:
> > + executions: Execution configurations.
> > + """
> > +
> > executions: list[ExecutionConfiguration]
> >
> > @staticmethod
> > - def from_dict(d: dict) -> "Configuration":
> > + def from_dict(d: ConfigurationDict) -> "Configuration":
> > + """A convenience method that processes the inputs before creating an instance.
> > +
> > + Build target and test suite config is transformed into their respective objects.
>
> is -> are
>
> > + SUT and TG configuration are taken from `node_map`. The other (:class:`bool`) attributes are
>
> configuration*s*
>
> > + just stored.
> > +
> > + Args:
> > + d: The configuration dictionary.
> > +
> > + Returns:
> > + The whole configuration instance.
> > + """
> > nodes: list[Union[SutNodeConfiguration | TGNodeConfiguration]] = list(
> > map(NodeConfiguration.from_dict, d["nodes"])
> > )
> > @@ -313,9 +560,17 @@ def from_dict(d: dict) -> "Configuration":
> >
> >
> > def load_config() -> Configuration:
> > - """
> > - Loads the configuration file and the configuration file schema,
> > - validates the configuration file, and creates a configuration object.
> > + """Load DTS test run configuration from a file.
> > +
> > + Load the YAML test run configuration file
> > + and :download:`the configuration file schema <conf_yaml_schema.json>`,
> > + validate the test run configuration file, and create a test run configuration object.
> > +
> > + The YAML test run configuration file is specified in the :option:`--config-file` command line
> > + argument or the :envvar:`DTS_CFG_FILE` environment variable.
> > +
> > + Returns:
> > + The parsed test run configuration.
> > """
> > with open(SETTINGS.config_file_path, "r") as f:
> > config_data = yaml.safe_load(f)
> > @@ -326,6 +581,8 @@ def load_config() -> Configuration:
> >
> > with open(schema_path, "r") as f:
> > schema = json.load(f)
> > - config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
> > - config_obj: Configuration = Configuration.from_dict(dict(config))
> > + config = warlock.model_factory(schema, name="_Config")(config_data)
> > + config_obj: Configuration = Configuration.from_dict(
> > + dict(config) # type: ignore[arg-type]
> > + )
> > return config_obj
> > diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
> > new file mode 100644
> > index 0000000000..1927910d88
> > --- /dev/null
> > +++ b/dts/framework/config/types.py
> > @@ -0,0 +1,132 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +
> > +"""Configuration dictionary contents specification.
> > +
> > +These type definitions serve as documentation of the configuration dictionary contents.
> > +
> > +The definitions use the built-in :class:`~typing.TypedDict` construct.
> > +"""
> > +
> > +from typing import TypedDict
> > +
> > +
> > +class PortConfigDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + pci: str
> > + #:
> > + os_driver_for_dpdk: str
> > + #:
> > + os_driver: str
> > + #:
> > + peer_node: str
> > + #:
> > + peer_pci: str
> > +
> > +
> > +class TrafficGeneratorConfigDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + type: str
> > +
> > +
> > +class HugepageConfigurationDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + amount: int
> > + #:
> > + force_first_numa: bool
> > +
> > +
> > +class NodeConfigDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + hugepages: HugepageConfigurationDict
> > + #:
> > + name: str
> > + #:
> > + hostname: str
> > + #:
> > + user: str
> > + #:
> > + password: str
> > + #:
> > + arch: str
> > + #:
> > + os: str
> > + #:
> > + lcores: str
> > + #:
> > + use_first_core: bool
> > + #:
> > + ports: list[PortConfigDict]
> > + #:
> > + memory_channels: int
> > + #:
> > + traffic_generator: TrafficGeneratorConfigDict
> > +
> > +
> > +class BuildTargetConfigDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + arch: str
> > + #:
> > + os: str
> > + #:
> > + cpu: str
> > + #:
> > + compiler: str
> > + #:
> > + compiler_wrapper: str
> > +
> > +
> > +class TestSuiteConfigDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + suite: str
> > + #:
> > + cases: list[str]
> > +
> > +
> > +class ExecutionSUTConfigDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + node_name: str
> > + #:
> > + vdevs: list[str]
> > +
> > +
> > +class ExecutionConfigDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + build_targets: list[BuildTargetConfigDict]
> > + #:
> > + perf: bool
> > + #:
> > + func: bool
> > + #:
> > + skip_smoke_tests: bool
> > + #:
> > + test_suites: TestSuiteConfigDict
> > + #:
> > + system_under_test_node: ExecutionSUTConfigDict
> > + #:
> > + traffic_generator_node: str
> > +
> > +
> > +class ConfigurationDict(TypedDict):
> > + """Allowed keys and values."""
> > +
> > + #:
> > + nodes: list[NodeConfigDict]
> > + #:
> > + executions: list[ExecutionConfigDict]
>
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v7 11/21] dts: remote session docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (9 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 10/21] dts: config " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-21 15:36 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 12/21] dts: interactive " Juraj Linkeš
` (10 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/__init__.py | 39 +++++-
.../remote_session/remote_session.py | 128 +++++++++++++-----
dts/framework/remote_session/ssh_session.py | 16 +--
3 files changed, 135 insertions(+), 48 deletions(-)
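For readers unfamiliar with the target style, a minimal Google-format docstring with the PEP257 conventions this series applies might look like the following; the function and its contents are hypothetical, not taken from DTS:

```python
def send_command(command: str, timeout: float = 15) -> str:
    """Run `command` remotely and return its standard output.

    The one-line summary above and the named sections below follow the
    Google docstring format; PEP257 supplies the surrounding conventions,
    such as putting the closing quotes of a multi-line docstring on
    their own line.

    Args:
        command: The command to execute.
        timeout: Wait at most this long in seconds for the command to finish.

    Returns:
        The standard output of the command.

    Raises:
        TimeoutError: If the command did not finish within `timeout` seconds.
    """
    raise NotImplementedError
```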
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 5e7ddb2b05..51a01d6b5e 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -2,12 +2,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-The package provides modules for managing remote connections to a remote host (node),
-differentiated by OS.
-The package provides a factory function, create_session, that returns the appropriate
-remote connection based on the passed configuration. The differences are in the
-underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
+"""Remote interactive and non-interactive sessions.
+
+This package provides modules for managing remote connections to a remote host (node).
+
+The non-interactive sessions send commands and return their output and exit code.
+
+The interactive sessions open an interactive shell which is continuously open,
+allowing it to send and receive data within that particular shell.
"""
# pylama:ignore=W0611
@@ -26,10 +28,35 @@
def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> RemoteSession:
+ """Factory for non-interactive remote sessions.
+
+ The function returns an SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The SSH remote session.
+ """
return SSHSession(node_config, name, logger)
def create_interactive_session(
node_config: NodeConfiguration, logger: DTSLOG
) -> InteractiveRemoteSession:
+ """Factory for interactive remote sessions.
+
+ The function returns an interactive SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The interactive SSH remote session.
+ """
return InteractiveRemoteSession(node_config, logger)
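The two factories above share one shape: callers receive the abstract type and never name the concrete class, so adding support for another protocol later only means another branch inside the factory. A self-contained sketch of that shape, with all classes hypothetical stand-ins:

```python
class RemoteSession:
    """Stand-in for the abstract session type returned to callers."""


class SSHSession(RemoteSession):
    """Stand-in for the concrete SSH implementation."""

    def __init__(self, name: str) -> None:
        self.name = name


def create_remote_session(name: str, protocol: str = "ssh") -> RemoteSession:
    # Only the factory knows the concrete classes; call sites depend
    # solely on the RemoteSession return type.
    if protocol == "ssh":
        return SSHSession(name)
    raise ValueError(f"unsupported protocol: {protocol}")


session = create_remote_session("sut_session")
assert isinstance(session, SSHSession)
```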
diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
index 0647d93de4..629c2d7b9c 100644
--- a/dts/framework/remote_session/remote_session.py
+++ b/dts/framework/remote_session/remote_session.py
@@ -3,6 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Base remote session.
+
+This module contains the abstract base class for remote sessions and defines
+the structure of the result of a command execution.
+"""
+
+
import dataclasses
from abc import ABC, abstractmethod
from pathlib import PurePath
@@ -15,8 +22,14 @@
@dataclasses.dataclass(slots=True, frozen=True)
class CommandResult:
- """
- The result of remote execution of a command.
+ """The result of remote execution of a command.
+
+ Attributes:
+ name: The name of the session that executed the command.
+ command: The executed command.
+ stdout: The standard output the command produced.
+ stderr: The standard error output the command produced.
+ return_code: The return code the command exited with.
"""
name: str
@@ -26,6 +39,7 @@ class CommandResult:
return_code: int
def __str__(self) -> str:
+ """Format the command outputs."""
return (
f"stdout: '{self.stdout}'\n"
f"stderr: '{self.stderr}'\n"
@@ -34,13 +48,24 @@ def __str__(self) -> str:
class RemoteSession(ABC):
- """
- The base class for defining which methods must be implemented in order to connect
- to a remote host (node) and maintain a remote session. The derived classes are
- supposed to implement/use some underlying transport protocol (e.g. SSH) to
- implement the methods. On top of that, it provides some basic services common to
- all derived classes, such as keeping history and logging what's being executed
- on the remote node.
+ """Non-interactive remote session.
+
+ The abstract methods must be implemented in order to connect to a remote host (node)
+ and maintain a remote session.
+ The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
+ to implement the methods. On top of that, it provides some basic services common to all
+ subclasses, such as keeping history and logging what's being executed on the remote node.
+
+ Attributes:
+ name: The name of the session.
+ hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
+ or a domain name.
+ ip: The IP address of the node or a domain name, whichever was used in `hostname`.
+ port: The port of the node, if given in `hostname`.
+ username: The username used in the connection.
+ password: The password used in the connection. Most frequently empty,
+ as the use of passwords is discouraged.
+ history: The executed commands during this session.
"""
name: str
@@ -59,6 +84,16 @@ def __init__(
session_name: str,
logger: DTSLOG,
):
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ session_name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Raises:
+ SSHConnectionError: If the connection to the node was not successful.
+ """
self._node_config = node_config
self.name = session_name
@@ -79,8 +114,13 @@ def __init__(
@abstractmethod
def _connect(self) -> None:
- """
- Create connection to assigned node.
+ """Create a connection to the node.
+
+ The implementation must assign the established session to self.session.
+
+ The implementation must except all exceptions and convert them to an SSHConnectionError.
+
+ The implementation may optionally implement retry attempts.
"""
def send_command(
@@ -90,11 +130,24 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- Send a command to the connected node using optional env vars
- and return CommandResult.
- If verify is True, check the return code of the executed command
- and raise a RemoteCommandExecutionError if the command failed.
+ """Send `command` to the connected node.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds to execute `command`.
+ verify: If :data:`True`, will check the exit code of `command`.
+ env: A dictionary with environment variables to be used with `command` execution.
+
+ Raises:
+ SSHSessionDeadError: If the session isn't alive when sending `command`.
+ SSHTimeoutError: If `command` execution timed out.
+ RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
+
+ Returns:
+ The output of the command along with the return code.
"""
self._logger.info(
f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else "")
@@ -115,29 +168,36 @@ def send_command(
def _send_command(
self, command: str, timeout: float, env: dict | None
) -> CommandResult:
- """
- Use the underlying protocol to execute the command using optional env vars
- and return CommandResult.
+ """Send a command to the connected node.
+
+ The implementation must execute the command remotely with `env` environment variables
+ and return the result.
+
+ The implementation must except all exceptions and raise an SSHSessionDeadError if
+ the session is not alive and an SSHTimeoutError if the command execution times out.
"""
def close(self, force: bool = False) -> None:
- """
- Close the remote session and free all used resources.
+ """Close the remote session and free all used resources.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
"""
self._logger.logger_exit()
self._close(force)
@abstractmethod
def _close(self, force: bool = False) -> None:
- """
- Execute protocol specific steps needed to close the session properly.
+ """Protocol specific steps needed to close the session properly.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
+ This doesn't have to be implemented in the overloaded method.
"""
@abstractmethod
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the remote session is still responding."""
@abstractmethod
def copy_from(
@@ -147,12 +207,12 @@ def copy_from(
) -> None:
"""Copy a file from the remote Node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote Node associated with this remote session
+ to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
- destination_file: a file or directory path on the local filesystem.
+ source_file: The file on the remote Node.
+ destination_file: A file or directory path on the local filesystem.
"""
@abstractmethod
@@ -163,10 +223,10 @@ def copy_to(
) -> None:
"""Copy a file from local filesystem to the remote Node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file` on the remote Node
+ associated with this remote session.
Args:
- source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ source_file: The file on the local filesystem.
+ destination_file: A file or directory path on the remote Node.
"""
diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
index cee11d14d6..7186490a9a 100644
--- a/dts/framework/remote_session/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -1,6 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""SSH session remote session."""
+
import socket
import traceback
from pathlib import PurePath
@@ -26,13 +28,8 @@
class SSHSession(RemoteSession):
"""A persistent SSH connection to a remote Node.
- The connection is implemented with the Fabric Python library.
-
- Args:
- node_config: The configuration of the Node to connect to.
- session_name: The name of the session.
- logger: The logger used for logging.
- This should be passed from the parent OSSession.
+ The connection is implemented with
+ `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
Attributes:
session: The underlying Fabric SSH connection.
@@ -80,6 +77,7 @@ def _connect(self) -> None:
raise SSHConnectionError(self.hostname, errors)
def is_alive(self) -> bool:
+ """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
return self.session.is_connected
def _send_command(
@@ -89,7 +87,7 @@ def _send_command(
Args:
command: The command to execute.
- timeout: Wait at most this many seconds for the execution to complete.
+ timeout: Wait at most this long in seconds to execute the command.
env: Extra environment variables that will be used in command execution.
Raises:
@@ -118,6 +116,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
self.session.get(str(destination_file), str(source_file))
def copy_to(
@@ -125,6 +124,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
self.session.put(str(source_file), str(destination_file))
def _close(self, force: bool = False) -> None:
--
2.34.1
* Re: [PATCH v7 11/21] dts: remote session docstring update
2023-11-15 13:09 ` [PATCH v7 11/21] dts: remote session " Juraj Linkeš
@ 2023-11-21 15:36 ` Yoan Picchi
2023-11-22 11:13 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-21 15:36 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/remote_session/__init__.py | 39 +++++-
> .../remote_session/remote_session.py | 128 +++++++++++++-----
> dts/framework/remote_session/ssh_session.py | 16 +--
> 3 files changed, 135 insertions(+), 48 deletions(-)
>
> diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
> index 5e7ddb2b05..51a01d6b5e 100644
> --- a/dts/framework/remote_session/__init__.py
> +++ b/dts/framework/remote_session/__init__.py
> @@ -2,12 +2,14 @@
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2023 University of New Hampshire
>
> -"""
> -The package provides modules for managing remote connections to a remote host (node),
> -differentiated by OS.
> -The package provides a factory function, create_session, that returns the appropriate
> -remote connection based on the passed configuration. The differences are in the
> -underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
> +"""Remote interactive and non-interactive sessions.
> +
> +This package provides modules for managing remote connections to a remote host (node).
> +
> +The non-interactive sessions send commands and return their output and exit code.
> +
> +The interactive sessions open an interactive shell which is continuously open,
> +allowing it to send and receive data within that particular shell.
> """
>
> # pylama:ignore=W0611
> @@ -26,10 +28,35 @@
> def create_remote_session(
> node_config: NodeConfiguration, name: str, logger: DTSLOG
> ) -> RemoteSession:
> + """Factory for non-interactive remote sessions.
> +
> + The function returns an SSH session, but will be extended if support
> + for other protocols is added.
> +
> + Args:
> + node_config: The test run configuration of the node to connect to.
> + name: The name of the session.
> + logger: The logger instance this session will use.
> +
> + Returns:
> + The SSH remote session.
> + """
> return SSHSession(node_config, name, logger)
>
>
> def create_interactive_session(
> node_config: NodeConfiguration, logger: DTSLOG
> ) -> InteractiveRemoteSession:
> + """Factory for interactive remote sessions.
> +
> + The function returns an interactive SSH session, but will be extended if support
> + for other protocols is added.
> +
> + Args:
> + node_config: The test run configuration of the node to connect to.
> + logger: The logger instance this session will use.
> +
> + Returns:
> + The interactive SSH remote session.
> + """
> return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
> index 0647d93de4..629c2d7b9c 100644
> --- a/dts/framework/remote_session/remote_session.py
> +++ b/dts/framework/remote_session/remote_session.py
> @@ -3,6 +3,13 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022-2023 University of New Hampshire
>
> +"""Base remote session.
> +
> +This module contains the abstract base class for remote sessions and defines
> +the structure of the result of a command execution.
> +"""
> +
> +
> import dataclasses
> from abc import ABC, abstractmethod
> from pathlib import PurePath
> @@ -15,8 +22,14 @@
>
> @dataclasses.dataclass(slots=True, frozen=True)
> class CommandResult:
> - """
> - The result of remote execution of a command.
> + """The result of remote execution of a command.
> +
> + Attributes:
> + name: The name of the session that executed the command.
> + command: The executed command.
> + stdout: The standard output the command produced.
> + stderr: The standard error output the command produced.
> + return_code: The return code the command exited with.
> """
>
> name: str
> @@ -26,6 +39,7 @@ class CommandResult:
> return_code: int
>
> def __str__(self) -> str:
> + """Format the command outputs."""
> return (
> f"stdout: '{self.stdout}'\n"
> f"stderr: '{self.stderr}'\n"
> @@ -34,13 +48,24 @@ def __str__(self) -> str:
>
>
> class RemoteSession(ABC):
> - """
> - The base class for defining which methods must be implemented in order to connect
> - to a remote host (node) and maintain a remote session. The derived classes are
> - supposed to implement/use some underlying transport protocol (e.g. SSH) to
> - implement the methods. On top of that, it provides some basic services common to
> - all derived classes, such as keeping history and logging what's being executed
> - on the remote node.
> + """Non-interactive remote session.
> +
> + The abstract methods must be implemented in order to connect to a remote host (node)
> + and maintain a remote session.
> + The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
> + to implement the methods. On top of that, it provides some basic services common to all
> + subclasses, such as keeping history and logging what's being executed on the remote node.
> +
> + Attributes:
> + name: The name of the session.
> + hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
> + or a domain name.
> + ip: The IP address of the node or a domain name, whichever was used in `hostname`.
> + port: The port of the node, if given in `hostname`.
> + username: The username used in the connection.
> + password: The password used in the connection. Most frequently empty,
> + as the use of passwords is discouraged.
> + history: The executed commands during this session.
> """
>
> name: str
> @@ -59,6 +84,16 @@ def __init__(
> session_name: str,
> logger: DTSLOG,
> ):
> + """Connect to the node during initialization.
> +
> + Args:
> + node_config: The test run configuration of the node to connect to.
> + session_name: The name of the session.
> + logger: The logger instance this session will use.
> +
> + Raises:
> + SSHConnectionError: If the connection to the node was not successful.
> + """
> self._node_config = node_config
>
> self.name = session_name
> @@ -79,8 +114,13 @@ def __init__(
>
> @abstractmethod
> def _connect(self) -> None:
> - """
> - Create connection to assigned node.
> + """Create a connection to the node.
> +
> + The implementation must assign the established session to self.session.
> +
> + The implementation must except all exceptions and convert them to an SSHConnectionError.
> +
> + The implementation may optionally implement retry attempts.
> """
>
> def send_command(
> @@ -90,11 +130,24 @@ def send_command(
> verify: bool = False,
> env: dict | None = None,
> ) -> CommandResult:
> - """
> - Send a command to the connected node using optional env vars
> - and return CommandResult.
> - If verify is True, check the return code of the executed command
> - and raise a RemoteCommandExecutionError if the command failed.
> + """Send `command` to the connected node.
> +
> + The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
> + environment variable configure the timeout of command execution.
> +
> + Args:
> + command: The command to execute.
> + timeout: Wait at most this long in seconds to execute `command`.
> + verify: If :data:`True`, will check the exit code of `command`.
> + env: A dictionary with environment variables to be used with `command` execution.
> +
> + Raises:
> + SSHSessionDeadError: If the session isn't alive when sending `command`.
> + SSHTimeoutError: If `command` execution timed out.
> + RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
> +
> + Returns:
> + The output of the command along with the return code.
> """
> self._logger.info(
> f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else "")
> @@ -115,29 +168,36 @@ def send_command(
> def _send_command(
> self, command: str, timeout: float, env: dict | None
> ) -> CommandResult:
> - """
> - Use the underlying protocol to execute the command using optional env vars
> - and return CommandResult.
> + """Send a command to the connected node.
> +
> + The implementation must execute the command remotely with `env` environment variables
> + and return the result.
> +
> + The implementation must except all exceptions and raise an SSHSessionDeadError if
> + the session is not alive and an SSHTimeoutError if the command execution times out.
Three-way "and". Needs commas or splitting the sentence.
> """
>
> def close(self, force: bool = False) -> None:
> - """
> - Close the remote session and free all used resources.
> + """Close the remote session and free all used resources.
> +
> + Args:
> + force: Force the closure of the connection. This may not clean up all resources.
> """
> self._logger.logger_exit()
> self._close(force)
>
> @abstractmethod
> def _close(self, force: bool = False) -> None:
> - """
> - Execute protocol specific steps needed to close the session properly.
> + """Protocol specific steps needed to close the session properly.
> +
> + Args:
> + force: Force the closure of the connection. This may not clean up all resources.
> + This doesn't have to be implemented in the overloaded method.
> """
>
> @abstractmethod
> def is_alive(self) -> bool:
> - """
> - Check whether the remote session is still responding.
> - """
> + """Check whether the remote session is still responding."""
>
> @abstractmethod
> def copy_from(
> @@ -147,12 +207,12 @@ def copy_from(
> ) -> None:
> """Copy a file from the remote Node to the local filesystem.
>
> - Copy source_file from the remote Node associated with this remote
> - session to destination_file on the local filesystem.
> + Copy `source_file` from the remote Node associated with this remote session
> + to `destination_file` on the local filesystem.
>
> Args:
> - source_file: the file on the remote Node.
> - destination_file: a file or directory path on the local filesystem.
> + source_file: The file on the remote Node.
> + destination_file: A file or directory path on the local filesystem.
> """
>
> @abstractmethod
> @@ -163,10 +223,10 @@ def copy_to(
> ) -> None:
> """Copy a file from local filesystem to the remote Node.
>
> - Copy source_file from local filesystem to destination_file
> - on the remote Node associated with this remote session.
> + Copy `source_file` from local filesystem to `destination_file` on the remote Node
> + associated with this remote session.
>
> Args:
> - source_file: the file on the local filesystem.
> - destination_file: a file or directory path on the remote Node.
> + source_file: The file on the local filesystem.
> + destination_file: A file or directory path on the remote Node.
> """
> diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
> index cee11d14d6..7186490a9a 100644
> --- a/dts/framework/remote_session/ssh_session.py
> +++ b/dts/framework/remote_session/ssh_session.py
> @@ -1,6 +1,8 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> +"""SSH session remote session."""
Is the double "session" intended?
> +
> import socket
> import traceback
> from pathlib import PurePath
> @@ -26,13 +28,8 @@
> class SSHSession(RemoteSession):
> """A persistent SSH connection to a remote Node.
>
> - The connection is implemented with the Fabric Python library.
> -
> - Args:
> - node_config: The configuration of the Node to connect to.
> - session_name: The name of the session.
> - logger: The logger used for logging.
> - This should be passed from the parent OSSession.
> + The connection is implemented with
> + `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
>
> Attributes:
> session: The underlying Fabric SSH connection.
> @@ -80,6 +77,7 @@ def _connect(self) -> None:
> raise SSHConnectionError(self.hostname, errors)
>
> def is_alive(self) -> bool:
> + """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
> return self.session.is_connected
>
> def _send_command(
> @@ -89,7 +87,7 @@ def _send_command(
>
> Args:
> command: The command to execute.
> - timeout: Wait at most this many seconds for the execution to complete.
> + timeout: Wait at most this long in seconds to execute the command.
Is the timeout actually to start running the command and not to wait for
it to be completed?
> env: Extra environment variables that will be used in command execution.
>
> Raises:
> @@ -118,6 +116,7 @@ def copy_from(
> source_file: str | PurePath,
> destination_file: str | PurePath,
> ) -> None:
> + """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
> self.session.get(str(destination_file), str(source_file))
>
> def copy_to(
> @@ -125,6 +124,7 @@ def copy_to(
> source_file: str | PurePath,
> destination_file: str | PurePath,
> ) -> None:
> + """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
> self.session.put(str(source_file), str(destination_file))
>
> def _close(self, force: bool = False) -> None:
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 11/21] dts: remote session docstring update
2023-11-21 15:36 ` Yoan Picchi
@ 2023-11-22 11:13 ` Juraj Linkeš
2023-11-22 11:25 ` Yoan Picchi
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 11:13 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Tue, Nov 21, 2023 at 4:36 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/remote_session/__init__.py | 39 +++++-
> > .../remote_session/remote_session.py | 128 +++++++++++++-----
> > dts/framework/remote_session/ssh_session.py | 16 +--
> > 3 files changed, 135 insertions(+), 48 deletions(-)
> >
> > diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
> > index 5e7ddb2b05..51a01d6b5e 100644
> > --- a/dts/framework/remote_session/__init__.py
> > +++ b/dts/framework/remote_session/__init__.py
> > @@ -2,12 +2,14 @@
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2023 University of New Hampshire
> >
> > -"""
> > -The package provides modules for managing remote connections to a remote host (node),
> > -differentiated by OS.
> > -The package provides a factory function, create_session, that returns the appropriate
> > -remote connection based on the passed configuration. The differences are in the
> > -underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
> > +"""Remote interactive and non-interactive sessions.
> > +
> > +This package provides modules for managing remote connections to a remote host (node).
> > +
> > +The non-interactive sessions send commands and return their output and exit code.
> > +
> > +The interactive sessions open an interactive shell which is continuously open,
> > +allowing it to send and receive data within that particular shell.
> > """
> >
> > # pylama:ignore=W0611
> > @@ -26,10 +28,35 @@
> > def create_remote_session(
> > node_config: NodeConfiguration, name: str, logger: DTSLOG
> > ) -> RemoteSession:
> > + """Factory for non-interactive remote sessions.
> > +
> > + The function returns an SSH session, but will be extended if support
> > + for other protocols is added.
> > +
> > + Args:
> > + node_config: The test run configuration of the node to connect to.
> > + name: The name of the session.
> > + logger: The logger instance this session will use.
> > +
> > + Returns:
> > + The SSH remote session.
> > + """
> > return SSHSession(node_config, name, logger)
> >
> >
> > def create_interactive_session(
> > node_config: NodeConfiguration, logger: DTSLOG
> > ) -> InteractiveRemoteSession:
> > + """Factory for interactive remote sessions.
> > +
> > + The function returns an interactive SSH session, but will be extended if support
> > + for other protocols is added.
> > +
> > + Args:
> > + node_config: The test run configuration of the node to connect to.
> > + logger: The logger instance this session will use.
> > +
> > + Returns:
> > + The interactive SSH remote session.
> > + """
> > return InteractiveRemoteSession(node_config, logger)
> > diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
> > index 0647d93de4..629c2d7b9c 100644
> > --- a/dts/framework/remote_session/remote_session.py
> > +++ b/dts/framework/remote_session/remote_session.py
> > @@ -3,6 +3,13 @@
> > # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022-2023 University of New Hampshire
> >
> > +"""Base remote session.
> > +
> > +This module contains the abstract base class for remote sessions and defines
> > +the structure of the result of a command execution.
> > +"""
> > +
> > +
> > import dataclasses
> > from abc import ABC, abstractmethod
> > from pathlib import PurePath
> > @@ -15,8 +22,14 @@
> >
> > @dataclasses.dataclass(slots=True, frozen=True)
> > class CommandResult:
> > - """
> > - The result of remote execution of a command.
> > + """The result of remote execution of a command.
> > +
> > + Attributes:
> > + name: The name of the session that executed the command.
> > + command: The executed command.
> > + stdout: The standard output the command produced.
> > + stderr: The standard error output the command produced.
> > + return_code: The return code the command exited with.
> > """
> >
> > name: str
> > @@ -26,6 +39,7 @@ class CommandResult:
> > return_code: int
> >
> > def __str__(self) -> str:
> > + """Format the command outputs."""
> > return (
> > f"stdout: '{self.stdout}'\n"
> > f"stderr: '{self.stderr}'\n"
> > @@ -34,13 +48,24 @@ def __str__(self) -> str:
> >
> >
> > class RemoteSession(ABC):
> > - """
> > - The base class for defining which methods must be implemented in order to connect
> > - to a remote host (node) and maintain a remote session. The derived classes are
> > - supposed to implement/use some underlying transport protocol (e.g. SSH) to
> > - implement the methods. On top of that, it provides some basic services common to
> > - all derived classes, such as keeping history and logging what's being executed
> > - on the remote node.
> > + """Non-interactive remote session.
> > +
> > + The abstract methods must be implemented in order to connect to a remote host (node)
> > + and maintain a remote session.
> > + The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
> > + to implement the methods. On top of that, it provides some basic services common to all
> > + subclasses, such as keeping history and logging what's being executed on the remote node.
> > +
> > + Attributes:
> > + name: The name of the session.
> > + hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
> > + or a domain name.
> > + ip: The IP address of the node or a domain name, whichever was used in `hostname`.
> > + port: The port of the node, if given in `hostname`.
> > + username: The username used in the connection.
> > + password: The password used in the connection. Most frequently empty,
> > + as the use of passwords is discouraged.
> > + history: The executed commands during this session.
> > """
> >
> > name: str
> > @@ -59,6 +84,16 @@ def __init__(
> > session_name: str,
> > logger: DTSLOG,
> > ):
> > + """Connect to the node during initialization.
> > +
> > + Args:
> > + node_config: The test run configuration of the node to connect to.
> > + session_name: The name of the session.
> > + logger: The logger instance this session will use.
> > +
> > + Raises:
> > + SSHConnectionError: If the connection to the node was not successful.
> > + """
> > self._node_config = node_config
> >
> > self.name = session_name
> > @@ -79,8 +114,13 @@ def __init__(
> >
> > @abstractmethod
> > def _connect(self) -> None:
> > - """
> > - Create connection to assigned node.
> > + """Create a connection to the node.
> > +
> > + The implementation must assign the established session to self.session.
> > +
> > + The implementation must except all exceptions and convert them to an SSHConnectionError.
> > +
> > + The implementation may optionally implement retry attempts.
> > """
> >
> > def send_command(
> > @@ -90,11 +130,24 @@ def send_command(
> > verify: bool = False,
> > env: dict | None = None,
> > ) -> CommandResult:
> > - """
> > - Send a command to the connected node using optional env vars
> > - and return CommandResult.
> > - If verify is True, check the return code of the executed command
> > - and raise a RemoteCommandExecutionError if the command failed.
> > + """Send `command` to the connected node.
> > +
> > + The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
> > + environment variable configure the timeout of command execution.
> > +
> > + Args:
> > + command: The command to execute.
> > + timeout: Wait at most this long in seconds to execute `command`.
> > + verify: If :data:`True`, will check the exit code of `command`.
> > + env: A dictionary with environment variables to be used with `command` execution.
> > +
> > + Raises:
> > + SSHSessionDeadError: If the session isn't alive when sending `command`.
> > + SSHTimeoutError: If `command` execution timed out.
> > + RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
> > +
> > + Returns:
> > + The output of the command along with the return code.
> > """
> > self._logger.info(
> > f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else "")
> > @@ -115,29 +168,36 @@ def send_command(
> > def _send_command(
> > self, command: str, timeout: float, env: dict | None
> > ) -> CommandResult:
> > - """
> > - Use the underlying protocol to execute the command using optional env vars
> > - and return CommandResult.
> > + """Send a command to the connected node.
> > +
> > + The implementation must execute the command remotely with `env` environment variables
> > + and return the result.
> > +
> > + The implementation must except all exceptions and raise an SSHSessionDeadError if
> > + the session is not alive and an SSHTimeoutError if the command execution times out.
>
> Three-way "and". Needs commas or splitting the sentence.
>
What about this?
The implementation must except all exceptions and raise:
* SSHSessionDeadError if the session is not alive,
* SSHTimeoutError if the command execution times out.
> > """
> >
> > def close(self, force: bool = False) -> None:
> > - """
> > - Close the remote session and free all used resources.
> > + """Close the remote session and free all used resources.
> > +
> > + Args:
> > + force: Force the closure of the connection. This may not clean up all resources.
> > """
> > self._logger.logger_exit()
> > self._close(force)
> >
> > @abstractmethod
> > def _close(self, force: bool = False) -> None:
> > - """
> > - Execute protocol specific steps needed to close the session properly.
> > + """Protocol specific steps needed to close the session properly.
> > +
> > + Args:
> > + force: Force the closure of the connection. This may not clean up all resources.
> > + This doesn't have to be implemented in the overloaded method.
> > """
> >
> > @abstractmethod
> > def is_alive(self) -> bool:
> > - """
> > - Check whether the remote session is still responding.
> > - """
> > + """Check whether the remote session is still responding."""
> >
> > @abstractmethod
> > def copy_from(
> > @@ -147,12 +207,12 @@ def copy_from(
> > ) -> None:
> > """Copy a file from the remote Node to the local filesystem.
> >
> > - Copy source_file from the remote Node associated with this remote
> > - session to destination_file on the local filesystem.
> > + Copy `source_file` from the remote Node associated with this remote session
> > + to `destination_file` on the local filesystem.
> >
> > Args:
> > - source_file: the file on the remote Node.
> > - destination_file: a file or directory path on the local filesystem.
> > + source_file: The file on the remote Node.
> > + destination_file: A file or directory path on the local filesystem.
> > """
> >
> > @abstractmethod
> > @@ -163,10 +223,10 @@ def copy_to(
> > ) -> None:
> > """Copy a file from local filesystem to the remote Node.
> >
> > - Copy source_file from local filesystem to destination_file
> > - on the remote Node associated with this remote session.
> > + Copy `source_file` from local filesystem to `destination_file` on the remote Node
> > + associated with this remote session.
> >
> > Args:
> > - source_file: the file on the local filesystem.
> > - destination_file: a file or directory path on the remote Node.
> > + source_file: The file on the local filesystem.
> > + destination_file: A file or directory path on the remote Node.
> > """
> > diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
> > index cee11d14d6..7186490a9a 100644
> > --- a/dts/framework/remote_session/ssh_session.py
> > +++ b/dts/framework/remote_session/ssh_session.py
> > @@ -1,6 +1,8 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >
> > +"""SSH session remote session."""
>
> Is the double "session" intended?
>
Not really, I'll remove the first occurrence.
> > +
> > import socket
> > import traceback
> > from pathlib import PurePath
> > @@ -26,13 +28,8 @@
> > class SSHSession(RemoteSession):
> > """A persistent SSH connection to a remote Node.
> >
> > - The connection is implemented with the Fabric Python library.
> > -
> > - Args:
> > - node_config: The configuration of the Node to connect to.
> > - session_name: The name of the session.
> > - logger: The logger used for logging.
> > - This should be passed from the parent OSSession.
> > + The connection is implemented with
> > + `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
> >
> > Attributes:
> > session: The underlying Fabric SSH connection.
> > @@ -80,6 +77,7 @@ def _connect(self) -> None:
> > raise SSHConnectionError(self.hostname, errors)
> >
> > def is_alive(self) -> bool:
> > + """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
> > return self.session.is_connected
> >
> > def _send_command(
> > @@ -89,7 +87,7 @@ def _send_command(
> >
> > Args:
> > command: The command to execute.
> > - timeout: Wait at most this many seconds for the execution to complete.
> > + timeout: Wait at most this long in seconds to execute the command.
>
> Is the timeout actually to start running the command and not to wait for
> it to be completed?
>
It is to wait for it to be completed. The wording is a bit confusing,
what about:
Wait at most this long in seconds for the command execution to complete.
I'll change this in all places where timeout is documented.
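For reference, the agreed semantics ("wait at most this long for the command execution to complete") match how the standard library treats a timeout. This small sketch, using `subprocess` rather than the DTS session classes, illustrates that the clock bounds completion, not command start:

```python
import subprocess

def run_with_timeout(command: str, timeout: float) -> str:
    # subprocess.run raises TimeoutExpired when the command does not
    # finish within `timeout` seconds, which is the documented meaning
    # of `timeout` agreed on in this thread.
    result = subprocess.run(
        command, shell=True, capture_output=True, text=True, timeout=timeout
    )
    return result.stdout
```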
> > env: Extra environment variables that will be used in command execution.
> >
> > Raises:
> > @@ -118,6 +116,7 @@ def copy_from(
> > source_file: str | PurePath,
> > destination_file: str | PurePath,
> > ) -> None:
> > + """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
> > self.session.get(str(destination_file), str(source_file))
> >
> > def copy_to(
> > @@ -125,6 +124,7 @@ def copy_to(
> > source_file: str | PurePath,
> > destination_file: str | PurePath,
> > ) -> None:
> > + """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
> > self.session.put(str(source_file), str(destination_file))
> >
> > def _close(self, force: bool = False) -> None:
>
* Re: [PATCH v7 11/21] dts: remote session docstring update
2023-11-22 11:13 ` Juraj Linkeš
@ 2023-11-22 11:25 ` Yoan Picchi
0 siblings, 0 replies; 393+ messages in thread
From: Yoan Picchi @ 2023-11-22 11:25 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On 11/22/23 11:13, Juraj Linkeš wrote:
> On Tue, Nov 21, 2023 at 4:36 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>>
>> On 11/15/23 13:09, Juraj Linkeš wrote:
>>> Format according to the Google format and PEP257, with slight
>>> deviations.
>>>
>>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>>> ---
>>> dts/framework/remote_session/__init__.py | 39 +++++-
>>> .../remote_session/remote_session.py | 128 +++++++++++++-----
>>> dts/framework/remote_session/ssh_session.py | 16 +--
>>> 3 files changed, 135 insertions(+), 48 deletions(-)
>>>
>>> diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
>>> index 5e7ddb2b05..51a01d6b5e 100644
>>> --- a/dts/framework/remote_session/__init__.py
>>> +++ b/dts/framework/remote_session/__init__.py
>>> @@ -2,12 +2,14 @@
>>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>>> # Copyright(c) 2023 University of New Hampshire
>>>
>>> -"""
>>> -The package provides modules for managing remote connections to a remote host (node),
>>> -differentiated by OS.
>>> -The package provides a factory function, create_session, that returns the appropriate
>>> -remote connection based on the passed configuration. The differences are in the
>>> -underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
>>> +"""Remote interactive and non-interactive sessions.
>>> +
>>> +This package provides modules for managing remote connections to a remote host (node).
>>> +
>>> +The non-interactive sessions send commands and return their output and exit code.
>>> +
>>> +The interactive sessions open an interactive shell which is continuously open,
>>> +allowing it to send and receive data within that particular shell.
>>> """
>>>
>>> # pylama:ignore=W0611
>>> @@ -26,10 +28,35 @@
>>> def create_remote_session(
>>> node_config: NodeConfiguration, name: str, logger: DTSLOG
>>> ) -> RemoteSession:
>>> + """Factory for non-interactive remote sessions.
>>> +
>>> + The function returns an SSH session, but will be extended if support
>>> + for other protocols is added.
>>> +
>>> + Args:
>>> + node_config: The test run configuration of the node to connect to.
>>> + name: The name of the session.
>>> + logger: The logger instance this session will use.
>>> +
>>> + Returns:
>>> + The SSH remote session.
>>> + """
>>> return SSHSession(node_config, name, logger)
>>>
>>>
>>> def create_interactive_session(
>>> node_config: NodeConfiguration, logger: DTSLOG
>>> ) -> InteractiveRemoteSession:
>>> + """Factory for interactive remote sessions.
>>> +
>>> + The function returns an interactive SSH session, but will be extended if support
>>> + for other protocols is added.
>>> +
>>> + Args:
>>> + node_config: The test run configuration of the node to connect to.
>>> + logger: The logger instance this session will use.
>>> +
>>> + Returns:
>>> + The interactive SSH remote session.
>>> + """
>>> return InteractiveRemoteSession(node_config, logger)
>>> diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
>>> index 0647d93de4..629c2d7b9c 100644
>>> --- a/dts/framework/remote_session/remote_session.py
>>> +++ b/dts/framework/remote_session/remote_session.py
>>> @@ -3,6 +3,13 @@
>>> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>>> # Copyright(c) 2022-2023 University of New Hampshire
>>>
>>> +"""Base remote session.
>>> +
>>> +This module contains the abstract base class for remote sessions and defines
>>> +the structure of the result of a command execution.
>>> +"""
>>> +
>>> +
>>> import dataclasses
>>> from abc import ABC, abstractmethod
>>> from pathlib import PurePath
>>> @@ -15,8 +22,14 @@
>>>
>>> @dataclasses.dataclass(slots=True, frozen=True)
>>> class CommandResult:
>>> - """
>>> - The result of remote execution of a command.
>>> + """The result of remote execution of a command.
>>> +
>>> + Attributes:
>>> + name: The name of the session that executed the command.
>>> + command: The executed command.
>>> + stdout: The standard output the command produced.
>>> + stderr: The standard error output the command produced.
>>> + return_code: The return code the command exited with.
>>> """
>>>
>>> name: str
>>> @@ -26,6 +39,7 @@ class CommandResult:
>>> return_code: int
>>>
>>> def __str__(self) -> str:
>>> + """Format the command outputs."""
>>> return (
>>> f"stdout: '{self.stdout}'\n"
>>> f"stderr: '{self.stderr}'\n"
>>> @@ -34,13 +48,24 @@ def __str__(self) -> str:
>>>
>>>
>>> class RemoteSession(ABC):
>>> - """
>>> - The base class for defining which methods must be implemented in order to connect
>>> - to a remote host (node) and maintain a remote session. The derived classes are
>>> - supposed to implement/use some underlying transport protocol (e.g. SSH) to
>>> - implement the methods. On top of that, it provides some basic services common to
>>> - all derived classes, such as keeping history and logging what's being executed
>>> - on the remote node.
>>> + """Non-interactive remote session.
>>> +
>>> + The abstract methods must be implemented in order to connect to a remote host (node)
>>> + and maintain a remote session.
>>> + The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
>>> + to implement the methods. On top of that, it provides some basic services common to all
>>> + subclasses, such as keeping history and logging what's being executed on the remote node.
>>> +
>>> + Attributes:
>>> + name: The name of the session.
>>> + hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
>>> + or a domain name.
>>> + ip: The IP address of the node or a domain name, whichever was used in `hostname`.
>>> + port: The port of the node, if given in `hostname`.
>>> + username: The username used in the connection.
>>> + password: The password used in the connection. Most frequently empty,
>>> + as the use of passwords is discouraged.
>>> + history: The executed commands during this session.
>>> """
>>>
>>> name: str
>>> @@ -59,6 +84,16 @@ def __init__(
>>> session_name: str,
>>> logger: DTSLOG,
>>> ):
>>> + """Connect to the node during initialization.
>>> +
>>> + Args:
>>> + node_config: The test run configuration of the node to connect to.
>>> + session_name: The name of the session.
>>> + logger: The logger instance this session will use.
>>> +
>>> + Raises:
>>> + SSHConnectionError: If the connection to the node was not successful.
>>> + """
>>> self._node_config = node_config
>>>
>>> self.name = session_name
>>> @@ -79,8 +114,13 @@ def __init__(
>>>
>>> @abstractmethod
>>> def _connect(self) -> None:
>>> - """
>>> - Create connection to assigned node.
>>> + """Create a connection to the node.
>>> +
>>> + The implementation must assign the established session to self.session.
>>> +
>>> + The implementation must except all exceptions and convert them to an SSHConnectionError.
>>> +
>>> + The implementation may optionally implement retry attempts.
>>> """
>>>
>>> def send_command(
>>> @@ -90,11 +130,24 @@ def send_command(
>>> verify: bool = False,
>>> env: dict | None = None,
>>> ) -> CommandResult:
>>> - """
>>> - Send a command to the connected node using optional env vars
>>> - and return CommandResult.
>>> - If verify is True, check the return code of the executed command
>>> - and raise a RemoteCommandExecutionError if the command failed.
>>> + """Send `command` to the connected node.
>>> +
>>> + The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
>>> + environment variable configure the timeout of command execution.
>>> +
>>> + Args:
>>> + command: The command to execute.
>>> + timeout: Wait at most this long in seconds to execute `command`.
>>> + verify: If :data:`True`, will check the exit code of `command`.
>>> + env: A dictionary with environment variables to be used with `command` execution.
>>> +
>>> + Raises:
>>> + SSHSessionDeadError: If the session isn't alive when sending `command`.
>>> + SSHTimeoutError: If `command` execution timed out.
>>> + RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
>>> +
>>> + Returns:
>>> + The output of the command along with the return code.
>>> """
>>> self._logger.info(
>>> f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else "")
>>> @@ -115,29 +168,36 @@ def send_command(
>>> def _send_command(
>>> self, command: str, timeout: float, env: dict | None
>>> ) -> CommandResult:
>>> - """
>>> - Use the underlying protocol to execute the command using optional env vars
>>> - and return CommandResult.
>>> + """Send a command to the connected node.
>>> +
>>> + The implementation must execute the command remotely with `env` environment variables
>>> + and return the result.
>>> +
>>> + The implementation must except all exceptions and raise an SSHSessionDeadError if
>>> + the session is not alive and an SSHTimeoutError if the command execution times out.
>>
>> Three-way "and". Needs commas or splitting the sentence.
>>
>
> What about this?
>
> The implementation must except all exceptions and raise:
>
> * SSHSessionDeadError if the session is not alive,
> * SSHTimeoutError if the command execution times out.
>
Sounds good.
>
>>> """
>>>
>>> def close(self, force: bool = False) -> None:
>>> - """
>>> - Close the remote session and free all used resources.
>>> + """Close the remote session and free all used resources.
>>> +
>>> + Args:
>>> + force: Force the closure of the connection. This may not clean up all resources.
>>> """
>>> self._logger.logger_exit()
>>> self._close(force)
>>>
>>> @abstractmethod
>>> def _close(self, force: bool = False) -> None:
>>> - """
>>> - Execute protocol specific steps needed to close the session properly.
>>> + """Protocol specific steps needed to close the session properly.
>>> +
>>> + Args:
>>> + force: Force the closure of the connection. This may not clean up all resources.
>>> + This doesn't have to be implemented in the overloaded method.
>>> """
>>>
>>> @abstractmethod
>>> def is_alive(self) -> bool:
>>> - """
>>> - Check whether the remote session is still responding.
>>> - """
>>> + """Check whether the remote session is still responding."""
>>>
>>> @abstractmethod
>>> def copy_from(
>>> @@ -147,12 +207,12 @@ def copy_from(
>>> ) -> None:
>>> """Copy a file from the remote Node to the local filesystem.
>>>
>>> - Copy source_file from the remote Node associated with this remote
>>> - session to destination_file on the local filesystem.
>>> + Copy `source_file` from the remote Node associated with this remote session
>>> + to `destination_file` on the local filesystem.
>>>
>>> Args:
>>> - source_file: the file on the remote Node.
>>> - destination_file: a file or directory path on the local filesystem.
>>> + source_file: The file on the remote Node.
>>> + destination_file: A file or directory path on the local filesystem.
>>> """
>>>
>>> @abstractmethod
>>> @@ -163,10 +223,10 @@ def copy_to(
>>> ) -> None:
>>> """Copy a file from local filesystem to the remote Node.
>>>
>>> - Copy source_file from local filesystem to destination_file
>>> - on the remote Node associated with this remote session.
>>> + Copy `source_file` from local filesystem to `destination_file` on the remote Node
>>> + associated with this remote session.
>>>
>>> Args:
>>> - source_file: the file on the local filesystem.
>>> - destination_file: a file or directory path on the remote Node.
>>> + source_file: The file on the local filesystem.
>>> + destination_file: A file or directory path on the remote Node.
>>> """
>>> diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
>>> index cee11d14d6..7186490a9a 100644
>>> --- a/dts/framework/remote_session/ssh_session.py
>>> +++ b/dts/framework/remote_session/ssh_session.py
>>> @@ -1,6 +1,8 @@
>>> # SPDX-License-Identifier: BSD-3-Clause
>>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>>>
>>> +"""SSH session remote session."""
>>
>> Is the double "session" intended?
>>
>
> Not really, I'll remove the first occurrence.
>
>>> +
>>> import socket
>>> import traceback
>>> from pathlib import PurePath
>>> @@ -26,13 +28,8 @@
>>> class SSHSession(RemoteSession):
>>> """A persistent SSH connection to a remote Node.
>>>
>>> - The connection is implemented with the Fabric Python library.
>>> -
>>> - Args:
>>> - node_config: The configuration of the Node to connect to.
>>> - session_name: The name of the session.
>>> - logger: The logger used for logging.
>>> - This should be passed from the parent OSSession.
>>> + The connection is implemented with
>>> + `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
>>>
>>> Attributes:
>>> session: The underlying Fabric SSH connection.
>>> @@ -80,6 +77,7 @@ def _connect(self) -> None:
>>> raise SSHConnectionError(self.hostname, errors)
>>>
>>> def is_alive(self) -> bool:
>>> + """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
>>> return self.session.is_connected
>>>
>>> def _send_command(
>>> @@ -89,7 +87,7 @@ def _send_command(
>>>
>>> Args:
>>> command: The command to execute.
>>> - timeout: Wait at most this many seconds for the execution to complete.
>>> + timeout: Wait at most this long in seconds to execute the command.
>>
>> Is the timeout actually to start running the command and not to wait for
>> it to be completed?
>>
>
> It is to wait for it to be completed. The wording is a bit confusing,
> what about:
>
> Wait at most this long in seconds for the command execution to complete.
>
> I'll change this in all places where timeout is documented.
Sounds good. I think I saw this confusing wording 3 times so far, but I
can't quite remember which files they were in.
>
>>> env: Extra environment variables that will be used in command execution.
>>>
>>> Raises:
>>> @@ -118,6 +116,7 @@ def copy_from(
>>> source_file: str | PurePath,
>>> destination_file: str | PurePath,
>>> ) -> None:
>>> + """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
>>> self.session.get(str(destination_file), str(source_file))
>>>
>>> def copy_to(
>>> @@ -125,6 +124,7 @@ def copy_to(
>>> source_file: str | PurePath,
>>> destination_file: str | PurePath,
>>> ) -> None:
>>> + """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
>>> self.session.put(str(source_file), str(destination_file))
>>>
>>> def _close(self, force: bool = False) -> None:
>>
^ permalink raw reply [flat|nested] 393+ messages in thread
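The abstract-method-plus-short-``Overrides :meth:`…``` docstring convention applied in this patch can be sketched in plain Python. The ``FakeSession`` class below is hypothetical and only records requested transfers; the real sessions delegate to Fabric's ``get``/``put``:

```python
from abc import ABC, abstractmethod


class RemoteSession(ABC):
    """A minimal sketch of the session interface from the patch (details omitted)."""

    @abstractmethod
    def copy_from(self, source_file: str, destination_file: str) -> None:
        """Copy `source_file` from the remote Node to `destination_file` locally."""


class FakeSession(RemoteSession):
    """Hypothetical concrete session that only records the requested transfers."""

    def __init__(self) -> None:
        self.transfers: list[tuple[str, str]] = []

    def copy_from(self, source_file: str, destination_file: str) -> None:
        """Overrides :meth:`RemoteSession.copy_from`."""
        self.transfers.append((source_file, destination_file))


session = FakeSession()
session.copy_from("/remote/dpdk.log", "/tmp/dpdk.log")
print(session.transfers)  # [('/remote/dpdk.log', '/tmp/dpdk.log')]
```

The one-line "Overrides" docstring keeps the generated docs free of duplicated descriptions while still cross-linking to the full contract on the base class.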
* [PATCH v7 12/21] dts: interactive remote session docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (10 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 11/21] dts: remote session " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 13/21] dts: port and virtual device " Juraj Linkeš
` (9 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../interactive_remote_session.py | 36 +++----
.../remote_session/interactive_shell.py | 99 +++++++++++--------
dts/framework/remote_session/python_shell.py | 26 ++++-
dts/framework/remote_session/testpmd_shell.py | 61 +++++++++---
4 files changed, 150 insertions(+), 72 deletions(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 9085a668e8..c1bf30ac61 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -22,27 +22,23 @@
class InteractiveRemoteSession:
"""SSH connection dedicated to interactive applications.
- This connection is created using paramiko and is a persistent connection to the
- host. This class defines methods for connecting to the node and configures this
- connection to send "keep alive" packets every 30 seconds. Because paramiko attempts
- to use SSH keys to establish a connection first, providing a password is optional.
- This session is utilized by InteractiveShells and cannot be interacted with
- directly.
-
- Arguments:
- node_config: Configuration class for the node you are connecting to.
- _logger: Desired logger for this session to use.
+ The connection is created using `paramiko <https://docs.paramiko.org/en/latest/>`_
+ and is a persistent connection to the host. This class defines the methods for connecting
+ to the node and configures the connection to send "keep alive" packets every 30 seconds.
+ Because paramiko attempts to use SSH keys to establish a connection first, providing
+ a password is optional. This session is utilized by InteractiveShells
+ and cannot be interacted with directly.
Attributes:
- hostname: Hostname that will be used to initialize a connection to the node.
- ip: A subsection of hostname that removes the port for the connection if there
+ hostname: The hostname that will be used to initialize a connection to the node.
+ ip: A subsection of `hostname` that removes the port for the connection if there
is one. If there is no port, this will be the same as hostname.
- port: Port to use for the ssh connection. This will be extracted from the
- hostname if there is a port included, otherwise it will default to 22.
+ port: Port to use for the ssh connection. This will be extracted from `hostname`
+ if there is a port included, otherwise it will default to ``22``.
username: User to connect to the node with.
password: Password of the user connecting to the host. This will default to an
empty string if a password is not provided.
- session: Underlying paramiko connection.
+ session: The underlying paramiko connection.
Raises:
SSHConnectionError: There is an error creating the SSH connection.
@@ -58,9 +54,15 @@ class InteractiveRemoteSession:
_node_config: NodeConfiguration
_transport: Transport | None
- def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
+ def __init__(self, node_config: NodeConfiguration, logger: DTSLOG) -> None:
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+ """
self._node_config = node_config
- self._logger = _logger
+ self._logger = logger
self.hostname = node_config.hostname
self.username = node_config.user
self.password = node_config.password if node_config.password else ""
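The `hostname`/`ip`/`port` attributes documented above follow a simple split rule: the port is extracted from `hostname` when present, otherwise it defaults to 22. A standalone sketch of that rule (the helper name is hypothetical; the real class stores these as attributes in ``__init__``, and IPv6 literals would need extra handling):

```python
def split_host_port(hostname: str, default_port: int = 22) -> tuple[str, int]:
    """Derive the `ip` and `port` values from a `hostname` string (sketch)."""
    if ":" in hostname:
        # Split on the last colon so only the trailing port is removed.
        ip, port_str = hostname.rsplit(":", 1)
        return ip, int(port_str)
    return hostname, default_port


print(split_host_port("sut1.example.com:2222"))  # ('sut1.example.com', 2222)
print(split_host_port("sut1.example.com"))       # ('sut1.example.com', 22)
```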
diff --git a/dts/framework/remote_session/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
index c24376b2a8..a98a822e91 100644
--- a/dts/framework/remote_session/interactive_shell.py
+++ b/dts/framework/remote_session/interactive_shell.py
@@ -3,18 +3,20 @@
"""Common functionality for interactive shell handling.
-This base class, InteractiveShell, is meant to be extended by other classes that
-contain functionality specific to that shell type. These derived classes will often
-modify things like the prompt to expect or the arguments to pass into the application,
-but still utilize the same method for sending a command and collecting output. How
-this output is handled however is often application specific. If an application needs
-elevated privileges to start it is expected that the method for gaining those
-privileges is provided when initializing the class.
+The base class, :class:`InteractiveShell`, is meant to be extended by subclasses that contain
+functionality specific to that shell type. These subclasses will often modify things like
+the prompt to expect or the arguments to pass into the application, but still utilize
+the same method for sending a command and collecting output. How this output is handled however
+is often application specific. If an application needs elevated privileges to start it is expected
+that the method for gaining those privileges is provided when initializing the class.
+
+The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+environment variable configure the timeout of getting the output from command execution.
"""
from abc import ABC
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from paramiko import Channel, SSHClient, channel # type: ignore[import]
@@ -30,28 +32,6 @@ class InteractiveShell(ABC):
and collecting input until reaching a certain prompt. All interactive applications
will use the same SSH connection, but each will create their own channel on that
session.
-
- Arguments:
- interactive_session: The SSH session dedicated to interactive shells.
- logger: Logger used for displaying information in the console.
- get_privileged_command: Method for modifying a command to allow it to use
- elevated privileges. If this is None, the application will not be started
- with elevated privileges.
- app_args: Command line arguments to be passed to the application on startup.
- timeout: Timeout used for the SSH channel that is dedicated to this interactive
- shell. This timeout is for collecting output, so if reading from the buffer
- and no output is gathered within the timeout, an exception is thrown.
-
- Attributes
- _default_prompt: Prompt to expect at the end of output when sending a command.
- This is often overridden by derived classes.
- _command_extra_chars: Extra characters to add to the end of every command
- before sending them. This is often overridden by derived classes and is
- most commonly an additional newline character.
- path: Path to the executable to start the interactive application.
- dpdk_app: Whether this application is a DPDK app. If it is, the build
- directory for DPDK on the node will be prepended to the path to the
- executable.
"""
_interactive_session: SSHClient
@@ -61,10 +41,22 @@ class InteractiveShell(ABC):
_logger: DTSLOG
_timeout: float
_app_args: str
- _default_prompt: str = ""
- _command_extra_chars: str = ""
- path: PurePath
- dpdk_app: bool = False
+
+ #: Prompt to expect at the end of output when sending a command.
+ #: This is often overridden by subclasses.
+ _default_prompt: ClassVar[str] = ""
+
+ #: Extra characters to add to the end of every command
+ #: before sending them. This is often overridden by subclasses and is
+ #: most commonly an additional newline character.
+ _command_extra_chars: ClassVar[str] = ""
+
+ #: Path to the executable to start the interactive application.
+ path: ClassVar[PurePath]
+
+ #: Whether this application is a DPDK app. If it is, the build directory
+ #: for DPDK on the node will be prepended to the path to the executable.
+ dpdk_app: ClassVar[bool] = False
def __init__(
self,
@@ -74,6 +66,19 @@ def __init__(
app_args: str = "",
timeout: float = SETTINGS.timeout,
) -> None:
+ """Create an SSH channel during initialization.
+
+ Args:
+ interactive_session: The SSH session dedicated to interactive shells.
+ logger: The logger instance this session will use.
+ get_privileged_command: A method for modifying a command to allow it to use
+ elevated privileges. If :data:`None`, the application will not be started
+ with elevated privileges.
+ app_args: The command line arguments to be passed to the application on startup.
+ timeout: The timeout used for the SSH channel that is dedicated to this interactive
+ shell. This timeout is for collecting output, so if reading from the buffer
+ and no output is gathered within the timeout, an exception is thrown.
+ """
self._interactive_session = interactive_session
self._ssh_channel = self._interactive_session.invoke_shell()
self._stdin = self._ssh_channel.makefile_stdin("w")
@@ -92,6 +97,10 @@ def _start_application(
This method is often overridden by subclasses as their process for
starting may look different.
+
+ Args:
+ get_privileged_command: A function (but could be any callable) that produces
+ the version of the command with elevated privileges.
"""
start_command = f"{self.path} {self._app_args}"
if get_privileged_command is not None:
@@ -99,16 +108,24 @@ def _start_application(
self.send_command(start_command)
def send_command(self, command: str, prompt: str | None = None) -> str:
- """Send a command and get all output before the expected ending string.
+ """Send `command` and get all output before the expected ending string.
Lines that expect input are not included in the stdout buffer, so they cannot
- be used for expect. For example, if you were prompted to log into something
- with a username and password, you cannot expect "username:" because it won't
- yet be in the stdout buffer. A workaround for this could be consuming an
- extra newline character to force the current prompt into the stdout buffer.
+ be used for expect.
+
+ Example:
+ If you were prompted to log into something with a username and password,
+ you cannot expect ``username:`` because it won't yet be in the stdout buffer.
+ A workaround for this could be consuming an extra newline character to force
+ the current `prompt` into the stdout buffer.
+
+ Args:
+ command: The command to send.
+ prompt: After sending the command, `send_command` will be expecting this string.
+ If :data:`None`, will use the class's default prompt.
Returns:
- All output in the buffer before expected string
+ All output in the buffer before expected string.
"""
self._logger.info(f"Sending: '{command}'")
if prompt is None:
@@ -126,8 +143,10 @@ def send_command(self, command: str, prompt: str | None = None) -> str:
return out
def close(self) -> None:
+ """Properly free all resources."""
self._stdin.close()
self._ssh_channel.close()
def __del__(self) -> None:
+ """Make sure the session is properly closed before deleting the object."""
self.close()
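The move from plain class attributes to ``ClassVar`` in this hunk matters for the generated docs: ``ClassVar`` marks the attribute as class-level rather than per-instance, and Sphinx picks up the ``#:`` comments as attribute documentation. A reduced sketch of the pattern (class names are illustrative, not the real framework classes):

```python
from typing import ClassVar


class InteractiveShellSketch:
    """Sketch of the ClassVar-based defaults introduced by the patch."""

    #: Prompt to expect at the end of output; subclasses typically override this.
    _default_prompt: ClassVar[str] = ""

    #: Extra characters appended to every command, commonly a newline.
    _command_extra_chars: ClassVar[str] = ""


class PythonShellSketch(InteractiveShellSketch):
    _default_prompt: ClassVar[str] = ">>>"
    _command_extra_chars: ClassVar[str] = "\n"


print(PythonShellSketch._default_prompt)  # >>>
```

Subclasses override the class-level default without any per-instance state, which is exactly how `PythonShell` and `TestPmdShell` customize the base shell later in this series.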
diff --git a/dts/framework/remote_session/python_shell.py b/dts/framework/remote_session/python_shell.py
index cc3ad48a68..ccfd3783e8 100644
--- a/dts/framework/remote_session/python_shell.py
+++ b/dts/framework/remote_session/python_shell.py
@@ -1,12 +1,32 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Python interactive shell.
+
+Typical usage example in a TestSuite::
+
+ from framework.remote_session import PythonShell
+ python_shell = self.tg_node.create_interactive_shell(
+ PythonShell, timeout=5, privileged=True
+ )
+ python_shell.send_command("print('Hello World')")
+ python_shell.close()
+"""
+
from pathlib import PurePath
+from typing import ClassVar
from .interactive_shell import InteractiveShell
class PythonShell(InteractiveShell):
- _default_prompt: str = ">>>"
- _command_extra_chars: str = "\n"
- path: PurePath = PurePath("python3")
+ """Python interactive shell."""
+
+ #: Python's prompt.
+ _default_prompt: ClassVar[str] = ">>>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
+
+ #: The Python executable.
+ path: ClassVar[PurePath] = PurePath("python3")
diff --git a/dts/framework/remote_session/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
index 1455b5a199..2632515d74 100644
--- a/dts/framework/remote_session/testpmd_shell.py
+++ b/dts/framework/remote_session/testpmd_shell.py
@@ -1,45 +1,82 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Testpmd interactive shell.
+
+Typical usage example in a TestSuite::
+
+ testpmd_shell = self.sut_node.create_interactive_shell(
+ TestPmdShell, privileged=True
+ )
+ devices = testpmd_shell.get_devices()
+ for device in devices:
+ print(device)
+ testpmd_shell.close()
+"""
+
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from .interactive_shell import InteractiveShell
class TestPmdDevice(object):
+ """The data of a device that testpmd can recognize.
+
+ Attributes:
+ pci_address: The PCI address of the device.
+ """
+
pci_address: str
def __init__(self, pci_address_line: str):
+ """Initialize the device from the testpmd output line string.
+
+ Args:
+ pci_address_line: A line of testpmd output that contains a device.
+ """
self.pci_address = pci_address_line.strip().split(": ")[1].strip()
def __str__(self) -> str:
+ """The PCI address captures what the device is."""
return self.pci_address
class TestPmdShell(InteractiveShell):
- path: PurePath = PurePath("app", "dpdk-testpmd")
- dpdk_app: bool = True
- _default_prompt: str = "testpmd>"
- _command_extra_chars: str = (
- "\n" # We want to append an extra newline to every command
- )
+ """Testpmd interactive shell.
+
+ Users of the testpmd shell should never use
+ the :meth:`~framework.remote_session.interactive_shell.InteractiveShell.send_command` method
+ directly, but should instead call specialized methods. If there isn't one that satisfies a need,
+ it should be added.
+ """
+
+ #: The path to the testpmd executable.
+ path: ClassVar[PurePath] = PurePath("app", "dpdk-testpmd")
+
+ #: Flag this as a DPDK app so that it's clear this is not a system app and
+ #: needs to be looked for in a specific path.
+ dpdk_app: ClassVar[bool] = True
+
+ #: The testpmd's prompt.
+ _default_prompt: ClassVar[str] = "testpmd>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
def _start_application(
self, get_privileged_command: Callable[[str], str] | None
) -> None:
- """See "_start_application" in InteractiveShell."""
self._app_args += " -- -i"
super()._start_application(get_privileged_command)
def get_devices(self) -> list[TestPmdDevice]:
- """Get a list of device names that are known to testpmd
+ """Get a list of device names that are known to testpmd.
- Uses the device info listed in testpmd and then parses the output to
- return only the names of the devices.
+ Uses the device info listed in testpmd and then parses the output.
Returns:
- A list of strings representing device names (e.g. 0000:14:00.1)
+ A list of devices.
"""
dev_info: str = self.send_command("show device info all")
dev_list: list[TestPmdDevice] = []
--
2.34.1
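The one-liner in ``TestPmdDevice.__init__`` that this patch documents can be exercised in isolation. The sketch below mirrors the parsing logic; the sample input line is illustrative, not verbatim testpmd output:

```python
class TestPmdDeviceSketch:
    """Sketch of the parsing in TestPmdDevice.__init__ from the patch."""

    def __init__(self, pci_address_line: str) -> None:
        # Keep the text after the first ": " separator as the PCI address.
        self.pci_address = pci_address_line.strip().split(": ")[1].strip()

    def __str__(self) -> str:
        return self.pci_address


dev = TestPmdDeviceSketch("Device name: 0000:14:00.1")
print(dev)  # 0000:14:00.1
```

Note that the split is on ``": "`` (colon plus space), so the colons inside the PCI address itself are left untouched.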
* [PATCH v7 13/21] dts: port and virtual device docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (11 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 12/21] dts: interactive " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 14/21] dts: cpu " Juraj Linkeš
` (8 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/__init__.py | 16 ++++--
dts/framework/testbed_model/port.py | 53 +++++++++++++++----
dts/framework/testbed_model/virtual_device.py | 17 +++++-
3 files changed, 71 insertions(+), 15 deletions(-)
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 8ced05653b..a02be1f2d9 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -2,9 +2,19 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-This package contains the classes used to model the physical traffic generator,
-system under test and any other components that need to be interacted with.
+"""Testbed modelling.
+
+This package defines the testbed elements DTS works with:
+
+ * A system under test node: :class:`SutNode`,
+ * A traffic generator node: :class:`TGNode`,
+ * The ports of network interface cards (NICs) present on nodes: :class:`Port`,
+ * The logical cores of CPUs present on nodes: :class:`LogicalCore`,
+ * The virtual devices that can be created on nodes: :class:`VirtualDevice`,
+ * The operating systems running on nodes: :class:`LinuxSession` and :class:`PosixSession`.
+
+DTS needs to be able to connect to nodes and understand some of the hardware present on these nodes
+to properly build and test DPDK.
"""
# pylama:ignore=W0611
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 680c29bfe3..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -2,6 +2,13 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""NIC port model.
+
+Basic port information, such as location (the ports are identified by their PCI address on a node),
+drivers and addresses.
+"""
+
+
from dataclasses import dataclass
from framework.config import PortConfig
@@ -9,24 +16,35 @@
@dataclass(slots=True, frozen=True)
class PortIdentifier:
+ """The port identifier.
+
+ Attributes:
+ node: The node where the port resides.
+ pci: The PCI address of the port on `node`.
+ """
+
node: str
pci: str
@dataclass(slots=True)
class Port:
- """
- identifier: The PCI address of the port on a node.
-
- os_driver: The driver used by this port when the OS is controlling it.
- Example: i40e
- os_driver_for_dpdk: The driver the device must be bound to for DPDK to use it,
- Example: vfio-pci.
+ """Physical port on a node.
- Note: os_driver and os_driver_for_dpdk may be the same thing.
- Example: mlx5_core
+ The ports are identified by the node they're on and their PCI addresses. The port on the other
+ side of the connection is also captured here.
+ Each port is serviced by a driver, which may be different for the operating system (`os_driver`)
+ and for DPDK (`os_driver_for_dpdk`). For some devices, they are the same, e.g.: ``mlx5_core``.
- peer: The identifier of a port this port is connected with.
+ Attributes:
+ identifier: The PCI address of the port on a node.
+ os_driver: The operating system driver name when the operating system controls the port,
+ e.g.: ``i40e``.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK, e.g.: ``vfio-pci``.
+ peer: The identifier of a port this port is connected with.
+ The `peer` is on a different node.
+ mac_address: The MAC address of the port.
+ logical_name: The logical name of the port. Must be discovered.
"""
identifier: PortIdentifier
@@ -37,6 +55,12 @@ class Port:
logical_name: str = ""
def __init__(self, node_name: str, config: PortConfig):
+ """Initialize the port from `node_name` and `config`.
+
+ Args:
+ node_name: The name of the port's node.
+ config: The test run configuration of the port.
+ """
self.identifier = PortIdentifier(
node=node_name,
pci=config.pci,
@@ -47,14 +71,23 @@ def __init__(self, node_name: str, config: PortConfig):
@property
def node(self) -> str:
+ """The node where the port resides."""
return self.identifier.node
@property
def pci(self) -> str:
+ """The PCI address of the port."""
return self.identifier.pci
@dataclass(slots=True, frozen=True)
class PortLink:
+ """The physical, cabled connection between the ports.
+
+ Attributes:
+ sut_port: The port on the SUT node connected to `tg_port`.
+ tg_port: The port on the TG node connected to `sut_port`.
+ """
+
sut_port: Port
tg_port: Port
diff --git a/dts/framework/testbed_model/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
index eb664d9f17..e9b5e9c3be 100644
--- a/dts/framework/testbed_model/virtual_device.py
+++ b/dts/framework/testbed_model/virtual_device.py
@@ -1,16 +1,29 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Virtual devices model.
+
+Alongside support for physical hardware, DPDK can create various virtual devices.
+"""
+
class VirtualDevice(object):
- """
- Base class for virtual devices used by DPDK.
+ """Base class for virtual devices used by DPDK.
+
+ Attributes:
+ name: The name of the virtual device.
"""
name: str
def __init__(self, name: str):
+ """Initialize the virtual device.
+
+ Args:
+ name: The name of the virtual device.
+ """
self.name = name
def __str__(self) -> str:
+ """This corresponds to the name used for DPDK devices."""
return self.name
--
2.34.1
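The `PortIdentifier`/`Port` split documented in this patch — an immutable identifier dataclass plus properties on the port that forward to it — can be sketched as follows. The patch also uses ``slots=True`` on the dataclasses (Python 3.10+); it is omitted here for brevity, and the class names carry a ``Sketch`` suffix to mark them as illustrations:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class PortIdentifierSketch:
    """Sketch of PortIdentifier: an immutable (node, pci) pair."""

    node: str
    pci: str


class PortSketch:
    """Sketch of how Port exposes identifier fields via properties."""

    def __init__(self, identifier: PortIdentifierSketch) -> None:
        self.identifier = identifier

    @property
    def node(self) -> str:
        return self.identifier.node

    @property
    def pci(self) -> str:
        return self.identifier.pci


port = PortSketch(PortIdentifierSketch(node="sut1", pci="0000:00:08.0"))
print(port.node, port.pci)  # sut1 0000:00:08.0
```

Freezing the identifier means it is hashable and safe to share between `Port` and `PortLink` without risk of mutation.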
* [PATCH v7 14/21] dts: cpu docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (12 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 13/21] dts: port and virtual device " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-21 17:45 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 15/21] dts: os session " Juraj Linkeš
` (7 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/cpu.py | 196 +++++++++++++++++++++--------
1 file changed, 144 insertions(+), 52 deletions(-)
diff --git a/dts/framework/testbed_model/cpu.py b/dts/framework/testbed_model/cpu.py
index 8fe785dfe4..4edeb4a7c2 100644
--- a/dts/framework/testbed_model/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -1,6 +1,22 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""CPU core representation and filtering.
+
+This module provides a unified representation of logical CPU cores along
+with filtering capabilities.
+
+When symmetric multiprocessing (SMP or multithreading) is enabled on a server,
+the physical CPU cores are split into logical CPU cores with different IDs.
+
+:class:`LogicalCoreCountFilter` filters by the number of logical cores. It's possible to specify
+the socket from which to filter the number of logical cores. It's also possible to not use all
+logical CPU cores from each physical core (e.g. only the first logical core of each physical core).
+
+:class:`LogicalCoreListFilter` filters by logical core IDs. This mostly checks that
+the logical cores are actually present on the server.
+"""
+
import dataclasses
from abc import ABC, abstractmethod
from collections.abc import Iterable, ValuesView
@@ -11,9 +27,17 @@
@dataclass(slots=True, frozen=True)
class LogicalCore(object):
- """
- Representation of a CPU core. A physical core is represented in OS
- by multiple logical cores (lcores) if CPU multithreading is enabled.
+ """Representation of a logical CPU core.
+
+ A physical core is represented in OS by multiple logical cores (lcores)
+ if CPU multithreading is enabled. When multithreading is disabled, their IDs are the same.
+
+ Attributes:
+ lcore: The logical core ID of a CPU core. It's the same as `core` when
+ multithreading is disabled.
+ core: The physical core ID of a CPU core.
+ socket: The physical socket ID where the CPU resides.
+ node: The NUMA node ID where the CPU resides.
"""
lcore: int
@@ -22,27 +46,36 @@ class LogicalCore(object):
node: int
def __int__(self) -> int:
+ """The CPU is best represented by the logical core, as that's what we configure in EAL."""
return self.lcore
class LogicalCoreList(object):
- """
- Convert these options into a list of logical core ids.
- lcore_list=[LogicalCore1, LogicalCore2] - a list of LogicalCores
- lcore_list=[0,1,2,3] - a list of int indices
- lcore_list=['0','1','2-3'] - a list of str indices; ranges are supported
- lcore_list='0,1,2-3' - a comma delimited str of indices; ranges are supported
-
- The class creates a unified format used across the framework and allows
- the user to use either a str representation (using str(instance) or directly
- in f-strings) or a list representation (by accessing instance.lcore_list).
- Empty lcore_list is allowed.
+ r"""A unified way to store :class:`LogicalCore`\s.
+
+ Create a unified format used across the framework and allow the user to use
+ either a :class:`str` representation (using ``str(instance)`` or directly in f-strings)
+ or a :class:`list` representation (by accessing the `lcore_list` property,
+ which stores logical core IDs).
"""
_lcore_list: list[int]
_lcore_str: str
def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
+ """Process `lcore_list`, then sort.
+
+ There are four supported logical core list formats::
+
+ lcore_list=[LogicalCore1, LogicalCore2] # a list of LogicalCores
+ lcore_list=[0,1,2,3] # a list of int indices
+ lcore_list=['0','1','2-3'] # a list of str indices; ranges are supported
+ lcore_list='0,1,2-3' # a comma delimited str of indices; ranges are supported
+
+ Args:
+ lcore_list: Various ways to represent multiple logical cores.
+ Empty `lcore_list` is allowed.
+ """
self._lcore_list = []
if isinstance(lcore_list, str):
lcore_list = lcore_list.split(",")
@@ -60,6 +93,7 @@ def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
@property
def lcore_list(self) -> list[int]:
+ """The logical core IDs."""
return self._lcore_list
def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
@@ -89,28 +123,30 @@ def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
return formatted_core_list
def __str__(self) -> str:
+ """The consecutive ranges of logical core IDs."""
return self._lcore_str
@dataclasses.dataclass(slots=True, frozen=True)
class LogicalCoreCount(object):
- """
- Define the number of logical cores to use.
- If sockets is not None, socket_count is ignored.
- """
+ """Define the number of logical cores per physical cores per sockets."""
+ #: Use this many logical cores per each physical core.
lcores_per_core: int = 1
+ #: Use this many physical cores per each socket.
cores_per_socket: int = 2
+ #: Use this many sockets.
socket_count: int = 1
+ #: Use exactly these sockets. This takes precedence over `socket_count`,
+ #: so when `sockets` is not :data:`None`, `socket_count` is ignored.
sockets: list[int] | None = None
class LogicalCoreFilter(ABC):
- """
- Filter according to the input filter specifier. Each filter needs to be
- implemented in a derived class.
- This class only implements operations common to all filters, such as sorting
- the list to be filtered beforehand.
+ """Common filtering class.
+
+ Each filter needs to be implemented in a subclass. This base class sorts the list of cores
+ and defines the filtering method, which must be implemented by subclasses.
"""
_filter_specifier: LogicalCoreCount | LogicalCoreList
@@ -122,6 +158,17 @@ def __init__(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
):
+ """Filter according to the input filter specifier.
+
+ The input `lcore_list` is copied and sorted by physical core before filtering.
+ The list is copied so that the original is left intact.
+
+ Args:
+ lcore_list: The logical CPU cores to filter.
+ filter_specifier: Filter cores from `lcore_list` according to this filter.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+ """
self._filter_specifier = filter_specifier
# sorting by core is needed in case hyperthreading is enabled
@@ -132,31 +179,45 @@ def __init__(
@abstractmethod
def filter(self) -> list[LogicalCore]:
- """
- Use self._filter_specifier to filter self._lcores_to_filter
- and return the list of filtered LogicalCores.
- self._lcores_to_filter is a sorted copy of the original list,
- so it may be modified.
+ r"""Filter the cores.
+
+ Use `self._filter_specifier` to filter `self._lcores_to_filter` and return
+ the filtered :class:`LogicalCore`\s.
+ `self._lcores_to_filter` is a sorted copy of the original list, so it may be modified.
+
+ Returns:
+ The filtered cores.
"""
class LogicalCoreCountFilter(LogicalCoreFilter):
- """
+ """Filter cores by specified counts.
+
Filter the input list of LogicalCores according to specified rules:
- Use cores from the specified number of sockets or from the specified socket ids.
- If sockets is specified, it takes precedence over socket_count.
- From each of those sockets, use only cores_per_socket of cores.
- And for each core, use lcores_per_core of logical cores. Hypertheading
- must be enabled for this to take effect.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+
+ * The input `filter_specifier` is :class:`LogicalCoreCount`,
+ * Use cores from the specified number of sockets or from the specified socket ids,
+ * If `sockets` is specified, it takes precedence over `socket_count`,
+ * From each of those sockets, use only `cores_per_socket` of cores,
+ * And for each core, use `lcores_per_core` of logical cores. Hyperthreading
+ must be enabled for this to take effect.
"""
_filter_specifier: LogicalCoreCount
def filter(self) -> list[LogicalCore]:
+ """Filter the cores according to :class:`LogicalCoreCount`.
+
+ Start by filtering the allowed sockets. The cores matching the allowed sockets are returned.
+ The cores of each socket are stored in separate lists.
+
+ Then filter the allowed physical cores from those lists of cores per socket. When filtering
+ physical cores, store the desired number of logical cores per physical core which then
+ together constitute the final filtered list.
+
+ Returns:
+ The filtered cores.
+ """
sockets_to_filter = self._filter_sockets(self._lcores_to_filter)
filtered_lcores = []
for socket_to_filter in sockets_to_filter:
@@ -166,24 +227,37 @@ def filter(self) -> list[LogicalCore]:
def _filter_sockets(
self, lcores_to_filter: Iterable[LogicalCore]
) -> ValuesView[list[LogicalCore]]:
- """
- Remove all lcores that don't match the specified socket(s).
- If self._filter_specifier.sockets is not None, keep lcores from those sockets,
- otherwise keep lcores from the first
- self._filter_specifier.socket_count sockets.
+ """Filter a list of cores per each allowed socket.
+
+ The sockets may be specified in two ways: either as a number of sockets or as a specific
+ list of sockets. In case of a specific list, we just need to return the cores from those
+ sockets. If filtering by a number of sockets, we need to go through all cores, note which
+ sockets appear and only filter from the first n that appear.
+
+ Args:
+ lcores_to_filter: The cores to filter. These must be sorted by the physical core.
+
+ Returns:
+ A list of lists of logical CPU cores. Each list contains cores from one socket.
"""
allowed_sockets: set[int] = set()
socket_count = self._filter_specifier.socket_count
if self._filter_specifier.sockets:
+ # when sockets in filter is specified, the sockets are already set
socket_count = len(self._filter_specifier.sockets)
allowed_sockets = set(self._filter_specifier.sockets)
+ # filter socket_count sockets from all sockets by checking the socket of each CPU
filtered_lcores: dict[int, list[LogicalCore]] = {}
for lcore in lcores_to_filter:
if not self._filter_specifier.sockets:
+ # this is when sockets is not set, so we do the actual filtering
+ # when it is set, allowed_sockets is already defined and can't be changed
if len(allowed_sockets) < socket_count:
+ # allowed_sockets is a set, so adding an existing socket won't re-add it
allowed_sockets.add(lcore.socket)
if lcore.socket in allowed_sockets:
+ # separate lcores into sockets; this makes it easier in further processing
if lcore.socket in filtered_lcores:
filtered_lcores[lcore.socket].append(lcore)
else:
@@ -200,12 +274,13 @@ def _filter_sockets(
def _filter_cores_from_socket(
self, lcores_to_filter: Iterable[LogicalCore]
) -> list[LogicalCore]:
- """
- Keep only the first self._filter_specifier.cores_per_socket cores.
- In multithreaded environments, keep only
- the first self._filter_specifier.lcores_per_core lcores of those cores.
- """
+ """Filter a list of cores from the given socket.
+
+ Go through the cores and note how many logical cores per physical core have been filtered.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
# no need to use ordered dict, from Python3.7 the dict
# insertion order is preserved (FIFO).
lcore_count_per_core_map: dict[int, int] = {}
@@ -248,15 +323,21 @@ def _filter_cores_from_socket(
class LogicalCoreListFilter(LogicalCoreFilter):
- """
- Filter the input list of Logical Cores according to the input list of
- lcore indices.
- An empty LogicalCoreList won't filter anything.
+ """Filter the logical CPU cores by logical CPU core IDs.
+
+ This is a simple filter that looks at logical CPU core IDs and only filters those that match.
+
+ The input filter is :class:`LogicalCoreList`. An empty LogicalCoreList won't filter anything.
"""
_filter_specifier: LogicalCoreList
def filter(self) -> list[LogicalCore]:
+ """Filter based on logical CPU core ID.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
if not len(self._filter_specifier.lcore_list):
return self._lcores_to_filter
@@ -279,6 +360,17 @@ def lcore_filter(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool,
) -> LogicalCoreFilter:
+ """Factory for creating the filter that corresponds to `filter_specifier`.
+
+ Args:
+ core_list: The logical CPU cores to filter.
+ filter_specifier: The filter to use.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+
+ Returns:
+ The filter matching `filter_specifier`.
+ """
if isinstance(filter_specifier, LogicalCoreList):
return LogicalCoreListFilter(core_list, filter_specifier, ascending)
elif isinstance(filter_specifier, LogicalCoreCount):
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
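The socket-filtering walk that `_filter_sockets` documents above (keep cores from an explicit socket list, or from the first `socket_count` sockets that appear in the sorted core list, grouped per socket) can be sketched standalone. This is a minimal, hypothetical re-implementation for illustration, not the DTS framework code; the `LogicalCore` fields mirror the dataclass in the patch:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class LogicalCore:
    lcore: int
    core: int
    socket: int
    node: int


def filter_sockets(lcores, socket_count=1, sockets=None):
    # An explicit socket list takes precedence over socket_count.
    allowed = set(sockets) if sockets else set()
    if sockets:
        socket_count = len(sockets)
    grouped = {}
    # lcores must be sorted for "first n sockets" to be deterministic.
    for lc in lcores:
        if not sockets and len(allowed) < socket_count:
            # allowed is a set, so re-adding an existing socket is a no-op
            allowed.add(lc.socket)
        if lc.socket in allowed:
            grouped.setdefault(lc.socket, []).append(lc)
    return grouped


# 8 logical cores spread over 2 sockets, sorted by (socket, core)
cores = sorted(
    (LogicalCore(i, i // 2, i % 2, 0) for i in range(8)),
    key=lambda lc: (lc.socket, lc.core),
)
by_socket = filter_sockets(cores, socket_count=1)
```

With `socket_count=1`, only the first socket encountered (socket 0) is kept, yielding its four logical cores grouped under one key.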
* Re: [PATCH v7 14/21] dts: cpu docstring update
2023-11-15 13:09 ` [PATCH v7 14/21] dts: cpu " Juraj Linkeš
@ 2023-11-21 17:45 ` Yoan Picchi
2023-11-22 11:18 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-21 17:45 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/testbed_model/cpu.py | 196 +++++++++++++++++++++--------
> 1 file changed, 144 insertions(+), 52 deletions(-)
>
> diff --git a/dts/framework/testbed_model/cpu.py b/dts/framework/testbed_model/cpu.py
> index 8fe785dfe4..4edeb4a7c2 100644
> --- a/dts/framework/testbed_model/cpu.py
> +++ b/dts/framework/testbed_model/cpu.py
> @@ -1,6 +1,22 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> +"""CPU core representation and filtering.
> +
> +This module provides a unified representation of logical CPU cores along
> +with filtering capabilities.
> +
> +When symmetric multiprocessing (SMP or multithreading) is enabled on a server,
> +the physical CPU cores are split into logical CPU cores with different IDs.
> +
> +:class:`LogicalCoreCountFilter` filters by the number of logical cores. It's possible to specify
> +the socket from which to filter the number of logical cores. It's also possible to not use all
> +logical CPU cores from each physical core (e.g. only the first logical core of each physical core).
> +
> +:class:`LogicalCoreListFilter` filters by logical core IDs. This mostly checks that
> +the logical cores are actually present on the server.
> +"""
> +
> import dataclasses
> from abc import ABC, abstractmethod
> from collections.abc import Iterable, ValuesView
> @@ -11,9 +27,17 @@
>
> @dataclass(slots=True, frozen=True)
> class LogicalCore(object):
> - """
> - Representation of a CPU core. A physical core is represented in OS
> - by multiple logical cores (lcores) if CPU multithreading is enabled.
> + """Representation of a logical CPU core.
> +
> + A physical core is represented in OS by multiple logical cores (lcores)
> + if CPU multithreading is enabled. When multithreading is disabled, their IDs are the same.
> +
> + Attributes:
> + lcore: The logical core ID of a CPU core. It's the same as `core` with
> + disabled multithreading.
> + core: The physical core ID of a CPU core.
> + socket: The physical socket ID where the CPU resides.
> + node: The NUMA node ID where the CPU resides.
> """
>
> lcore: int
> @@ -22,27 +46,36 @@ class LogicalCore(object):
> node: int
>
> def __int__(self) -> int:
> + """The CPU is best represented by the logical core, as that's what we configure in EAL."""
> return self.lcore
>
>
> class LogicalCoreList(object):
> - """
> - Convert these options into a list of logical core ids.
> - lcore_list=[LogicalCore1, LogicalCore2] - a list of LogicalCores
> - lcore_list=[0,1,2,3] - a list of int indices
> - lcore_list=['0','1','2-3'] - a list of str indices; ranges are supported
> - lcore_list='0,1,2-3' - a comma delimited str of indices; ranges are supported
> -
> - The class creates a unified format used across the framework and allows
> - the user to use either a str representation (using str(instance) or directly
> - in f-strings) or a list representation (by accessing instance.lcore_list).
> - Empty lcore_list is allowed.
> + r"""A unified way to store :class:`LogicalCore`\s.
> +
> + Create a unified format used across the framework and allow the user to use
> + either a :class:`str` representation (using ``str(instance)`` or directly in f-strings)
> + or a :class:`list` representation (by accessing the `lcore_list` property,
> + which stores logical core IDs).
> """
>
> _lcore_list: list[int]
> _lcore_str: str
>
> def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
> + """Process `lcore_list`, then sort.
> +
> + There are four supported logical core list formats::
> +
> + lcore_list=[LogicalCore1, LogicalCore2] # a list of LogicalCores
> + lcore_list=[0,1,2,3] # a list of int indices
> + lcore_list=['0','1','2-3'] # a list of str indices; ranges are supported
> + lcore_list='0,1,2-3' # a comma delimited str of indices; ranges are supported
> +
> + Args:
> + lcore_list: Various ways to represent multiple logical cores.
> + Empty `lcore_list` is allowed.
> + """
> self._lcore_list = []
> if isinstance(lcore_list, str):
> lcore_list = lcore_list.split(",")
> @@ -60,6 +93,7 @@ def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
>
> @property
> def lcore_list(self) -> list[int]:
> + """The logical core IDs."""
> return self._lcore_list
>
> def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
> @@ -89,28 +123,30 @@ def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
> return formatted_core_list
>
> def __str__(self) -> str:
> + """The consecutive ranges of logical core IDs."""
> return self._lcore_str
>
>
> @dataclasses.dataclass(slots=True, frozen=True)
> class LogicalCoreCount(object):
> - """
> - Define the number of logical cores to use.
> - If sockets is not None, socket_count is ignored.
> - """
> + """Define the number of logical cores per physical cores per sockets."""
>
> + #: Use this many logical cores per each physical core.
> lcores_per_core: int = 1
> + #: Use this many physical cores per each socket.
> cores_per_socket: int = 2
> + #: Use this many sockets.
> socket_count: int = 1
> + #: Use exactly these sockets. This takes precedence over `socket_count`,
> + #: so when `sockets` is not :data:`None`, `socket_count` is ignored.
> sockets: list[int] | None = None
>
>
> class LogicalCoreFilter(ABC):
> - """
> - Filter according to the input filter specifier. Each filter needs to be
> - implemented in a derived class.
> - This class only implements operations common to all filters, such as sorting
> - the list to be filtered beforehand.
> + """Common filtering class.
> +
> + Each filter needs to be implemented in a subclass. This base class sorts the list of cores
> + and defines the filtering method, which must be implemented by subclasses.
> """
>
> _filter_specifier: LogicalCoreCount | LogicalCoreList
> @@ -122,6 +158,17 @@ def __init__(
> filter_specifier: LogicalCoreCount | LogicalCoreList,
> ascending: bool = True,
> ):
> + """Filter according to the input filter specifier.
> +
> + The input `lcore_list` is copied and sorted by physical core before filtering.
> + The list is copied so that the original is left intact.
> +
> + Args:
> + lcore_list: The logical CPU cores to filter.
> + filter_specifier: Filter cores from `lcore_list` according to this filter.
> + ascending: Sort cores in ascending order (lowest to highest IDs). If data:`False`,
> + sort in descending order.
> + """
> self._filter_specifier = filter_specifier
>
> # sorting by core is needed in case hyperthreading is enabled
> @@ -132,31 +179,45 @@ def __init__(
>
> @abstractmethod
> def filter(self) -> list[LogicalCore]:
> - """
> - Use self._filter_specifier to filter self._lcores_to_filter
> - and return the list of filtered LogicalCores.
> - self._lcores_to_filter is a sorted copy of the original list,
> - so it may be modified.
> + r"""Filter the cores.
> +
> + Use `self._filter_specifier` to filter `self._lcores_to_filter` and return
> + the filtered :class:`LogicalCore`\s.
> + `self._lcores_to_filter` is a sorted copy of the original list, so it may be modified.
> +
> + Returns:
> + The filtered cores.
> """
>
>
> class LogicalCoreCountFilter(LogicalCoreFilter):
> - """
> + """Filter cores by specified counts.
> +
> Filter the input list of LogicalCores according to specified rules:
> - Use cores from the specified number of sockets or from the specified socket ids.
> - If sockets is specified, it takes precedence over socket_count.
> - From each of those sockets, use only cores_per_socket of cores.
> - And for each core, use lcores_per_core of logical cores. Hypertheading
> - must be enabled for this to take effect.
> - If ascending is True, use cores with the lowest numerical id first
> - and continue in ascending order. If False, start with the highest
> - id and continue in descending order. This ordering affects which
> - sockets to consider first as well.
> +
> + * The input `filter_specifier` is :class:`LogicalCoreCount`,
> + * Use cores from the specified number of sockets or from the specified socket ids,
> + * If `sockets` is specified, it takes precedence over `socket_count`,
> + * From each of those sockets, use only `cores_per_socket` of cores,
> + * And for each core, use `lcores_per_core` of logical cores. Hypertheading
> + must be enabled for this to take effect.
> """
>
> _filter_specifier: LogicalCoreCount
>
> def filter(self) -> list[LogicalCore]:
> + """Filter the cores according to :class:`LogicalCoreCount`.
> +
> + Start by filtering the allowed sockets. The cores matching the allowed socket are returned.
allowed socket*s*
> + The cores of each socket are stored in separate lists.
> +
> + Then filter the allowed physical cores from those lists of cores per socket. When filtering
> + physical cores, store the desired number of logical cores per physical core which then
> + together constitute the final filtered list.
> +
> + Returns:
> + The filtered cores.
> + """
> sockets_to_filter = self._filter_sockets(self._lcores_to_filter)
> filtered_lcores = []
> for socket_to_filter in sockets_to_filter:
> @@ -166,24 +227,37 @@ def filter(self) -> list[LogicalCore]:
> def _filter_sockets(
> self, lcores_to_filter: Iterable[LogicalCore]
> ) -> ValuesView[list[LogicalCore]]:
> - """
> - Remove all lcores that don't match the specified socket(s).
> - If self._filter_specifier.sockets is not None, keep lcores from those sockets,
> - otherwise keep lcores from the first
> - self._filter_specifier.socket_count sockets.
> + """Filter a list of cores per each allowed socket.
> +
> + The sockets may be specified in two ways, either a number or a specific list of sockets.
> + In case of a specific list, we just need to return the cores from those sockets.
> + If filtering a number of cores, we need to go through all cores and note which sockets
> + appear and only filter from the first n that appear.
> +
> + Args:
> + lcores_to_filter: The cores to filter. These must be sorted by the physical core.
> +
> + Returns:
> + A list of lists of logical CPU cores. Each list contains cores from one socket.
> """
> allowed_sockets: set[int] = set()
> socket_count = self._filter_specifier.socket_count
> if self._filter_specifier.sockets:
> + # when sockets in filter is specified, the sockets are already set
> socket_count = len(self._filter_specifier.sockets)
> allowed_sockets = set(self._filter_specifier.sockets)
>
> + # filter socket_count sockets from all sockets by checking the socket of each CPU
> filtered_lcores: dict[int, list[LogicalCore]] = {}
> for lcore in lcores_to_filter:
> if not self._filter_specifier.sockets:
> + # this is when sockets is not set, so we do the actual filtering
> + # when it is set, allowed_sockets is already defined and can't be changed
> if len(allowed_sockets) < socket_count:
> + # allowed_sockets is a set, so adding an existing socket won't re-add it
> allowed_sockets.add(lcore.socket)
> if lcore.socket in allowed_sockets:
> + # separate sockets per socket; this makes it easier in further processing
socket*s* per socket ?
> if lcore.socket in filtered_lcores:
> filtered_lcores[lcore.socket].append(lcore)
> else:
> @@ -200,12 +274,13 @@ def _filter_sockets(
> def _filter_cores_from_socket(
> self, lcores_to_filter: Iterable[LogicalCore]
> ) -> list[LogicalCore]:
> - """
> - Keep only the first self._filter_specifier.cores_per_socket cores.
> - In multithreaded environments, keep only
> - the first self._filter_specifier.lcores_per_core lcores of those cores.
> - """
> + """Filter a list of cores from the given socket.
> +
> + Go through the cores and note how many logical cores per physical core have been filtered.
>
> + Returns:
> + The filtered logical CPU cores.
> + """
> # no need to use ordered dict, from Python3.7 the dict
> # insertion order is preserved (LIFO).
> lcore_count_per_core_map: dict[int, int] = {}
> @@ -248,15 +323,21 @@ def _filter_cores_from_socket(
>
>
> class LogicalCoreListFilter(LogicalCoreFilter):
> - """
> - Filter the input list of Logical Cores according to the input list of
> - lcore indices.
> - An empty LogicalCoreList won't filter anything.
> + """Filter the logical CPU cores by logical CPU core IDs.
> +
> + This is a simple filter that looks at logical CPU IDs and only filter those that match.
> +
> + The input filter is :class:`LogicalCoreList`. An empty LogicalCoreList won't filter anything.
> """
>
> _filter_specifier: LogicalCoreList
>
> def filter(self) -> list[LogicalCore]:
> + """Filter based on logical CPU core ID.
> +
> + Return:
> + The filtered logical CPU cores.
> + """
> if not len(self._filter_specifier.lcore_list):
> return self._lcores_to_filter
>
> @@ -279,6 +360,17 @@ def lcore_filter(
> filter_specifier: LogicalCoreCount | LogicalCoreList,
> ascending: bool,
> ) -> LogicalCoreFilter:
> + """Factory for using the right filter with `filter_specifier`.
> +
> + Args:
> + core_list: The logical CPU cores to filter.
> + filter_specifier: The filter to use.
> + ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
> + sort in descending order.
> +
> + Returns:
> + The filter matching `filter_specifier`.
> + """
> if isinstance(filter_specifier, LogicalCoreList):
> return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> elif isinstance(filter_specifier, LogicalCoreCount):
^ permalink raw reply [flat|nested] 393+ messages in thread
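The four `lcore_list` input formats documented in the `LogicalCoreList.__init__` docstring above imply a range-expanding parse. The following is a hypothetical sketch of that parse for the `str` and `int` formats (the real class also accepts `LogicalCore` objects and builds the consecutive-range string form used by `__str__`):

```python
def parse_lcore_list(spec):
    """Expand int/str/range items into a sorted list of unique core IDs."""
    # Accept either a comma-delimited string or a list of items.
    items = spec.split(",") if isinstance(spec, str) else spec
    ids = []
    for item in items:
        s = str(item)
        if "-" in s:
            # 'lo-hi' ranges are inclusive on both ends
            lo, hi = map(int, s.split("-"))
            ids.extend(range(lo, hi + 1))
        else:
            ids.append(int(s))
    return sorted(set(ids))


print(parse_lcore_list("0,1,2-3"))  # [0, 1, 2, 3]
```

All four string/int spellings of the same cores collapse to one canonical list, which is the "unified format" the docstring describes.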
* Re: [PATCH v7 14/21] dts: cpu docstring update
2023-11-21 17:45 ` Yoan Picchi
@ 2023-11-22 11:18 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 11:18 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Tue, Nov 21, 2023 at 6:45 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/testbed_model/cpu.py | 196 +++++++++++++++++++++--------
> > 1 file changed, 144 insertions(+), 52 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/cpu.py b/dts/framework/testbed_model/cpu.py
> > index 8fe785dfe4..4edeb4a7c2 100644
> > --- a/dts/framework/testbed_model/cpu.py
> > +++ b/dts/framework/testbed_model/cpu.py
> > @@ -1,6 +1,22 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >
> > +"""CPU core representation and filtering.
> > +
> > +This module provides a unified representation of logical CPU cores along
> > +with filtering capabilities.
> > +
> > +When symmetric multiprocessing (SMP or multithreading) is enabled on a server,
> > +the physical CPU cores are split into logical CPU cores with different IDs.
> > +
> > +:class:`LogicalCoreCountFilter` filters by the number of logical cores. It's possible to specify
> > +the socket from which to filter the number of logical cores. It's also possible to not use all
> > +logical CPU cores from each physical core (e.g. only the first logical core of each physical core).
> > +
> > +:class:`LogicalCoreListFilter` filters by logical core IDs. This mostly checks that
> > +the logical cores are actually present on the server.
> > +"""
> > +
> > import dataclasses
> > from abc import ABC, abstractmethod
> > from collections.abc import Iterable, ValuesView
> > @@ -11,9 +27,17 @@
> >
> > @dataclass(slots=True, frozen=True)
> > class LogicalCore(object):
> > - """
> > - Representation of a CPU core. A physical core is represented in OS
> > - by multiple logical cores (lcores) if CPU multithreading is enabled.
> > + """Representation of a logical CPU core.
> > +
> > + A physical core is represented in OS by multiple logical cores (lcores)
> > + if CPU multithreading is enabled. When multithreading is disabled, their IDs are the same.
> > +
> > + Attributes:
> > + lcore: The logical core ID of a CPU core. It's the same as `core` with
> > + disabled multithreading.
> > + core: The physical core ID of a CPU core.
> > + socket: The physical socket ID where the CPU resides.
> > + node: The NUMA node ID where the CPU resides.
> > """
> >
> > lcore: int
> > @@ -22,27 +46,36 @@ class LogicalCore(object):
> > node: int
> >
> > def __int__(self) -> int:
> > + """The CPU is best represented by the logical core, as that's what we configure in EAL."""
> > return self.lcore
> >
> >
> > class LogicalCoreList(object):
> > - """
> > - Convert these options into a list of logical core ids.
> > - lcore_list=[LogicalCore1, LogicalCore2] - a list of LogicalCores
> > - lcore_list=[0,1,2,3] - a list of int indices
> > - lcore_list=['0','1','2-3'] - a list of str indices; ranges are supported
> > - lcore_list='0,1,2-3' - a comma delimited str of indices; ranges are supported
> > -
> > - The class creates a unified format used across the framework and allows
> > - the user to use either a str representation (using str(instance) or directly
> > - in f-strings) or a list representation (by accessing instance.lcore_list).
> > - Empty lcore_list is allowed.
> > + r"""A unified way to store :class:`LogicalCore`\s.
> > +
> > + Create a unified format used across the framework and allow the user to use
> > + either a :class:`str` representation (using ``str(instance)`` or directly in f-strings)
> > + or a :class:`list` representation (by accessing the `lcore_list` property,
> > + which stores logical core IDs).
> > """
> >
> > _lcore_list: list[int]
> > _lcore_str: str
> >
> > def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
> > + """Process `lcore_list`, then sort.
> > +
> > + There are four supported logical core list formats::
> > +
> > + lcore_list=[LogicalCore1, LogicalCore2] # a list of LogicalCores
> > + lcore_list=[0,1,2,3] # a list of int indices
> > + lcore_list=['0','1','2-3'] # a list of str indices; ranges are supported
> > + lcore_list='0,1,2-3' # a comma delimited str of indices; ranges are supported
> > +
> > + Args:
> > + lcore_list: Various ways to represent multiple logical cores.
> > + Empty `lcore_list` is allowed.
> > + """
> > self._lcore_list = []
> > if isinstance(lcore_list, str):
> > lcore_list = lcore_list.split(",")
> > @@ -60,6 +93,7 @@ def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
> >
> > @property
> > def lcore_list(self) -> list[int]:
> > + """The logical core IDs."""
> > return self._lcore_list
> >
> > def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
> > @@ -89,28 +123,30 @@ def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
> > return formatted_core_list
> >
> > def __str__(self) -> str:
> > + """The consecutive ranges of logical core IDs."""
> > return self._lcore_str
> >
> >
> > @dataclasses.dataclass(slots=True, frozen=True)
> > class LogicalCoreCount(object):
> > - """
> > - Define the number of logical cores to use.
> > - If sockets is not None, socket_count is ignored.
> > - """
> > + """Define the number of logical cores per physical cores per sockets."""
> >
> > + #: Use this many logical cores per each physical core.
> > lcores_per_core: int = 1
> > + #: Use this many physical cores per each socket.
> > cores_per_socket: int = 2
> > + #: Use this many sockets.
> > socket_count: int = 1
> > + #: Use exactly these sockets. This takes precedence over `socket_count`,
> > + #: so when `sockets` is not :data:`None`, `socket_count` is ignored.
> > sockets: list[int] | None = None
> >
> >
> > class LogicalCoreFilter(ABC):
> > - """
> > - Filter according to the input filter specifier. Each filter needs to be
> > - implemented in a derived class.
> > - This class only implements operations common to all filters, such as sorting
> > - the list to be filtered beforehand.
> > + """Common filtering class.
> > +
> > + Each filter needs to be implemented in a subclass. This base class sorts the list of cores
> > + and defines the filtering method, which must be implemented by subclasses.
> > """
> >
> > _filter_specifier: LogicalCoreCount | LogicalCoreList
> > @@ -122,6 +158,17 @@ def __init__(
> > filter_specifier: LogicalCoreCount | LogicalCoreList,
> > ascending: bool = True,
> > ):
> > + """Filter according to the input filter specifier.
> > +
> > + The input `lcore_list` is copied and sorted by physical core before filtering.
> > + The list is copied so that the original is left intact.
> > +
> > + Args:
> > + lcore_list: The logical CPU cores to filter.
> > + filter_specifier: Filter cores from `lcore_list` according to this filter.
> > + ascending: Sort cores in ascending order (lowest to highest IDs). If data:`False`,
> > + sort in descending order.
> > + """
> > self._filter_specifier = filter_specifier
> >
> > # sorting by core is needed in case hyperthreading is enabled
> > @@ -132,31 +179,45 @@ def __init__(
> >
> > @abstractmethod
> > def filter(self) -> list[LogicalCore]:
> > - """
> > - Use self._filter_specifier to filter self._lcores_to_filter
> > - and return the list of filtered LogicalCores.
> > - self._lcores_to_filter is a sorted copy of the original list,
> > - so it may be modified.
> > + r"""Filter the cores.
> > +
> > + Use `self._filter_specifier` to filter `self._lcores_to_filter` and return
> > + the filtered :class:`LogicalCore`\s.
> > + `self._lcores_to_filter` is a sorted copy of the original list, so it may be modified.
> > +
> > + Returns:
> > + The filtered cores.
> > """
> >
> >
> > class LogicalCoreCountFilter(LogicalCoreFilter):
> > - """
> > + """Filter cores by specified counts.
> > +
> > Filter the input list of LogicalCores according to specified rules:
> > - Use cores from the specified number of sockets or from the specified socket ids.
> > - If sockets is specified, it takes precedence over socket_count.
> > - From each of those sockets, use only cores_per_socket of cores.
> > - And for each core, use lcores_per_core of logical cores. Hypertheading
> > - must be enabled for this to take effect.
> > - If ascending is True, use cores with the lowest numerical id first
> > - and continue in ascending order. If False, start with the highest
> > - id and continue in descending order. This ordering affects which
> > - sockets to consider first as well.
> > +
> > + * The input `filter_specifier` is :class:`LogicalCoreCount`,
> > + * Use cores from the specified number of sockets or from the specified socket ids,
> > + * If `sockets` is specified, it takes precedence over `socket_count`,
> > + * From each of those sockets, use only `cores_per_socket` of cores,
> > + * And for each core, use `lcores_per_core` of logical cores. Hypertheading
> > + must be enabled for this to take effect.
> > """
> >
> > _filter_specifier: LogicalCoreCount
> >
> > def filter(self) -> list[LogicalCore]:
> > + """Filter the cores according to :class:`LogicalCoreCount`.
> > +
> > + Start by filtering the allowed sockets. The cores matching the allowed socket are returned.
>
> allowed socket*s*
>
Ack.
> > + The cores of each socket are stored in separate lists.
> > +
> > + Then filter the allowed physical cores from those lists of cores per socket. When filtering
> > + physical cores, store the desired number of logical cores per physical core which then
> > + together constitute the final filtered list.
> > +
> > + Returns:
> > + The filtered cores.
> > + """
> > sockets_to_filter = self._filter_sockets(self._lcores_to_filter)
> > filtered_lcores = []
> > for socket_to_filter in sockets_to_filter:
> > @@ -166,24 +227,37 @@ def filter(self) -> list[LogicalCore]:
> > def _filter_sockets(
> > self, lcores_to_filter: Iterable[LogicalCore]
> > ) -> ValuesView[list[LogicalCore]]:
> > - """
> > - Remove all lcores that don't match the specified socket(s).
> > - If self._filter_specifier.sockets is not None, keep lcores from those sockets,
> > - otherwise keep lcores from the first
> > - self._filter_specifier.socket_count sockets.
> > + """Filter a list of cores per each allowed socket.
> > +
> > + The sockets may be specified in two ways, either a number or a specific list of sockets.
> > + In case of a specific list, we just need to return the cores from those sockets.
> > + If filtering a number of cores, we need to go through all cores and note which sockets
> > + appear and only filter from the first n that appear.
> > +
> > + Args:
> > + lcores_to_filter: The cores to filter. These must be sorted by the physical core.
> > +
> > + Returns:
> > + A list of lists of logical CPU cores. Each list contains cores from one socket.
> > """
> > allowed_sockets: set[int] = set()
> > socket_count = self._filter_specifier.socket_count
> > if self._filter_specifier.sockets:
> > + # when sockets in filter is specified, the sockets are already set
> > socket_count = len(self._filter_specifier.sockets)
> > allowed_sockets = set(self._filter_specifier.sockets)
> >
> > + # filter socket_count sockets from all sockets by checking the socket of each CPU
> > filtered_lcores: dict[int, list[LogicalCore]] = {}
> > for lcore in lcores_to_filter:
> > if not self._filter_specifier.sockets:
> > + # this is when sockets is not set, so we do the actual filtering
> > + # when it is set, allowed_sockets is already defined and can't be changed
> > if len(allowed_sockets) < socket_count:
> > + # allowed_sockets is a set, so adding an existing socket won't re-add it
> > allowed_sockets.add(lcore.socket)
> > if lcore.socket in allowed_sockets:
> > + # separate sockets per socket; this makes it easier in further processing
>
> socket*s* per socket ?
>
Good catch, this should be "separate lcores into sockets".
> > if lcore.socket in filtered_lcores:
> > filtered_lcores[lcore.socket].append(lcore)
> > else:
> > @@ -200,12 +274,13 @@ def _filter_sockets(
> > def _filter_cores_from_socket(
> > self, lcores_to_filter: Iterable[LogicalCore]
> > ) -> list[LogicalCore]:
> > - """
> > - Keep only the first self._filter_specifier.cores_per_socket cores.
> > - In multithreaded environments, keep only
> > - the first self._filter_specifier.lcores_per_core lcores of those cores.
> > - """
> > + """Filter a list of cores from the given socket.
> > +
> > + Go through the cores and note how many logical cores per physical core have been filtered.
> >
> > + Returns:
> > + The filtered logical CPU cores.
> > + """
> > # no need to use ordered dict, from Python3.7 the dict
> > # insertion order is preserved (LIFO).
> > lcore_count_per_core_map: dict[int, int] = {}
> > @@ -248,15 +323,21 @@ def _filter_cores_from_socket(
> >
> >
> > class LogicalCoreListFilter(LogicalCoreFilter):
> > - """
> > - Filter the input list of Logical Cores according to the input list of
> > - lcore indices.
> > - An empty LogicalCoreList won't filter anything.
> > + """Filter the logical CPU cores by logical CPU core IDs.
> > +
> > + This is a simple filter that looks at logical CPU IDs and only filters those that match.
> > +
> > + The input filter is :class:`LogicalCoreList`. An empty LogicalCoreList won't filter anything.
> > """
> >
> > _filter_specifier: LogicalCoreList
> >
> > def filter(self) -> list[LogicalCore]:
> > + """Filter based on logical CPU core ID.
> > +
> > + Returns:
> > + The filtered logical CPU cores.
> > + """
> > if not len(self._filter_specifier.lcore_list):
> > return self._lcores_to_filter
> >
> > @@ -279,6 +360,17 @@ def lcore_filter(
> > filter_specifier: LogicalCoreCount | LogicalCoreList,
> > ascending: bool,
> > ) -> LogicalCoreFilter:
> > + """Factory for using the right filter with `filter_specifier`.
> > +
> > + Args:
> > + core_list: The logical CPU cores to filter.
> > + filter_specifier: The filter to use.
> > + ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
> > + sort in descending order.
> > +
> > + Returns:
> > + The filter matching `filter_specifier`.
> > + """
> > if isinstance(filter_specifier, LogicalCoreList):
> > return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> > elif isinstance(filter_specifier, LogicalCoreCount):
>
^ permalink raw reply [flat|nested] 393+ messages in thread
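The first-n-sockets filtering discussed in this sub-thread can be sketched standalone. The names below (`LogicalCore` as a namedtuple, `filter_by_socket_count`) are simplified stand-ins chosen for illustration, not the actual DTS classes:

```python
from collections import namedtuple

# Simplified stand-in for framework.testbed_model.cpu.LogicalCore.
LogicalCore = namedtuple("LogicalCore", ["lcore", "core", "socket"])

def filter_by_socket_count(lcores, socket_count):
    """Keep lcores from the first `socket_count` sockets that appear.

    `lcores` must be sorted by physical core so sockets are encountered
    in a deterministic order.
    """
    allowed_sockets = set()
    filtered = {}
    for lcore in lcores:
        if len(allowed_sockets) < socket_count:
            # a set ignores re-additions of an already-seen socket
            allowed_sockets.add(lcore.socket)
        if lcore.socket in allowed_sockets:
            # separate lcores into per-socket lists for further processing
            filtered.setdefault(lcore.socket, []).append(lcore)
    return list(filtered.values())

cores = [
    LogicalCore(lcore=0, core=0, socket=0),
    LogicalCore(lcore=1, core=1, socket=0),
    LogicalCore(lcore=2, core=2, socket=1),
    LogicalCore(lcore=3, core=3, socket=2),
]
per_socket = filter_by_socket_count(cores, socket_count=2)
# sockets 0 and 1 are the first two encountered; socket 2 is dropped
```

The explicit-socket-list case reduces to pre-seeding `allowed_sockets`, which is why the patch skips the counting branch when `sockets` is set.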
* [PATCH v7 15/21] dts: os session docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (13 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 14/21] dts: cpu " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-22 11:50 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 16/21] dts: posix and linux sessions " Juraj Linkeš
` (6 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/os_session.py | 275 ++++++++++++++++------
1 file changed, 208 insertions(+), 67 deletions(-)
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 76e595a518..72b9193a61 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -2,6 +2,29 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""OS-aware remote session.
+
+DPDK supports multiple different operating systems, meaning it can run on these different operating
+systems. This module defines the common API that OS-unaware layers use and translates the API into
+OS-aware calls/utility usage.
+
+Note:
+ Running commands with administrative privileges requires OS awareness. This is the only layer
+ that's aware of OS differences, so this is where non-privileged command get converted
+ to privileged commands.
+
+Example:
+ A user wishes to remove a directory on
+ a remote :class:`~framework.testbed_model.sut_node.SutNode`.
+ The :class:`~framework.testbed_model.sut_node.SutNode` object isn't aware what OS the node
+ is running - it delegates the OS translation logic
+ to :attr:`~framework.testbed_model.node.Node.main_session`. The SUT node calls
+ :meth:`~OSSession.remove_remote_dir` with a generic, OS-unaware path and
+ the :attr:`~framework.testbed_model.node.Node.main_session` translates that
+ to ``rm -rf`` if the node's OS is Linux and other commands for other OSs.
+ It also translates the path to match the underlying OS.
+"""
+
from abc import ABC, abstractmethod
from collections.abc import Iterable
from ipaddress import IPv4Interface, IPv6Interface
@@ -28,10 +51,16 @@
class OSSession(ABC):
- """
- The OS classes create a DTS node remote session and implement OS specific
+ """OS-unaware to OS-aware translation API definition.
+
+ The OSSession classes create a remote session to a DTS node and implement OS specific
behavior. There are a few control methods implemented by the base class, the rest need
- to be implemented by derived classes.
+ to be implemented by subclasses.
+
+ Attributes:
+ name: The name of the session.
+ remote_session: The remote session maintaining the connection to the node.
+ interactive_session: The interactive remote session maintaining the connection to the node.
"""
_config: NodeConfiguration
@@ -46,6 +75,15 @@ def __init__(
name: str,
logger: DTSLOG,
):
+ """Initialize the OS-aware session.
+
+ Connect to the node right away and also create an interactive remote session.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
self._config = node_config
self.name = name
self._logger = logger
@@ -53,15 +91,15 @@ def __init__(
self.interactive_session = create_interactive_session(node_config, logger)
def close(self, force: bool = False) -> None:
- """
- Close the remote session.
+ """Close the underlying remote session.
+
+ Args:
+ force: Force the closure of the connection.
"""
self.remote_session.close(force)
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the underlying remote session is still responding."""
return self.remote_session.is_alive()
def send_command(
@@ -72,10 +110,23 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- An all-purpose API in case the command to be executed is already
- OS-agnostic, such as when the path to the executed command has been
- constructed beforehand.
+ """An all-purpose API for OS-agnostic commands.
+
+ This can be used for the execution of a portable command that's run the same way
+ on all operating systems, such as Python.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds to execute the command.
+ privileged: Whether to run the command with administrative privileges.
+ verify: If :data:`True`, will check the exit code of the command.
+ env: A dictionary with environment variables to be used with the command execution.
+
+ Raises:
+ RemoteCommandExecutionError: If verify is :data:`True` and the command failed.
"""
if privileged:
command = self._get_privileged_command(command)
@@ -89,8 +140,20 @@ def create_interactive_shell(
privileged: bool,
app_args: str,
) -> InteractiveShellType:
- """
- See "create_interactive_shell" in SutNode
+ """Factory for interactive session handlers.
+
+ Instantiate `shell_cls` according to the remote OS specifics.
+
+ Args:
+ shell_cls: The class of the shell.
+ timeout: Timeout for reading output from the SSH channel. If you are
+ reading from the buffer and don't receive any data within the timeout
+ it will throw an error.
+ privileged: Whether to run the shell with administrative privileges.
+ app_args: The arguments to be passed to the application.
+
+ Returns:
+ An instance of the desired interactive application shell.
"""
return shell_cls(
self.interactive_session.session,
@@ -114,27 +177,42 @@ def _get_privileged_command(command: str) -> str:
@abstractmethod
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
- """
- Try to find DPDK remote dir in remote_dir.
+ """Try to find DPDK directory in `remote_dir`.
+
+ The directory is the one which is created after the extraction of the tarball. The files
+ are usually extracted into a directory starting with ``dpdk-``.
+
+ Returns:
+ The absolute path of the DPDK remote directory, empty path if not found.
"""
@abstractmethod
def get_remote_tmp_dir(self) -> PurePath:
- """
- Get the path of the temporary directory of the remote OS.
+ """Get the path of the temporary directory of the remote OS.
+
+ Returns:
+ The absolute path of the temporary directory.
"""
@abstractmethod
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for the target architecture. Get
- information from the node if needed.
+ """Create extra environment variables needed for the target architecture.
+
+ Different architectures may require different configuration, such as setting 32-bit CFLAGS.
+
+ Returns:
+ A dictionary with keys as environment variables.
"""
@abstractmethod
def join_remote_path(self, *args: str | PurePath) -> PurePath:
- """
- Join path parts using the path separator that fits the remote OS.
+ """Join path parts using the path separator that fits the remote OS.
+
+ Args:
+ args: Any number of paths to join.
+
+ Returns:
+ The resulting joined path.
"""
@abstractmethod
@@ -143,13 +221,13 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from the remote Node to the local filesystem.
+ """Copy a file from the remote node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote node associated with this remote
+ session to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
+ source_file: the file on the remote node.
destination_file: a file or directory path on the local filesystem.
"""
@@ -159,14 +237,14 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from local filesystem to the remote Node.
+ """Copy a file from local filesystem to the remote node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file`
+ on the remote node associated with this remote session.
Args:
source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ destination_file: a file or directory path on the remote node.
"""
@abstractmethod
@@ -176,8 +254,12 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
- """
- Remove remote directory, by default remove recursively and forcefully.
+ """Remove remote directory, by default remove recursively and forcefully.
+
+ Args:
+ remote_dir_path: The path of the directory to remove.
+ recursive: If :data:`True`, also remove all contents inside the directory.
+ force: If :data:`True`, ignore all warnings and try to remove at all costs.
"""
@abstractmethod
@@ -186,9 +268,12 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
- """
- Extract remote tarball in place. If expected_dir is a non-empty string, check
- whether the dir exists after extracting the archive.
+ """Extract remote tarball in its remote directory.
+
+ Args:
+ remote_tarball_path: The path of the tarball on the remote node.
+ expected_dir: If non-empty, check whether `expected_dir` exists after extracting
+ the archive.
"""
@abstractmethod
@@ -201,69 +286,119 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
- """
- Build DPDK in the input dir with specified environment variables and meson
- arguments.
+ """Build DPDK on the remote node.
+
+ An extracted DPDK tarball must be present on the node. The build consists of two steps::
+
+ meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
+ ninja -C remote_dpdk_build_dir
+
+ The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
+ environment variable configure the timeout of DPDK build.
+
+ Args:
+ env_vars: Use these environment variables when building DPDK.
+ meson_args: Use these meson arguments when building DPDK.
+ remote_dpdk_dir: The directory on the remote node where DPDK will be built.
+ remote_dpdk_build_dir: The target build directory on the remote node.
+ rebuild: If :data:`True`, do a subsequent build with ``meson configure`` instead
+ of ``meson setup``.
+ timeout: Wait at most this long in seconds for the build to execute.
"""
@abstractmethod
def get_dpdk_version(self, version_path: str | PurePath) -> str:
- """
- Inspect DPDK version on the remote node from version_path.
+ """Inspect the DPDK version on the remote node.
+
+ Args:
+ version_path: The path to the VERSION file containing the DPDK version.
+
+ Returns:
+ The DPDK version.
"""
@abstractmethod
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
- """
- Compose a list of LogicalCores present on the remote node.
- If use_first_core is False, the first physical core won't be used.
+ r"""Get the list of :class:`~framework.testbed_model.cpu.LogicalCore`\s on the remote node.
+
+ Args:
+ use_first_core: If :data:`False`, the first physical core won't be used.
+
+ Returns:
+ The logical cores present on the node.
"""
@abstractmethod
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
- """
- Kill and cleanup all DPDK apps identified by dpdk_prefix_list. If
- dpdk_prefix_list is empty, attempt to find running DPDK apps to kill and clean.
+ """Kill and cleanup all DPDK apps.
+
+ Args:
+ dpdk_prefix_list: Kill all apps identified by `dpdk_prefix_list`.
+ If `dpdk_prefix_list` is empty, attempt to find running DPDK apps to kill and clean.
"""
@abstractmethod
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
- """
- Get the DPDK file prefix that will be used when running DPDK apps.
+ """Make OS-specific modification to the DPDK file prefix.
+
+ Args:
+ dpdk_prefix: The OS-unaware file prefix.
+
+ Returns:
+ The OS-specific file prefix.
"""
@abstractmethod
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
- """
- Get the node's Hugepage Size, configure the specified amount of hugepages
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Configure hugepages on the node.
+
+ Get the node's Hugepage Size, configure the specified count of hugepages
if needed and mount the hugepages if needed.
- If force_first_numa is True, configure hugepages just on the first socket.
+
+ Args:
+ hugepage_count: Configure this many hugepages.
+ force_first_numa: If :data:`True`, configure hugepages just on the first socket.
"""
@abstractmethod
def get_compiler_version(self, compiler_name: str) -> str:
- """
- Get installed version of compiler used for DPDK
+ """Get installed version of compiler used for DPDK.
+
+ Args:
+ compiler_name: The name of the compiler executable.
+
+ Returns:
+ The compiler's version.
"""
@abstractmethod
def get_node_info(self) -> NodeInfo:
- """
- Collect information about the node
+ """Collect additional information about the node.
+
+ Returns:
+ Node information.
"""
@abstractmethod
def update_ports(self, ports: list[Port]) -> None:
- """
- Get additional information about ports:
- Logical name (e.g. enp7s0) if applicable
- Mac address
+ """Get additional information about ports from the operating system and update them.
+
+ The additional information is:
+
+ * Logical name (e.g. ``enp7s0``) if applicable,
+ * Mac address.
+
+ Args:
+ ports: The ports to update.
"""
@abstractmethod
def configure_port_state(self, port: Port, enable: bool) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port` in the operating system.
+
+ Args:
+ port: The port to configure.
+ enable: If :data:`True`, enable the port, otherwise shut it down.
"""
@abstractmethod
@@ -273,12 +408,18 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
- """
- Configure (add or delete) an IP address of the input port.
+ """Configure an IP address on `port` in the operating system.
+
+ Args:
+ address: The address to configure.
+ port: The port to configure.
+ delete: If :data:`True`, remove the IP address, otherwise configure it.
"""
@abstractmethod
def configure_ipv4_forwarding(self, enable: bool) -> None:
- """
- Enable IPv4 forwarding in the underlying OS.
+ """Enable IPv4 forwarding in the operating system.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
"""
--
2.34.1
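The module docstring's example — an OS-unaware node delegating directory removal to its OS-aware session — can be sketched with minimal stand-in classes. `MiniLinuxSession` and `MiniNode` below are hypothetical illustrations of the delegation pattern, not the DTS API; the real `OSSession.remove_remote_dir` executes the command remotely instead of returning it:

```python
from pathlib import PurePosixPath

class MiniLinuxSession:
    """Hypothetical, minimal stand-in for a Linux OSSession subclass."""

    def remove_remote_dir(self, remote_dir_path, recursive=True, force=True):
        # translate the OS-unaware request into a Linux command
        flags = ("r" if recursive else "") + ("f" if force else "")
        cmd = f"rm -{flags}" if flags else "rm"
        # translate the path to match the underlying OS
        return f"{cmd} {PurePosixPath(remote_dir_path)}"

class MiniNode:
    """Hypothetical OS-unaware node; only main_session knows the OS."""

    def __init__(self, main_session):
        self.main_session = main_session

    def remove_dir(self, path):
        # the node delegates all OS translation to its session
        return self.main_session.remove_remote_dir(path)

node = MiniNode(MiniLinuxSession())
node.remove_dir("/tmp/dpdk")  # -> "rm -rf /tmp/dpdk"
```

A BSD or Windows session would plug into the same `MiniNode` unchanged, which is the point of keeping the upper layers OS-unaware.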
* Re: [PATCH v7 15/21] dts: os session docstring update
2023-11-15 13:09 ` [PATCH v7 15/21] dts: os session " Juraj Linkeš
@ 2023-11-22 11:50 ` Yoan Picchi
2023-11-22 13:27 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-22 11:50 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/testbed_model/os_session.py | 275 ++++++++++++++++------
> 1 file changed, 208 insertions(+), 67 deletions(-)
>
> diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
> index 76e595a518..72b9193a61 100644
> --- a/dts/framework/testbed_model/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -2,6 +2,29 @@
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2023 University of New Hampshire
>
> +"""OS-aware remote session.
> +
> +DPDK supports multiple different operating systems, meaning it can run on these different operating
> +systems. This module defines the common API that OS-unaware layers use and translates the API into
> +OS-aware calls/utility usage.
> +
> +Note:
> + Running commands with administrative privileges requires OS awareness. This is the only layer
> + that's aware of OS differences, so this is where non-privileged commands get converted
> + to privileged commands.
> +
> +Example:
> + A user wishes to remove a directory on
> + a remote :class:`~framework.testbed_model.sut_node.SutNode`.
> + The :class:`~framework.testbed_model.sut_node.SutNode` object isn't aware what OS the node
> + is running - it delegates the OS translation logic
> + to :attr:`~framework.testbed_model.node.Node.main_session`. The SUT node calls
> + :meth:`~OSSession.remove_remote_dir` with a generic, OS-unaware path and
> + the :attr:`~framework.testbed_model.node.Node.main_session` translates that
> + to ``rm -rf`` if the node's OS is Linux and other commands for other OSs.
> + It also translates the path to match the underlying OS.
> +"""
> +
> from abc import ABC, abstractmethod
> from collections.abc import Iterable
> from ipaddress import IPv4Interface, IPv6Interface
> @@ -28,10 +51,16 @@
>
>
> class OSSession(ABC):
> - """
> - The OS classes create a DTS node remote session and implement OS specific
> + """OS-unaware to OS-aware translation API definition.
> +
> + The OSSession classes create a remote session to a DTS node and implement OS specific
> behavior. There are a few control methods implemented by the base class, the rest need
> - to be implemented by derived classes.
> + to be implemented by subclasses.
> +
> + Attributes:
> + name: The name of the session.
> + remote_session: The remote session maintaining the connection to the node.
> + interactive_session: The interactive remote session maintaining the connection to the node.
> """
>
> _config: NodeConfiguration
> @@ -46,6 +75,15 @@ def __init__(
> name: str,
> logger: DTSLOG,
> ):
> + """Initialize the OS-aware session.
> +
> + Connect to the node right away and also create an interactive remote session.
> +
> + Args:
> + node_config: The test run configuration of the node to connect to.
> + name: The name of the session.
> + logger: The logger instance this session will use.
> + """
> self._config = node_config
> self.name = name
> self._logger = logger
> @@ -53,15 +91,15 @@ def __init__(
> self.interactive_session = create_interactive_session(node_config, logger)
>
> def close(self, force: bool = False) -> None:
> - """
> - Close the remote session.
> + """Close the underlying remote session.
> +
> + Args:
> + force: Force the closure of the connection.
> """
> self.remote_session.close(force)
>
> def is_alive(self) -> bool:
> - """
> - Check whether the remote session is still responding.
> - """
> + """Check whether the underlying remote session is still responding."""
> return self.remote_session.is_alive()
>
> def send_command(
> @@ -72,10 +110,23 @@ def send_command(
> verify: bool = False,
> env: dict | None = None,
> ) -> CommandResult:
> - """
> - An all-purpose API in case the command to be executed is already
> - OS-agnostic, such as when the path to the executed command has been
> - constructed beforehand.
> + """An all-purpose API for OS-agnostic commands.
> +
> + This can be used for the execution of a portable command that's run the same way
> + on all operating systems, such as Python.
> +
> + The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
> + environment variable configure the timeout of command execution.
> +
> + Args:
> + command: The command to execute.
> + timeout: Wait at most this long in seconds to execute the command.
confusing start/end of execution
> + privileged: Whether to run the command with administrative privileges.
> + verify: If :data:`True`, will check the exit code of the command.
> + env: A dictionary with environment variables to be used with the command execution.
> +
> + Raises:
> + RemoteCommandExecutionError: If verify is :data:`True` and the command failed.
> """
> if privileged:
> command = self._get_privileged_command(command)
> @@ -89,8 +140,20 @@ def create_interactive_shell(
> privileged: bool,
> app_args: str,
> ) -> InteractiveShellType:
> - """
> - See "create_interactive_shell" in SutNode
> + """Factory for interactive session handlers.
> +
> + Instantiate `shell_cls` according to the remote OS specifics.
> +
> + Args:
> + shell_cls: The class of the shell.
> + timeout: Timeout for reading output from the SSH channel. If you are
> + reading from the buffer and don't receive any data within the timeout
> + it will throw an error.
> + privileged: Whether to run the shell with administrative privileges.
> + app_args: The arguments to be passed to the application.
> +
> + Returns:
> + An instance of the desired interactive application shell.
> """
> return shell_cls(
> self.interactive_session.session,
> @@ -114,27 +177,42 @@ def _get_privileged_command(command: str) -> str:
>
> @abstractmethod
> def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
> - """
> - Try to find DPDK remote dir in remote_dir.
> + """Try to find DPDK directory in `remote_dir`.
> +
> + The directory is the one which is created after the extraction of the tarball. The files
> + are usually extracted into a directory starting with ``dpdk-``.
> +
> + Returns:
> + The absolute path of the DPDK remote directory, empty path if not found.
> """
>
> @abstractmethod
> def get_remote_tmp_dir(self) -> PurePath:
> - """
> - Get the path of the temporary directory of the remote OS.
> + """Get the path of the temporary directory of the remote OS.
> +
> + Returns:
> + The absolute path of the temporary directory.
> """
>
> @abstractmethod
> def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
> - """
> - Create extra environment variables needed for the target architecture. Get
> - information from the node if needed.
> + """Create extra environment variables needed for the target architecture.
> +
> + Different architectures may require different configuration, such as setting 32-bit CFLAGS.
> +
> + Returns:
> + A dictionary with keys as environment variables.
> """
>
> @abstractmethod
> def join_remote_path(self, *args: str | PurePath) -> PurePath:
> - """
> - Join path parts using the path separator that fits the remote OS.
> + """Join path parts using the path separator that fits the remote OS.
> +
> + Args:
> + args: Any number of paths to join.
> +
> + Returns:
> + The resulting joined path.
> """
>
> @abstractmethod
> @@ -143,13 +221,13 @@ def copy_from(
> source_file: str | PurePath,
> destination_file: str | PurePath,
> ) -> None:
> - """Copy a file from the remote Node to the local filesystem.
> + """Copy a file from the remote node to the local filesystem.
>
> - Copy source_file from the remote Node associated with this remote
> - session to destination_file on the local filesystem.
> + Copy `source_file` from the remote node associated with this remote
> + session to `destination_file` on the local filesystem.
>
> Args:
> - source_file: the file on the remote Node.
> + source_file: the file on the remote node.
> destination_file: a file or directory path on the local filesystem.
> """
>
> @@ -159,14 +237,14 @@ def copy_to(
> source_file: str | PurePath,
> destination_file: str | PurePath,
> ) -> None:
> - """Copy a file from local filesystem to the remote Node.
> + """Copy a file from local filesystem to the remote node.
>
> - Copy source_file from local filesystem to destination_file
> - on the remote Node associated with this remote session.
> + Copy `source_file` from local filesystem to `destination_file`
> + on the remote node associated with this remote session.
>
> Args:
> source_file: the file on the local filesystem.
> - destination_file: a file or directory path on the remote Node.
> + destination_file: a file or directory path on the remote node.
> """
>
> @abstractmethod
> @@ -176,8 +254,12 @@ def remove_remote_dir(
> recursive: bool = True,
> force: bool = True,
> ) -> None:
> - """
> - Remove remote directory, by default remove recursively and forcefully.
> + """Remove remote directory, by default remove recursively and forcefully.
> +
> + Args:
> + remote_dir_path: The path of the directory to remove.
> + recursive: If :data:`True`, also remove all contents inside the directory.
> + force: If :data:`True`, ignore all warnings and try to remove at all costs.
> """
>
> @abstractmethod
> @@ -186,9 +268,12 @@ def extract_remote_tarball(
> remote_tarball_path: str | PurePath,
> expected_dir: str | PurePath | None = None,
> ) -> None:
> - """
> - Extract remote tarball in place. If expected_dir is a non-empty string, check
> - whether the dir exists after extracting the archive.
> + """Extract remote tarball in its remote directory.
> +
> + Args:
> + remote_tarball_path: The path of the tarball on the remote node.
> + expected_dir: If non-empty, check whether `expected_dir` exists after extracting
> + the archive.
> """
>
> @abstractmethod
> @@ -201,69 +286,119 @@ def build_dpdk(
> rebuild: bool = False,
> timeout: float = SETTINGS.compile_timeout,
> ) -> None:
> - """
> - Build DPDK in the input dir with specified environment variables and meson
> - arguments.
> + """Build DPDK on the remote node.
> +
> + An extracted DPDK tarball must be present on the node. The build consists of two steps::
> +
> + meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
> + ninja -C remote_dpdk_build_dir
> +
> + The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
> + environment variable configure the timeout of DPDK build.
> +
> + Args:
> + env_vars: Use these environment variables when building DPDK.
> + meson_args: Use these meson arguments when building DPDK.
> + remote_dpdk_dir: The directory on the remote node where DPDK will be built.
> + remote_dpdk_build_dir: The target build directory on the remote node.
> + rebuild: If :data:`True`, do a subsequent build with ``meson configure`` instead
> + of ``meson setup``.
> + timeout: Wait at most this long in seconds for the build to execute.
confusing start/end of execution
> """
>
> @abstractmethod
> def get_dpdk_version(self, version_path: str | PurePath) -> str:
> - """
> - Inspect DPDK version on the remote node from version_path.
> + """Inspect the DPDK version on the remote node.
> +
> + Args:
> + version_path: The path to the VERSION file containing the DPDK version.
> +
> + Returns:
> + The DPDK version.
> """
>
> @abstractmethod
> def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> - """
> - Compose a list of LogicalCores present on the remote node.
> - If use_first_core is False, the first physical core won't be used.
> + r"""Get the list of :class:`~framework.testbed_model.cpu.LogicalCore`\s on the remote node.
> +
> + Args:
> + use_first_core: If :data:`False`, the first physical core won't be used.
> +
> + Returns:
> + The logical cores present on the node.
> """
>
> @abstractmethod
> def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> - """
> - Kill and cleanup all DPDK apps identified by dpdk_prefix_list. If
> - dpdk_prefix_list is empty, attempt to find running DPDK apps to kill and clean.
> + """Kill and cleanup all DPDK apps.
> +
> + Args:
> + dpdk_prefix_list: Kill all apps identified by `dpdk_prefix_list`.
> + If `dpdk_prefix_list` is empty, attempt to find running DPDK apps to kill and clean.
> """
>
> @abstractmethod
> def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> - """
> - Get the DPDK file prefix that will be used when running DPDK apps.
> + """Make OS-specific modification to the DPDK file prefix.
> +
> + Args:
> + dpdk_prefix: The OS-unaware file prefix.
> +
> + Returns:
> + The OS-specific file prefix.
> """
>
> @abstractmethod
> - def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> - """
> - Get the node's Hugepage Size, configure the specified amount of hugepages
> + def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
> + """Configure hugepages on the node.
> +
> + Get the node's Hugepage Size, configure the specified count of hugepages
> if needed and mount the hugepages if needed.
> - If force_first_numa is True, configure hugepages just on the first socket.
> +
> + Args:
> + hugepage_count: Configure this many hugepages.
> + force_first_numa: If :data:`True`, configure hugepages just on the first socket.
force *numa* configures the first *socket* ?
> """
>
> @abstractmethod
> def get_compiler_version(self, compiler_name: str) -> str:
> - """
> - Get installed version of compiler used for DPDK
> + """Get installed version of compiler used for DPDK.
> +
> + Args:
> + compiler_name: The name of the compiler executable.
> +
> + Returns:
> + The compiler's version.
> """
>
> @abstractmethod
> def get_node_info(self) -> NodeInfo:
> - """
> - Collect information about the node
> + """Collect additional information about the node.
> +
> + Returns:
> + Node information.
> """
>
> @abstractmethod
> def update_ports(self, ports: list[Port]) -> None:
> - """
> - Get additional information about ports:
> - Logical name (e.g. enp7s0) if applicable
> - Mac address
> + """Get additional information about ports from the operating system and update them.
> +
> + The additional information is:
> +
> + * Logical name (e.g. ``enp7s0``) if applicable,
> + * Mac address.
> +
> + Args:
> + ports: The ports to update.
> """
>
> @abstractmethod
> def configure_port_state(self, port: Port, enable: bool) -> None:
> - """
> - Enable/disable port.
> + """Enable/disable `port` in the operating system.
> +
> + Args:
> + port: The port to configure.
> + enable: If :data:`True`, enable the port, otherwise shut it down.
> """
>
> @abstractmethod
> @@ -273,12 +408,18 @@ def configure_port_ip_address(
> port: Port,
> delete: bool,
> ) -> None:
> - """
> - Configure (add or delete) an IP address of the input port.
> + """Configure an IP address on `port` in the operating system.
> +
> + Args:
> + address: The address to configure.
> + port: The port to configure.
> + delete: If :data:`True`, remove the IP address, otherwise configure it.
> """
>
> @abstractmethod
> def configure_ipv4_forwarding(self, enable: bool) -> None:
> - """
> - Enable IPv4 forwarding in the underlying OS.
> + """Enable IPv4 forwarding in the operating system.
> +
> + Args:
> + enable: If :data:`True`, enable the forwarding, otherwise disable it.
> """
^ permalink raw reply [flat|nested] 393+ messages in thread
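The enable/disable contract of `configure_port_state` quoted above can be sketched as a Linux translation (a minimal illustration; the ``ip link`` command mirrors the LinuxSession implementation posted later in this series, while the `Port` stand-in and the command recording are invented for the example):

```python
from dataclasses import dataclass


@dataclass
class Port:
    """Stand-in for the framework's Port; only the field used here."""

    logical_name: str


class LinuxSessionSketch:
    """Records commands instead of sending them over a remote session."""

    def __init__(self) -> None:
        self.sent_commands: list[str] = []

    def send_command(self, command: str, privileged: bool = False) -> None:
        # A real session would execute this on the remote node.
        self.sent_commands.append(command)

    def configure_port_state(self, port: Port, enable: bool) -> None:
        # The OS-unaware caller passes a bool; the OS-aware layer
        # translates it into the Linux `ip link` utility.
        state = "up" if enable else "down"
        self.send_command(f"ip link set dev {port.logical_name} {state}", privileged=True)


session = LinuxSessionSketch()
session.configure_port_state(Port("enp8s0"), enable=True)
print(session.sent_commands[0])  # ip link set dev enp8s0 up
```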
* Re: [PATCH v7 15/21] dts: os session docstring update
2023-11-22 11:50 ` Yoan Picchi
@ 2023-11-22 13:27 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 13:27 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Wed, Nov 22, 2023 at 12:50 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/testbed_model/os_session.py | 275 ++++++++++++++++------
> > 1 file changed, 208 insertions(+), 67 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
> > index 76e595a518..72b9193a61 100644
> > --- a/dts/framework/testbed_model/os_session.py
> > +++ b/dts/framework/testbed_model/os_session.py
> > @@ -2,6 +2,29 @@
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2023 University of New Hampshire
> >
> > +"""OS-aware remote session.
> > +
> > +DPDK supports multiple operating systems, so DTS must be able to control nodes running any
> > +of them. This module defines the common API that OS-unaware layers use and translates the API into
> > +OS-aware calls/utility usage.
> > +
> > +Note:
> > + Running commands with administrative privileges requires OS awareness. This is the only layer
> > + that's aware of OS differences, so this is where non-privileged command get converted
> > + to privileged commands.
> > +
> > +Example:
> > + A user wishes to remove a directory on
> > + a remote :class:`~framework.testbed_model.sut_node.SutNode`.
> > + The :class:`~framework.testbed_model.sut_node.SutNode` object isn't aware what OS the node
> > + is running - it delegates the OS translation logic
> > + to :attr:`~framework.testbed_model.node.Node.main_session`. The SUT node calls
> > + :meth:`~OSSession.remove_remote_dir` with a generic, OS-unaware path and
> > + the :attr:`~framework.testbed_model.node.Node.main_session` translates that
> > + to ``rm -rf`` if the node's OS is Linux and other commands for other OSs.
> > + It also translates the path to match the underlying OS.
> > +"""
> > +
> > from abc import ABC, abstractmethod
> > from collections.abc import Iterable
> > from ipaddress import IPv4Interface, IPv6Interface
> > @@ -28,10 +51,16 @@
> >
> >
> > class OSSession(ABC):
> > - """
> > - The OS classes create a DTS node remote session and implement OS specific
> > + """OS-unaware to OS-aware translation API definition.
> > +
> > + The OSSession classes create a remote session to a DTS node and implement OS specific
> > behavior. There are a few control methods implemented by the base class, the rest need
> > - to be implemented by derived classes.
> > + to be implemented by subclasses.
> > +
> > + Attributes:
> > + name: The name of the session.
> > + remote_session: The remote session maintaining the connection to the node.
> > + interactive_session: The interactive remote session maintaining the connection to the node.
> > """
> >
> > _config: NodeConfiguration
> > @@ -46,6 +75,15 @@ def __init__(
> > name: str,
> > logger: DTSLOG,
> > ):
> > + """Initialize the OS-aware session.
> > +
> > + Connect to the node right away and also create an interactive remote session.
> > +
> > + Args:
> > + node_config: The test run configuration of the node to connect to.
> > + name: The name of the session.
> > + logger: The logger instance this session will use.
> > + """
> > self._config = node_config
> > self.name = name
> > self._logger = logger
> > @@ -53,15 +91,15 @@ def __init__(
> > self.interactive_session = create_interactive_session(node_config, logger)
> >
> > def close(self, force: bool = False) -> None:
> > - """
> > - Close the remote session.
> > + """Close the underlying remote session.
> > +
> > + Args:
> > + force: Force the closure of the connection.
> > """
> > self.remote_session.close(force)
> >
> > def is_alive(self) -> bool:
> > - """
> > - Check whether the remote session is still responding.
> > - """
> > + """Check whether the underlying remote session is still responding."""
> > return self.remote_session.is_alive()
> >
> > def send_command(
> > @@ -72,10 +110,23 @@ def send_command(
> > verify: bool = False,
> > env: dict | None = None,
> > ) -> CommandResult:
> > - """
> > - An all-purpose API in case the command to be executed is already
> > - OS-agnostic, such as when the path to the executed command has been
> > - constructed beforehand.
> > + """An all-purpose API for OS-agnostic commands.
> > +
> > + This can be used to execute a portable command that runs the same way
> > + on all operating systems, such as Python.
> > +
> > + The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
> > + environment variable configure the timeout of command execution.
> > +
> > + Args:
> > + command: The command to execute.
> > + timeout: Wait at most this long in seconds to execute the command.
>
> confusing start/end of execution
>
Ack.
> > + privileged: Whether to run the command with administrative privileges.
> > + verify: If :data:`True`, will check the exit code of the command.
> > + env: A dictionary with environment variables to be used with the command execution.
> > +
> > + Raises:
> > + RemoteCommandExecutionError: If verify is :data:`True` and the command failed.
> > """
> > if privileged:
> > command = self._get_privileged_command(command)
> > @@ -89,8 +140,20 @@ def create_interactive_shell(
> > privileged: bool,
> > app_args: str,
> > ) -> InteractiveShellType:
> > - """
> > - See "create_interactive_shell" in SutNode
> > + """Factory for interactive session handlers.
> > +
> > + Instantiate `shell_cls` according to the remote OS specifics.
> > +
> > + Args:
> > + shell_cls: The class of the shell.
> > + timeout: Timeout for reading output from the SSH channel. If you are
> > + reading from the buffer and don't receive any data within the timeout
> > + it will throw an error.
> > + privileged: Whether to run the shell with administrative privileges.
> > + app_args: The arguments to be passed to the application.
> > +
> > + Returns:
> > + An instance of the desired interactive application shell.
> > """
> > return shell_cls(
> > self.interactive_session.session,
> > @@ -114,27 +177,42 @@ def _get_privileged_command(command: str) -> str:
> >
> > @abstractmethod
> > def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
> > - """
> > - Try to find DPDK remote dir in remote_dir.
> > + """Try to find DPDK directory in `remote_dir`.
> > +
> > + The directory is the one which is created after the extraction of the tarball. The files
> > + are usually extracted into a directory starting with ``dpdk-``.
> > +
> > + Returns:
> > + The absolute path of the DPDK remote directory, empty path if not found.
> > """
> >
> > @abstractmethod
> > def get_remote_tmp_dir(self) -> PurePath:
> > - """
> > - Get the path of the temporary directory of the remote OS.
> > + """Get the path of the temporary directory of the remote OS.
> > +
> > + Returns:
> > + The absolute path of the temporary directory.
> > """
> >
> > @abstractmethod
> > def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
> > - """
> > - Create extra environment variables needed for the target architecture. Get
> > - information from the node if needed.
> > + """Create extra environment variables needed for the target architecture.
> > +
> > + Different architectures may require different configuration, such as setting 32-bit CFLAGS.
> > +
> > + Returns:
> > + A dictionary of environment variables and their values.
> > """
> >
> > @abstractmethod
> > def join_remote_path(self, *args: str | PurePath) -> PurePath:
> > - """
> > - Join path parts using the path separator that fits the remote OS.
> > + """Join path parts using the path separator that fits the remote OS.
> > +
> > + Args:
> > + args: Any number of paths to join.
> > +
> > + Returns:
> > + The resulting joined path.
> > """
> >
> > @abstractmethod
> > @@ -143,13 +221,13 @@ def copy_from(
> > source_file: str | PurePath,
> > destination_file: str | PurePath,
> > ) -> None:
> > - """Copy a file from the remote Node to the local filesystem.
> > + """Copy a file from the remote node to the local filesystem.
> >
> > - Copy source_file from the remote Node associated with this remote
> > - session to destination_file on the local filesystem.
> > + Copy `source_file` from the remote node associated with this remote
> > + session to `destination_file` on the local filesystem.
> >
> > Args:
> > - source_file: the file on the remote Node.
> > + source_file: the file on the remote node.
> > destination_file: a file or directory path on the local filesystem.
> > """
> >
> > @@ -159,14 +237,14 @@ def copy_to(
> > source_file: str | PurePath,
> > destination_file: str | PurePath,
> > ) -> None:
> > - """Copy a file from local filesystem to the remote Node.
> > + """Copy a file from local filesystem to the remote node.
> >
> > - Copy source_file from local filesystem to destination_file
> > - on the remote Node associated with this remote session.
> > + Copy `source_file` from local filesystem to `destination_file`
> > + on the remote node associated with this remote session.
> >
> > Args:
> > source_file: the file on the local filesystem.
> > - destination_file: a file or directory path on the remote Node.
> > + destination_file: a file or directory path on the remote node.
> > """
> >
> > @abstractmethod
> > @@ -176,8 +254,12 @@ def remove_remote_dir(
> > recursive: bool = True,
> > force: bool = True,
> > ) -> None:
> > - """
> > - Remove remote directory, by default remove recursively and forcefully.
> > + """Remove remote directory, by default remove recursively and forcefully.
> > +
> > + Args:
> > + remote_dir_path: The path of the directory to remove.
> > + recursive: If :data:`True`, also remove all contents inside the directory.
> > + force: If :data:`True`, ignore all warnings and try to remove at all costs.
> > """
> >
> > @abstractmethod
> > @@ -186,9 +268,12 @@ def extract_remote_tarball(
> > remote_tarball_path: str | PurePath,
> > expected_dir: str | PurePath | None = None,
> > ) -> None:
> > - """
> > - Extract remote tarball in place. If expected_dir is a non-empty string, check
> > - whether the dir exists after extracting the archive.
> > + """Extract remote tarball in its remote directory.
> > +
> > + Args:
> > + remote_tarball_path: The path of the tarball on the remote node.
> > + expected_dir: If non-empty, check whether `expected_dir` exists after extracting
> > + the archive.
> > """
> >
> > @abstractmethod
> > @@ -201,69 +286,119 @@ def build_dpdk(
> > rebuild: bool = False,
> > timeout: float = SETTINGS.compile_timeout,
> > ) -> None:
> > - """
> > - Build DPDK in the input dir with specified environment variables and meson
> > - arguments.
> > + """Build DPDK on the remote node.
> > +
> > + An extracted DPDK tarball must be present on the node. The build consists of two steps::
> > +
> > + meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
> > + ninja -C remote_dpdk_build_dir
> > +
> > + The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
> > + environment variable configure the timeout of DPDK build.
> > +
> > + Args:
> > + env_vars: Use these environment variables when building DPDK.
> > + meson_args: Use these meson arguments when building DPDK.
> > + remote_dpdk_dir: The directory on the remote node where DPDK will be built.
> > + remote_dpdk_build_dir: The target build directory on the remote node.
> > + rebuild: If :data:`True`, do a subsequent build with ``meson configure`` instead
> > + of ``meson setup``.
> > + timeout: Wait at most this long in seconds for the build to execute.
>
> confusing start/end of execution
>
Ack.
> > """
> >
> > @abstractmethod
> > def get_dpdk_version(self, version_path: str | PurePath) -> str:
> > - """
> > - Inspect DPDK version on the remote node from version_path.
> > + """Inspect the DPDK version on the remote node.
> > +
> > + Args:
> > + version_path: The path to the VERSION file containing the DPDK version.
> > +
> > + Returns:
> > + The DPDK version.
> > """
> >
> > @abstractmethod
> > def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> > - """
> > - Compose a list of LogicalCores present on the remote node.
> > - If use_first_core is False, the first physical core won't be used.
> > + r"""Get the list of :class:`~framework.testbed_model.cpu.LogicalCore`\s on the remote node.
> > +
> > + Args:
> > + use_first_core: If :data:`False`, the first physical core won't be used.
> > +
> > + Returns:
> > + The logical cores present on the node.
> > """
> >
> > @abstractmethod
> > def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> > - """
> > - Kill and cleanup all DPDK apps identified by dpdk_prefix_list. If
> > - dpdk_prefix_list is empty, attempt to find running DPDK apps to kill and clean.
> > + """Kill and cleanup all DPDK apps.
> > +
> > + Args:
> > + dpdk_prefix_list: Kill all apps identified by `dpdk_prefix_list`.
> > + If `dpdk_prefix_list` is empty, attempt to find running DPDK apps to kill and clean.
> > """
> >
> > @abstractmethod
> > def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> > - """
> > - Get the DPDK file prefix that will be used when running DPDK apps.
> > + """Make OS-specific modification to the DPDK file prefix.
> > +
> > + Args:
> > + dpdk_prefix: The OS-unaware file prefix.
> > +
> > + Returns:
> > + The OS-specific file prefix.
> > """
> >
> > @abstractmethod
> > - def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> > - """
> > - Get the node's Hugepage Size, configure the specified amount of hugepages
> > + def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
> > + """Configure hugepages on the node.
> > +
> > + Get the node's Hugepage Size, configure the specified count of hugepages
> > if needed and mount the hugepages if needed.
> > - If force_first_numa is True, configure hugepages just on the first socket.
> > +
> > + Args:
> > + hugepage_count: Configure this many hugepages.
> > + force_first_numa: If :data:`True`, configure hugepages just on the first socket.
>
> force *numa* configures the first *socket* ?
>
Good catch, should be numa node, not socket.
> > """
> >
> > @abstractmethod
> > def get_compiler_version(self, compiler_name: str) -> str:
> > - """
> > - Get installed version of compiler used for DPDK
> > + """Get installed version of compiler used for DPDK.
> > +
> > + Args:
> > + compiler_name: The name of the compiler executable.
> > +
> > + Returns:
> > + The compiler's version.
> > """
> >
> > @abstractmethod
> > def get_node_info(self) -> NodeInfo:
> > - """
> > - Collect information about the node
> > + """Collect additional information about the node.
> > +
> > + Returns:
> > + Node information.
> > """
> >
> > @abstractmethod
> > def update_ports(self, ports: list[Port]) -> None:
> > - """
> > - Get additional information about ports:
> > - Logical name (e.g. enp7s0) if applicable
> > - Mac address
> > + """Get additional information about ports from the operating system and update them.
> > +
> > + The additional information is:
> > +
> > + * Logical name (e.g. ``enp7s0``) if applicable,
> > + * MAC address.
> > +
> > + Args:
> > + ports: The ports to update.
> > """
> >
> > @abstractmethod
> > def configure_port_state(self, port: Port, enable: bool) -> None:
> > - """
> > - Enable/disable port.
> > + """Enable/disable `port` in the operating system.
> > +
> > + Args:
> > + port: The port to configure.
> > + enable: If :data:`True`, enable the port, otherwise shut it down.
> > """
> >
> > @abstractmethod
> > @@ -273,12 +408,18 @@ def configure_port_ip_address(
> > port: Port,
> > delete: bool,
> > ) -> None:
> > - """
> > - Configure (add or delete) an IP address of the input port.
> > + """Configure an IP address on `port` in the operating system.
> > +
> > + Args:
> > + address: The address to configure.
> > + port: The port to configure.
> > + delete: If :data:`True`, remove the IP address, otherwise configure it.
> > """
> >
> > @abstractmethod
> > def configure_ipv4_forwarding(self, enable: bool) -> None:
> > - """
> > - Enable IPv4 forwarding in the underlying OS.
> > + """Enable IPv4 forwarding in the operating system.
> > +
> > + Args:
> > + enable: If :data:`True`, enable the forwarding, otherwise disable it.
> > """
>
* [PATCH v7 16/21] dts: posix and linux sessions docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (14 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 15/21] dts: os session " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-22 13:24 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 17/21] dts: node " Juraj Linkeš
` (5 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/linux_session.py | 63 ++++++++++-----
dts/framework/testbed_model/posix_session.py | 81 +++++++++++++++++---
2 files changed, 113 insertions(+), 31 deletions(-)
diff --git a/dts/framework/testbed_model/linux_session.py b/dts/framework/testbed_model/linux_session.py
index f472bb8f0f..279954ff63 100644
--- a/dts/framework/testbed_model/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -2,6 +2,13 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""Linux OS translator.
+
+Translate OS-unaware calls into Linux calls/utilities. Most Linux distributions are mostly
+compliant with POSIX standards, so this module only implements the parts that aren't.
+"""
+
import json
from ipaddress import IPv4Interface, IPv6Interface
from typing import TypedDict, Union
@@ -17,43 +24,51 @@
class LshwConfigurationOutput(TypedDict):
+ """The relevant parts of ``lshw``'s ``configuration`` section."""
+
+ #:
link: str
class LshwOutput(TypedDict):
- """
- A model of the relevant information from json lshw output, e.g.:
- {
- ...
- "businfo" : "pci@0000:08:00.0",
- "logicalname" : "enp8s0",
- "version" : "00",
- "serial" : "52:54:00:59:e1:ac",
- ...
- "configuration" : {
- ...
- "link" : "yes",
- ...
- },
- ...
+ """A model of the relevant information from ``lshw``'s json output.
+
+ e.g.::
+
+ {
+ ...
+ "businfo" : "pci@0000:08:00.0",
+ "logicalname" : "enp8s0",
+ "version" : "00",
+ "serial" : "52:54:00:59:e1:ac",
+ ...
+ "configuration" : {
+ ...
+ "link" : "yes",
+ ...
+ },
+ ...
"""
+ #:
businfo: str
+ #:
logicalname: NotRequired[str]
+ #:
serial: NotRequired[str]
+ #:
configuration: LshwConfigurationOutput
class LinuxSession(PosixSession):
- """
- The implementation of non-Posix compliant parts of Linux remote sessions.
- """
+ """The implementation of non-Posix compliant parts of Linux."""
@staticmethod
def _get_privileged_command(command: str) -> str:
return f"sudo -- sh -c '{command}'"
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_cpus`."""
cpu_info = self.send_command("lscpu -p=CPU,CORE,SOCKET,NODE|grep -v \\#").stdout
lcores = []
for cpu_line in cpu_info.splitlines():
@@ -65,18 +80,20 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
return lcores
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return dpdk_prefix
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.setup_hugepages`."""
self._logger.info("Getting Hugepage information.")
hugepage_size = self._get_hugepage_size()
hugepages_total = self._get_hugepages_total()
self._numa_nodes = self._get_numa_nodes()
- if force_first_numa or hugepages_total != hugepage_amount:
+ if force_first_numa or hugepages_total != hugepage_count:
# when forcing numa, we need to clear existing hugepages regardless
# of size, so they can be moved to the first numa node
- self._configure_huge_pages(hugepage_amount, hugepage_size, force_first_numa)
+ self._configure_huge_pages(hugepage_count, hugepage_size, force_first_numa)
else:
self._logger.info("Hugepages already configured.")
self._mount_huge_pages()
@@ -140,6 +157,7 @@ def _configure_huge_pages(
)
def update_ports(self, ports: list[Port]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.update_ports`."""
self._logger.debug("Gathering port info.")
for port in ports:
assert (
@@ -178,6 +196,7 @@ def _update_port_attr(
)
def configure_port_state(self, port: Port, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_state`."""
state = "up" if enable else "down"
self.send_command(
f"ip link set dev {port.logical_name} {state}", privileged=True
@@ -189,6 +208,7 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_ip_address`."""
command = "del" if delete else "add"
self.send_command(
f"ip address {command} {address} dev {port.logical_name}",
@@ -197,5 +217,6 @@ def configure_port_ip_address(
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_ipv4_forwarding`."""
state = 1 if enable else 0
self.send_command(f"sysctl -w net.ipv4.ip_forward={state}", privileged=True)
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 1d1d5b1b26..a4824aa274 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -2,6 +2,15 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""POSIX compliant OS translator.
+
+Translates OS-unaware calls into POSIX compliant calls/utilities. POSIX is a set of standards
+for portability between Unix operating systems which not all Linux distributions
+(or the tools most frequently bundled with said distributions) adhere to. Most Linux
+distributions are mostly compliant, though.
+This intermediate module implements the common parts of mostly POSIX compliant distributions.
+"""
+
import re
from collections.abc import Iterable
from pathlib import PurePath, PurePosixPath
@@ -15,13 +24,21 @@
class PosixSession(OSSession):
- """
- An intermediary class implementing the Posix compliant parts of
- Linux and other OS remote sessions.
- """
+ """An intermediary class implementing the POSIX standard."""
@staticmethod
def combine_short_options(**opts: bool) -> str:
+ """Combine shell options into one argument.
+
+ These are options such as ``-x``, ``-v``, ``-f`` which are combined into ``-xvf``.
+
+ Args:
+ opts: The keys are option names (usually one letter) and the bool values indicate
+ whether to include the option in the resulting argument.
+
+ Returns:
+ The options combined into one argument.
+ """
ret_opts = ""
for opt, include in opts.items():
if include:
@@ -33,17 +50,19 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.guess_dpdk_remote_dir`."""
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
def get_remote_tmp_dir(self) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_tmp_dir`."""
return PurePosixPath("/tmp")
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for i686 arch build. Get information
- from the node if needed.
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_build_env_vars`.
+
+ Supported architecture: ``i686``.
"""
env_vars = {}
if arch == Architecture.i686:
@@ -63,6 +82,7 @@ def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
return env_vars
def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
return PurePosixPath(*args)
def copy_from(
@@ -70,6 +90,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_from`."""
self.remote_session.copy_from(source_file, destination_file)
def copy_to(
@@ -77,6 +98,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_to`."""
self.remote_session.copy_to(source_file, destination_file)
def remove_remote_dir(
@@ -85,6 +107,7 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
opts = PosixSession.combine_short_options(r=recursive, f=force)
self.send_command(f"rm{opts} {remote_dir_path}")
@@ -93,6 +116,7 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
self.send_command(
f"tar xfm {remote_tarball_path} "
f"-C {PurePosixPath(remote_tarball_path).parent}",
@@ -110,6 +134,7 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.build_dpdk`."""
try:
if rebuild:
# reconfigure, then build
@@ -140,12 +165,14 @@ def build_dpdk(
raise DPDKBuildError(f"DPDK build failed when doing '{e.command}'.")
def get_dpdk_version(self, build_dir: str | PurePath) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_version`."""
out = self.send_command(
f"cat {self.join_remote_path(build_dir, 'VERSION')}", verify=True
)
return out.stdout
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.kill_cleanup_dpdk_apps`."""
self._logger.info("Cleaning up DPDK apps.")
dpdk_runtime_dirs = self._get_dpdk_runtime_dirs(dpdk_prefix_list)
if dpdk_runtime_dirs:
@@ -159,6 +186,14 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
def _get_dpdk_runtime_dirs(
self, dpdk_prefix_list: Iterable[str]
) -> list[PurePosixPath]:
+ """Find runtime directories DPDK apps are currently using.
+
+ Args:
+ dpdk_prefix_list: The prefixes DPDK apps were started with.
+
+ Returns:
+ The paths of DPDK apps' runtime dirs.
+ """
prefix = PurePosixPath("/var", "run", "dpdk")
if not dpdk_prefix_list:
remote_prefixes = self._list_remote_dirs(prefix)
@@ -170,9 +205,13 @@ def _get_dpdk_runtime_dirs(
return [PurePosixPath(prefix, dpdk_prefix) for dpdk_prefix in dpdk_prefix_list]
def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
- """
- Return a list of directories of the remote_dir.
- If remote_path doesn't exist, return None.
+ """Contents of remote_path.
+
+ Args:
+ remote_path: List the contents of this path.
+
+ Returns:
+ The contents of remote_path. If remote_path doesn't exist, return None.
"""
out = self.send_command(
f"ls -l {remote_path} | awk '/^d/ {{print $NF}}'"
@@ -183,6 +222,17 @@ def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
return out.splitlines()
def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[int]:
+ """Find PIDs of running DPDK apps.
+
+ Look at each "config" file found in dpdk_runtime_dirs and find the PIDs of processes
+ that opened those files.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+
+ Returns:
+ The PIDs of running DPDK apps.
+ """
pids = []
pid_regex = r"p(\d+)"
for dpdk_runtime_dir in dpdk_runtime_dirs:
@@ -203,6 +253,14 @@ def _remote_files_exists(self, remote_path: PurePath) -> bool:
def _check_dpdk_hugepages(
self, dpdk_runtime_dirs: Iterable[str | PurePath]
) -> None:
+ """Check there aren't any leftover hugepages.
+
+ If any hugepages are found, emit a warning. The check inspects the
+ "hugepage_info" file in each of dpdk_runtime_dirs.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+ """
for dpdk_runtime_dir in dpdk_runtime_dirs:
hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
if self._remote_files_exists(hugepage_info):
@@ -220,9 +278,11 @@ def _remove_dpdk_runtime_dirs(
self.remove_remote_dir(dpdk_runtime_dir)
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return ""
def get_compiler_version(self, compiler_name: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_compiler_version`."""
match compiler_name:
case "gcc":
return self.send_command(
@@ -240,6 +300,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
raise ValueError(f"Unknown compiler {compiler_name}")
def get_node_info(self) -> NodeInfo:
+ """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
os_release_info = self.send_command(
"awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
SETTINGS.timeout,
--
2.34.1
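The `combine_short_options` helper documented in the posix_session patch above can be exercised standalone (a sketch; the body is reconstructed from the quoted diff context, so any detail elided by the diff is an assumption):

```python
def combine_short_options(**opts: bool) -> str:
    """Combine shell options such as r=True, f=True into " -rf"."""
    ret_opts = ""
    for opt, include in opts.items():
        if include:
            ret_opts = f"{ret_opts}{opt}"
    if ret_opts:
        ret_opts = f" -{ret_opts}"  # leading space so it appends cleanly to "rm"
    return ret_opts


# remove_remote_dir builds `rm{opts} {path}`, relying on the leading space:
opts = combine_short_options(r=True, f=True)
print(f"rm{opts} /tmp/dpdk-build")  # rm -rf /tmp/dpdk-build
```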
* Re: [PATCH v7 16/21] dts: posix and linux sessions docstring update
2023-11-15 13:09 ` [PATCH v7 16/21] dts: posix and linux sessions " Juraj Linkeš
@ 2023-11-22 13:24 ` Yoan Picchi
2023-11-22 13:35 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-22 13:24 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/testbed_model/linux_session.py | 63 ++++++++++-----
> dts/framework/testbed_model/posix_session.py | 81 +++++++++++++++++---
> 2 files changed, 113 insertions(+), 31 deletions(-)
>
> diff --git a/dts/framework/testbed_model/linux_session.py b/dts/framework/testbed_model/linux_session.py
> index f472bb8f0f..279954ff63 100644
> --- a/dts/framework/testbed_model/linux_session.py
> +++ b/dts/framework/testbed_model/linux_session.py
> @@ -2,6 +2,13 @@
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2023 University of New Hampshire
>
> +"""Linux OS translator.
> +
> +Translate OS-unaware calls into Linux calls/utilities. Most Linux distributions are largely
> +compliant with POSIX standards, so this module only implements the Linux-specific parts
> +that aren't covered by the POSIX layer.
> +"""
> +
> import json
> from ipaddress import IPv4Interface, IPv6Interface
> from typing import TypedDict, Union
> @@ -17,43 +24,51 @@
>
>
> class LshwConfigurationOutput(TypedDict):
> + """The relevant parts of ``lshw``'s ``configuration`` section."""
> +
> + #:
> link: str
>
>
> class LshwOutput(TypedDict):
> - """
> - A model of the relevant information from json lshw output, e.g.:
> - {
> - ...
> - "businfo" : "pci@0000:08:00.0",
> - "logicalname" : "enp8s0",
> - "version" : "00",
> - "serial" : "52:54:00:59:e1:ac",
> - ...
> - "configuration" : {
> - ...
> - "link" : "yes",
> - ...
> - },
> - ...
> + """A model of the relevant information from ``lshw``'s json output.
> +
> + e.g.::
> +
> + {
> + ...
> + "businfo" : "pci@0000:08:00.0",
> + "logicalname" : "enp8s0",
> + "version" : "00",
> + "serial" : "52:54:00:59:e1:ac",
> + ...
> + "configuration" : {
> + ...
> + "link" : "yes",
> + ...
> + },
> + ...
> """
>
> + #:
> businfo: str
> + #:
> logicalname: NotRequired[str]
> + #:
> serial: NotRequired[str]
> + #:
> configuration: LshwConfigurationOutput
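As an aside for readers of the archive: the `TypedDict` classes above only annotate the shape of `lshw`'s JSON; they don't validate anything at runtime. A standalone sketch (using `total=False` as a pre-3.11 equivalent of `NotRequired`, and a trimmed-down stand-in for real `lshw -json` output):

```python
import json
from typing import TypedDict


class LshwConfigurationOutput(TypedDict):
    link: str


class _LshwRequired(TypedDict):
    businfo: str
    configuration: LshwConfigurationOutput


class LshwOutput(_LshwRequired, total=False):
    # Keys in a total=False subclass are optional, mirroring NotRequired[...].
    logicalname: str
    serial: str


# A trimmed-down stand-in for `lshw -quiet -json -C network` output.
raw = (
    '[{"businfo": "pci@0000:08:00.0", "logicalname": "enp8s0",'
    ' "configuration": {"link": "yes"}}]'
)
devices: list[LshwOutput] = json.loads(raw)
print(devices[0]["configuration"]["link"])  # -> yes
```

Since `json.loads` returns plain dicts, the annotation is purely for type checkers and generated docs; that is why each field only needs the bare `#:` comment for Sphinx to pick it up.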
>
>
> class LinuxSession(PosixSession):
> - """
> - The implementation of non-Posix compliant parts of Linux remote sessions.
> - """
> + """The implementation of non-Posix compliant parts of Linux."""
>
> @staticmethod
> def _get_privileged_command(command: str) -> str:
> return f"sudo -- sh -c '{command}'"
>
> def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> + """Overrides :meth:`~.os_session.OSSession.get_remote_cpus`."""
> cpu_info = self.send_command("lscpu -p=CPU,CORE,SOCKET,NODE|grep -v \\#").stdout
> lcores = []
> for cpu_line in cpu_info.splitlines():
> @@ -65,18 +80,20 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> return lcores
>
> def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> + """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
> return dpdk_prefix
>
> - def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> + def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
> + """Overrides :meth:`~.os_session.OSSession.setup_hugepages`."""
> self._logger.info("Getting Hugepage information.")
> hugepage_size = self._get_hugepage_size()
> hugepages_total = self._get_hugepages_total()
> self._numa_nodes = self._get_numa_nodes()
>
> - if force_first_numa or hugepages_total != hugepage_amount:
> + if force_first_numa or hugepages_total != hugepage_count:
> # when forcing numa, we need to clear existing hugepages regardless
> # of size, so they can be moved to the first numa node
> - self._configure_huge_pages(hugepage_amount, hugepage_size, force_first_numa)
> + self._configure_huge_pages(hugepage_count, hugepage_size, force_first_numa)
> else:
> self._logger.info("Hugepages already configured.")
> self._mount_huge_pages()
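The reconfiguration condition in the hunk above can be read as a small predicate; a sketch, assuming the semantics are exactly those in the diff (forcing the first NUMA node always reconfigures because existing hugepages must be cleared before they can be moved):

```python
def needs_hugepage_reconfig(current_total: int, requested: int,
                            force_first_numa: bool) -> bool:
    # Reconfigure when the count differs, or unconditionally when hugepages
    # must be forced onto the first NUMA node.
    return force_first_numa or current_total != requested


assert needs_hugepage_reconfig(1024, 1024, force_first_numa=False) is False
assert needs_hugepage_reconfig(512, 1024, force_first_numa=False) is True
assert needs_hugepage_reconfig(1024, 1024, force_first_numa=True) is True
```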
> @@ -140,6 +157,7 @@ def _configure_huge_pages(
> )
>
> def update_ports(self, ports: list[Port]) -> None:
> + """Overrides :meth:`~.os_session.OSSession.update_ports`."""
> self._logger.debug("Gathering port info.")
> for port in ports:
> assert (
> @@ -178,6 +196,7 @@ def _update_port_attr(
> )
>
> def configure_port_state(self, port: Port, enable: bool) -> None:
> + """Overrides :meth:`~.os_session.OSSession.configure_port_state`."""
> state = "up" if enable else "down"
> self.send_command(
> f"ip link set dev {port.logical_name} {state}", privileged=True
> @@ -189,6 +208,7 @@ def configure_port_ip_address(
> port: Port,
> delete: bool,
> ) -> None:
> + """Overrides :meth:`~.os_session.OSSession.configure_port_ip_address`."""
> command = "del" if delete else "add"
> self.send_command(
> f"ip address {command} {address} dev {port.logical_name}",
> @@ -197,5 +217,6 @@ def configure_port_ip_address(
> )
>
> def configure_ipv4_forwarding(self, enable: bool) -> None:
> + """Overrides :meth:`~.os_session.OSSession.configure_ipv4_forwarding`."""
> state = 1 if enable else 0
> self.send_command(f"sysctl -w net.ipv4.ip_forward={state}", privileged=True)
> diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
> index 1d1d5b1b26..a4824aa274 100644
> --- a/dts/framework/testbed_model/posix_session.py
> +++ b/dts/framework/testbed_model/posix_session.py
> @@ -2,6 +2,15 @@
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2023 University of New Hampshire
>
> +"""POSIX compliant OS translator.
> +
> +Translates OS-unaware calls into POSIX compliant calls/utilities. POSIX is a set of standards
> +for portability between Unix operating systems which not all Linux distributions
> +(or the tools most frequently bundled with said distributions) adhere to. Most of Linux
> +distributions are mostly compliant though.
> +This intermediate module implements the common parts of mostly POSIX compliant distributions.
> +"""
> +
> import re
> from collections.abc import Iterable
> from pathlib import PurePath, PurePosixPath
> @@ -15,13 +24,21 @@
>
>
> class PosixSession(OSSession):
> - """
> - An intermediary class implementing the Posix compliant parts of
> - Linux and other OS remote sessions.
> - """
> + """An intermediary class implementing the POSIX standard."""
>
> @staticmethod
> def combine_short_options(**opts: bool) -> str:
> + """Combine shell options into one argument.
> +
> + These are options such as ``-x``, ``-v``, ``-f`` which are combined into ``-xvf``.
> +
> + Args:
> + opts: The keys are option names (usually one letter) and the bool values indicate
> + whether to include the option in the resulting argument.
> +
> + Returns:
> + The options combined into one argument.
> + """
> ret_opts = ""
> for opt, include in opts.items():
> if include:
> @@ -33,17 +50,19 @@ def combine_short_options(**opts: bool) -> str:
> return ret_opts
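For context, the hunk boundary elides the middle of `combine_short_options`; a standalone re-implementation consistent with the visible loop and with its call site (`f"rm{opts} …"` implies the result carries a leading space):

```python
def combine_short_options(**opts: bool) -> str:
    """Combine short options such as ``r`` and ``f`` into `` -rf``."""
    ret_opts = ""
    for opt, include in opts.items():
        if include:
            ret_opts = f"{ret_opts}{opt}"
    if ret_opts:
        # Leading space so the caller can append directly: f"rm{opts} path".
        ret_opts = f" -{ret_opts}"
    return ret_opts


print(combine_short_options(r=True, f=True))   # -> " -rf"
print(combine_short_options(r=False, f=False))  # -> ""
```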
>
> def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
> + """Overrides :meth:`~.os_session.OSSession.guess_dpdk_remote_dir`."""
> remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
> result = self.send_command(f"ls -d {remote_guess} | tail -1")
> return PurePosixPath(result.stdout)
>
> def get_remote_tmp_dir(self) -> PurePosixPath:
> + """Overrides :meth:`~.os_session.OSSession.get_remote_tmp_dir`."""
> return PurePosixPath("/tmp")
>
> def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
> - """
> - Create extra environment variables needed for i686 arch build. Get information
> - from the node if needed.
> + """Overrides :meth:`~.os_session.OSSession.get_dpdk_build_env_vars`.
> +
> + Supported architecture: ``i686``.
> """
> env_vars = {}
> if arch == Architecture.i686:
> @@ -63,6 +82,7 @@ def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
> return env_vars
>
> def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
> + """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
> return PurePosixPath(*args)
>
> def copy_from(
> @@ -70,6 +90,7 @@ def copy_from(
> source_file: str | PurePath,
> destination_file: str | PurePath,
> ) -> None:
> + """Overrides :meth:`~.os_session.OSSession.copy_from`."""
> self.remote_session.copy_from(source_file, destination_file)
>
> def copy_to(
> @@ -77,6 +98,7 @@ def copy_to(
> source_file: str | PurePath,
> destination_file: str | PurePath,
> ) -> None:
> + """Overrides :meth:`~.os_session.OSSession.copy_to`."""
> self.remote_session.copy_to(source_file, destination_file)
>
> def remove_remote_dir(
> @@ -85,6 +107,7 @@ def remove_remote_dir(
> recursive: bool = True,
> force: bool = True,
> ) -> None:
> + """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
> opts = PosixSession.combine_short_options(r=recursive, f=force)
> self.send_command(f"rm{opts} {remote_dir_path}")
>
> @@ -93,6 +116,7 @@ def extract_remote_tarball(
> remote_tarball_path: str | PurePath,
> expected_dir: str | PurePath | None = None,
> ) -> None:
> + """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
> self.send_command(
> f"tar xfm {remote_tarball_path} "
> f"-C {PurePosixPath(remote_tarball_path).parent}",
> @@ -110,6 +134,7 @@ def build_dpdk(
> rebuild: bool = False,
> timeout: float = SETTINGS.compile_timeout,
> ) -> None:
> + """Overrides :meth:`~.os_session.OSSession.build_dpdk`."""
> try:
> if rebuild:
> # reconfigure, then build
> @@ -140,12 +165,14 @@ def build_dpdk(
> raise DPDKBuildError(f"DPDK build failed when doing '{e.command}'.")
>
> def get_dpdk_version(self, build_dir: str | PurePath) -> str:
> + """Overrides :meth:`~.os_session.OSSession.get_dpdk_version`."""
> out = self.send_command(
> f"cat {self.join_remote_path(build_dir, 'VERSION')}", verify=True
> )
> return out.stdout
>
> def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> + """Overrides :meth:`~.os_session.OSSession.kill_cleanup_dpdk_apps`."""
> self._logger.info("Cleaning up DPDK apps.")
> dpdk_runtime_dirs = self._get_dpdk_runtime_dirs(dpdk_prefix_list)
> if dpdk_runtime_dirs:
> @@ -159,6 +186,14 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> def _get_dpdk_runtime_dirs(
> self, dpdk_prefix_list: Iterable[str]
> ) -> list[PurePosixPath]:
> + """Find runtime directories DPDK apps are currently using.
> +
> + Args:
> + dpdk_prefix_list: The prefixes DPDK apps were started with.
> +
> + Returns:
> + The paths of DPDK apps' runtime dirs.
> + """
> prefix = PurePosixPath("/var", "run", "dpdk")
> if not dpdk_prefix_list:
> remote_prefixes = self._list_remote_dirs(prefix)
> @@ -170,9 +205,13 @@ def _get_dpdk_runtime_dirs(
> return [PurePosixPath(prefix, dpdk_prefix) for dpdk_prefix in dpdk_prefix_list]
>
> def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
> - """
> - Return a list of directories of the remote_dir.
> - If remote_path doesn't exist, return None.
> + """Contents of remote_path.
> +
> + Args:
> + remote_path: List the contents of this path.
> +
> + Returns:
> + The contents of remote_path. If remote_path doesn't exist, return None.
> """
> out = self.send_command(
> f"ls -l {remote_path} | awk '/^d/ {{print $NF}}'"
> @@ -183,6 +222,17 @@ def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
> return out.splitlines()
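The shell pipeline above (`ls -l … | awk '/^d/ {print $NF}'`) keeps only directory entries and prints their names; the same filtering expressed in plain Python, for readers unfamiliar with awk (sample `ls -l` output is invented):

```python
def extract_dir_names(ls_l_output: str) -> list[str]:
    # Mirrors `awk '/^d/ {print $NF}'`: keep lines whose mode string starts
    # with 'd' (directories) and take the last whitespace-separated field.
    return [line.split()[-1]
            for line in ls_l_output.splitlines()
            if line.startswith("d")]


sample = (
    "total 8\n"
    "drwxr-xr-x 2 root root 4096 Nov 15 13:09 rte\n"
    "-rw-r--r-- 1 root root   12 Nov 15 13:09 config\n"
)
print(extract_dir_names(sample))  # -> ['rte']
```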
>
> def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[int]:
> + """Find PIDs of running DPDK apps.
> +
> + Look at each "config" file found in dpdk_runtime_dirs and find the PIDs of processes
> + that opened those files.
> +
> + Args:
> + dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
> +
> + Returns:
> + The PIDs of running DPDK apps.
> + """
> pids = []
> pid_regex = r"p(\d+)"
> for dpdk_runtime_dir in dpdk_runtime_dirs:
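The `p(\d+)` pattern above matches the field-output format used by `lsof -F` style listings, where each process holding a file open is reported as a `p<PID>` line (the exact `lsof` invocation isn't visible in this hunk, so that part is an assumption). A sketch of the parsing step alone:

```python
import re


def parse_pids(lsof_field_output: str) -> list[int]:
    # Each `p<PID>` line identifies one process with the file open.
    return [int(m.group(1))
            for m in re.finditer(r"p(\d+)", lsof_field_output)]


print(parse_pids("p1337\np4242\n"))  # -> [1337, 4242]
```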
> @@ -203,6 +253,14 @@ def _remote_files_exists(self, remote_path: PurePath) -> bool:
> def _check_dpdk_hugepages(
> self, dpdk_runtime_dirs: Iterable[str | PurePath]
> ) -> None:
> + """Check there aren't any leftover hugepages.
> +
> + If any hugegapes are found, emit a warning. The hugepages are investigated in the
hugegapes -> hugepages
> + "hugepage_info" file of dpdk_runtime_dirs.
> +
> + Args:
> + dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
> + """
> for dpdk_runtime_dir in dpdk_runtime_dirs:
> hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
> if self._remote_files_exists(hugepage_info):
> @@ -220,9 +278,11 @@ def _remove_dpdk_runtime_dirs(
> self.remove_remote_dir(dpdk_runtime_dir)
>
> def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> + """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
> return ""
>
> def get_compiler_version(self, compiler_name: str) -> str:
> + """Overrides :meth:`~.os_session.OSSession.get_compiler_version`."""
> match compiler_name:
> case "gcc":
> return self.send_command(
> @@ -240,6 +300,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
> raise ValueError(f"Unknown compiler {compiler_name}")
>
> def get_node_info(self) -> NodeInfo:
> + """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
> os_release_info = self.send_command(
> "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
> SETTINGS.timeout,
* Re: [PATCH v7 16/21] dts: posix and linux sessions docstring update
2023-11-22 13:24 ` Yoan Picchi
@ 2023-11-22 13:35 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 13:35 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Wed, Nov 22, 2023 at 2:24 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/testbed_model/linux_session.py | 63 ++++++++++-----
> > dts/framework/testbed_model/posix_session.py | 81 +++++++++++++++++---
> > 2 files changed, 113 insertions(+), 31 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/linux_session.py b/dts/framework/testbed_model/linux_session.py
> > index f472bb8f0f..279954ff63 100644
> > --- a/dts/framework/testbed_model/linux_session.py
> > +++ b/dts/framework/testbed_model/linux_session.py
> > @@ -2,6 +2,13 @@
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2023 University of New Hampshire
> >
> > +"""Linux OS translator.
> > +
> > +Translate OS-unaware calls into Linux calls/utilities. Most Linux distributions are largely
> > +compliant with POSIX standards, so this module only implements the Linux-specific parts
> > +that aren't covered by the POSIX layer.
> > +"""
> > +
> > import json
> > from ipaddress import IPv4Interface, IPv6Interface
> > from typing import TypedDict, Union
> > @@ -17,43 +24,51 @@
> >
> >
> > class LshwConfigurationOutput(TypedDict):
> > + """The relevant parts of ``lshw``'s ``configuration`` section."""
> > +
> > + #:
> > link: str
> >
> >
> > class LshwOutput(TypedDict):
> > - """
> > - A model of the relevant information from json lshw output, e.g.:
> > - {
> > - ...
> > - "businfo" : "pci@0000:08:00.0",
> > - "logicalname" : "enp8s0",
> > - "version" : "00",
> > - "serial" : "52:54:00:59:e1:ac",
> > - ...
> > - "configuration" : {
> > - ...
> > - "link" : "yes",
> > - ...
> > - },
> > - ...
> > + """A model of the relevant information from ``lshw``'s json output.
> > +
> > + e.g.::
> > +
> > + {
> > + ...
> > + "businfo" : "pci@0000:08:00.0",
> > + "logicalname" : "enp8s0",
> > + "version" : "00",
> > + "serial" : "52:54:00:59:e1:ac",
> > + ...
> > + "configuration" : {
> > + ...
> > + "link" : "yes",
> > + ...
> > + },
> > + ...
> > """
> >
> > + #:
> > businfo: str
> > + #:
> > logicalname: NotRequired[str]
> > + #:
> > serial: NotRequired[str]
> > + #:
> > configuration: LshwConfigurationOutput
> >
> >
> > class LinuxSession(PosixSession):
> > - """
> > - The implementation of non-Posix compliant parts of Linux remote sessions.
> > - """
> > + """The implementation of non-Posix compliant parts of Linux."""
> >
> > @staticmethod
> > def _get_privileged_command(command: str) -> str:
> > return f"sudo -- sh -c '{command}'"
> >
> > def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> > + """Overrides :meth:`~.os_session.OSSession.get_remote_cpus`."""
> > cpu_info = self.send_command("lscpu -p=CPU,CORE,SOCKET,NODE|grep -v \\#").stdout
> > lcores = []
> > for cpu_line in cpu_info.splitlines():
> > @@ -65,18 +80,20 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
> > return lcores
> >
> > def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> > + """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
> > return dpdk_prefix
> >
> > - def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
> > + def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.setup_hugepages`."""
> > self._logger.info("Getting Hugepage information.")
> > hugepage_size = self._get_hugepage_size()
> > hugepages_total = self._get_hugepages_total()
> > self._numa_nodes = self._get_numa_nodes()
> >
> > - if force_first_numa or hugepages_total != hugepage_amount:
> > + if force_first_numa or hugepages_total != hugepage_count:
> > # when forcing numa, we need to clear existing hugepages regardless
> > # of size, so they can be moved to the first numa node
> > - self._configure_huge_pages(hugepage_amount, hugepage_size, force_first_numa)
> > + self._configure_huge_pages(hugepage_count, hugepage_size, force_first_numa)
> > else:
> > self._logger.info("Hugepages already configured.")
> > self._mount_huge_pages()
> > @@ -140,6 +157,7 @@ def _configure_huge_pages(
> > )
> >
> > def update_ports(self, ports: list[Port]) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.update_ports`."""
> > self._logger.debug("Gathering port info.")
> > for port in ports:
> > assert (
> > @@ -178,6 +196,7 @@ def _update_port_attr(
> > )
> >
> > def configure_port_state(self, port: Port, enable: bool) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.configure_port_state`."""
> > state = "up" if enable else "down"
> > self.send_command(
> > f"ip link set dev {port.logical_name} {state}", privileged=True
> > @@ -189,6 +208,7 @@ def configure_port_ip_address(
> > port: Port,
> > delete: bool,
> > ) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.configure_port_ip_address`."""
> > command = "del" if delete else "add"
> > self.send_command(
> > f"ip address {command} {address} dev {port.logical_name}",
> > @@ -197,5 +217,6 @@ def configure_port_ip_address(
> > )
> >
> > def configure_ipv4_forwarding(self, enable: bool) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.configure_ipv4_forwarding`."""
> > state = 1 if enable else 0
> > self.send_command(f"sysctl -w net.ipv4.ip_forward={state}", privileged=True)
> > diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
> > index 1d1d5b1b26..a4824aa274 100644
> > --- a/dts/framework/testbed_model/posix_session.py
> > +++ b/dts/framework/testbed_model/posix_session.py
> > @@ -2,6 +2,15 @@
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2023 University of New Hampshire
> >
> > +"""POSIX compliant OS translator.
> > +
> > +Translates OS-unaware calls into POSIX compliant calls/utilities. POSIX is a set of standards
> > +for portability between Unix operating systems which not all Linux distributions
> > +(or the tools most frequently bundled with said distributions) adhere to. Most Linux
> > +distributions are, however, largely compliant.
> > +This intermediate module implements the common parts of mostly POSIX compliant distributions.
> > +"""
> > +
> > import re
> > from collections.abc import Iterable
> > from pathlib import PurePath, PurePosixPath
> > @@ -15,13 +24,21 @@
> >
> >
> > class PosixSession(OSSession):
> > - """
> > - An intermediary class implementing the Posix compliant parts of
> > - Linux and other OS remote sessions.
> > - """
> > + """An intermediary class implementing the POSIX standard."""
> >
> > @staticmethod
> > def combine_short_options(**opts: bool) -> str:
> > + """Combine shell options into one argument.
> > +
> > + These are options such as ``-x``, ``-v``, ``-f`` which are combined into ``-xvf``.
> > +
> > + Args:
> > + opts: The keys are option names (usually one letter) and the bool values indicate
> > + whether to include the option in the resulting argument.
> > +
> > + Returns:
> > + The options combined into one argument.
> > + """
> > ret_opts = ""
> > for opt, include in opts.items():
> > if include:
> > @@ -33,17 +50,19 @@ def combine_short_options(**opts: bool) -> str:
> > return ret_opts
> >
> > def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
> > + """Overrides :meth:`~.os_session.OSSession.guess_dpdk_remote_dir`."""
> > remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
> > result = self.send_command(f"ls -d {remote_guess} | tail -1")
> > return PurePosixPath(result.stdout)
> >
> > def get_remote_tmp_dir(self) -> PurePosixPath:
> > + """Overrides :meth:`~.os_session.OSSession.get_remote_tmp_dir`."""
> > return PurePosixPath("/tmp")
> >
> > def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
> > - """
> > - Create extra environment variables needed for i686 arch build. Get information
> > - from the node if needed.
> > + """Overrides :meth:`~.os_session.OSSession.get_dpdk_build_env_vars`.
> > +
> > + Supported architecture: ``i686``.
> > """
> > env_vars = {}
> > if arch == Architecture.i686:
> > @@ -63,6 +82,7 @@ def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
> > return env_vars
> >
> > def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
> > + """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
> > return PurePosixPath(*args)
> >
> > def copy_from(
> > @@ -70,6 +90,7 @@ def copy_from(
> > source_file: str | PurePath,
> > destination_file: str | PurePath,
> > ) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.copy_from`."""
> > self.remote_session.copy_from(source_file, destination_file)
> >
> > def copy_to(
> > @@ -77,6 +98,7 @@ def copy_to(
> > source_file: str | PurePath,
> > destination_file: str | PurePath,
> > ) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.copy_to`."""
> > self.remote_session.copy_to(source_file, destination_file)
> >
> > def remove_remote_dir(
> > @@ -85,6 +107,7 @@ def remove_remote_dir(
> > recursive: bool = True,
> > force: bool = True,
> > ) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
> > opts = PosixSession.combine_short_options(r=recursive, f=force)
> > self.send_command(f"rm{opts} {remote_dir_path}")
> >
> > @@ -93,6 +116,7 @@ def extract_remote_tarball(
> > remote_tarball_path: str | PurePath,
> > expected_dir: str | PurePath | None = None,
> > ) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
> > self.send_command(
> > f"tar xfm {remote_tarball_path} "
> > f"-C {PurePosixPath(remote_tarball_path).parent}",
> > @@ -110,6 +134,7 @@ def build_dpdk(
> > rebuild: bool = False,
> > timeout: float = SETTINGS.compile_timeout,
> > ) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.build_dpdk`."""
> > try:
> > if rebuild:
> > # reconfigure, then build
> > @@ -140,12 +165,14 @@ def build_dpdk(
> > raise DPDKBuildError(f"DPDK build failed when doing '{e.command}'.")
> >
> > def get_dpdk_version(self, build_dir: str | PurePath) -> str:
> > + """Overrides :meth:`~.os_session.OSSession.get_dpdk_version`."""
> > out = self.send_command(
> > f"cat {self.join_remote_path(build_dir, 'VERSION')}", verify=True
> > )
> > return out.stdout
> >
> > def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> > + """Overrides :meth:`~.os_session.OSSession.kill_cleanup_dpdk_apps`."""
> > self._logger.info("Cleaning up DPDK apps.")
> > dpdk_runtime_dirs = self._get_dpdk_runtime_dirs(dpdk_prefix_list)
> > if dpdk_runtime_dirs:
> > @@ -159,6 +186,14 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
> > def _get_dpdk_runtime_dirs(
> > self, dpdk_prefix_list: Iterable[str]
> > ) -> list[PurePosixPath]:
> > + """Find runtime directories DPDK apps are currently using.
> > +
> > + Args:
> > + dpdk_prefix_list: The prefixes DPDK apps were started with.
> > +
> > + Returns:
> > + The paths of DPDK apps' runtime dirs.
> > + """
> > prefix = PurePosixPath("/var", "run", "dpdk")
> > if not dpdk_prefix_list:
> > remote_prefixes = self._list_remote_dirs(prefix)
> > @@ -170,9 +205,13 @@ def _get_dpdk_runtime_dirs(
> > return [PurePosixPath(prefix, dpdk_prefix) for dpdk_prefix in dpdk_prefix_list]
> >
> > def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
> > - """
> > - Return a list of directories of the remote_dir.
> > - If remote_path doesn't exist, return None.
> > + """Contents of remote_path.
> > +
> > + Args:
> > + remote_path: List the contents of this path.
> > +
> > + Returns:
> > + The contents of remote_path. If remote_path doesn't exist, return None.
> > """
> > out = self.send_command(
> > f"ls -l {remote_path} | awk '/^d/ {{print $NF}}'"
> > @@ -183,6 +222,17 @@ def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
> > return out.splitlines()
> >
> > def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[int]:
> > + """Find PIDs of running DPDK apps.
> > +
> > + Look at each "config" file found in dpdk_runtime_dirs and find the PIDs of processes
> > + that opened those files.
> > +
> > + Args:
> > + dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
> > +
> > + Returns:
> > + The PIDs of running DPDK apps.
> > + """
> > pids = []
> > pid_regex = r"p(\d+)"
> > for dpdk_runtime_dir in dpdk_runtime_dirs:
> > @@ -203,6 +253,14 @@ def _remote_files_exists(self, remote_path: PurePath) -> bool:
> > def _check_dpdk_hugepages(
> > self, dpdk_runtime_dirs: Iterable[str | PurePath]
> > ) -> None:
> > + """Check there aren't any leftover hugepages.
> > +
> > + If any hugegapes are found, emit a warning. The hugepages are investigated in the
>
> hugegapes -> hugepages
>
Ack.
> > + "hugepage_info" file of dpdk_runtime_dirs.
> > +
> > + Args:
> > + dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
> > + """
> > for dpdk_runtime_dir in dpdk_runtime_dirs:
> > hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
> > if self._remote_files_exists(hugepage_info):
> > @@ -220,9 +278,11 @@ def _remove_dpdk_runtime_dirs(
> > self.remove_remote_dir(dpdk_runtime_dir)
> >
> > def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
> > + """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
> > return ""
> >
> > def get_compiler_version(self, compiler_name: str) -> str:
> > + """Overrides :meth:`~.os_session.OSSession.get_compiler_version`."""
> > match compiler_name:
> > case "gcc":
> > return self.send_command(
> > @@ -240,6 +300,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
> > raise ValueError(f"Unknown compiler {compiler_name}")
> >
> > def get_node_info(self) -> NodeInfo:
> > + """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
> > os_release_info = self.send_command(
> > "awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
> > SETTINGS.timeout,
>
* [PATCH v7 17/21] dts: node docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (15 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 16/21] dts: posix and linux sessions " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-22 12:18 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 18/21] dts: sut and tg nodes " Juraj Linkeš
` (4 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/node.py | 191 +++++++++++++++++++---------
1 file changed, 131 insertions(+), 60 deletions(-)
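For readers unfamiliar with it, the "Google format, with slight deviations" that this whole series applies looks roughly like the sketch below (an illustrative function, not code from the patch): a one-line imperative summary, then ``Args:`` and ``Returns:`` sections.

```python
def filter_lcores(lcores: list[int], allowed: list[int]) -> list[int]:
    """Filter logical cores according to the test run configuration.

    Args:
        lcores: Logical core IDs present on the node.
        allowed: Core IDs the configuration permits DTS to use.

    Returns:
        The cores from ``lcores`` that are also in ``allowed``.
    """
    return [lcore for lcore in lcores if lcore in allowed]


print(filter_lcores([0, 1, 2, 3], [1, 3]))  # -> [1, 3]
```

Sphinx's napoleon extension turns the ``Args:``/``Returns:`` sections into the parameter tables seen in the generated API docs.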
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index fa5b143cdd..f93b4acecd 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+A node is any host/server DTS connects to.
+
+The base class, :class:`Node`, provides functionality common to all nodes and is supposed
+to be extended by subclasses with functionality specific to each node type.
+The decorator :func:`Node.skip_setup` can be used without subclassing.
"""
from abc import ABC
@@ -35,10 +40,22 @@
class Node(ABC):
- """
- Basic class for node management. This class implements methods that
- manage a node, such as information gathering (of CPU/PCI/NIC) and
- environment setup.
+ """The base class for node management.
+
+ It shouldn't be instantiated, but rather subclassed.
+ It implements common methods to manage any node:
+
+ * Connection to the node,
+ * Hugepages setup.
+
+ Attributes:
+ main_session: The primary OS-aware remote session used to communicate with the node.
+ config: The node configuration.
+ name: The name of the node.
+ lcores: The list of logical cores that DTS can use on the node.
+ It's derived from logical cores present on the node and the test run configuration.
+ ports: The ports of this node specified in the test run configuration.
+ virtual_devices: The virtual devices used on the node.
"""
main_session: OSSession
@@ -52,6 +69,17 @@ class Node(ABC):
virtual_devices: list[VirtualDevice]
def __init__(self, node_config: NodeConfiguration):
+ """Connect to the node and gather info during initialization.
+
+ Extra gathered information:
+
+ * The list of available logical CPUs. This is then filtered by
+ the ``lcores`` configuration in the YAML test run configuration file,
+ * Information about ports from the YAML test run configuration file.
+
+ Args:
+ node_config: The node's test run configuration.
+ """
self.config = node_config
self.name = node_config.name
self._logger = getLogger(self.name)
@@ -60,7 +88,7 @@ def __init__(self, node_config: NodeConfiguration):
self._logger.info(f"Connected to node: {self.name}")
self._get_remote_cpus()
- # filter the node lcores according to user config
+ # filter the node lcores according to the test run configuration
self.lcores = LogicalCoreListFilter(
self.lcores, LogicalCoreList(self.config.lcores)
).filter()
@@ -76,9 +104,14 @@ def _init_ports(self) -> None:
self.configure_port_state(port)
def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- Perform the execution setup that will be done for each execution
- this node is part of.
+ """Execution setup steps.
+
+ Configure hugepages and call :meth:`_set_up_execution` where
+ the rest of the configuration steps (if any) are implemented.
+
+ Args:
+ execution_config: The execution test run configuration according to which
+ the setup steps will be taken.
"""
self._setup_hugepages()
self._set_up_execution(execution_config)
@@ -87,58 +120,74 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
self.virtual_devices.append(VirtualDevice(vdev))
def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution setup steps.
"""
def tear_down_execution(self) -> None:
- """
- Perform the execution teardown that will be done after each execution
- this node is part of concludes.
+ """Execution teardown steps.
+
+ There are currently no execution teardown steps common to all DTS node types.
"""
self.virtual_devices = []
self._tear_down_execution()
def _tear_down_execution(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution teardown steps for subclasses.
+
+ Subclasses should override this if they need additional execution teardown steps.
"""
def set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Perform the build target setup that will be done for each build target
- tested on this node.
+ """Build target setup steps.
+
+ There are currently no build target setup steps common to all DTS node types.
+
+ Args:
+ build_target_config: The build target test run configuration according to which
+ the setup steps will be taken.
"""
self._set_up_build_target(build_target_config)
def _set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target setup steps for subclasses.
+
+ Subclasses should override this if they need additional build target setup steps.
"""
def tear_down_build_target(self) -> None:
- """
- Perform the build target teardown that will be done after each build target
- tested on this node.
+ """Build target teardown steps.
+
+ There are currently no build target teardown steps common to all DTS node types.
"""
self._tear_down_build_target()
def _tear_down_build_target(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target teardown steps for subclasses.
+
+ Subclasses should override this if they need additional build target teardown steps.
"""
def create_session(self, name: str) -> OSSession:
- """
- Create and return a new OSSession tailored to the remote OS.
+ """Create and return a new OS-aware remote session.
+
+ The returned session won't be used by the node creating it. The session must be used by
+ the caller. The session will be maintained for the entire lifecycle of the node object,
+ at the end of which the session will be cleaned up automatically.
+
+ Note:
+ Any number of these supplementary sessions may be created.
+
+ Args:
+ name: The name of the session.
+
+ Returns:
+ A new OS-aware remote session.
"""
session_name = f"{self.name} {name}"
connection = create_session(
@@ -156,19 +205,19 @@ def create_interactive_shell(
privileged: bool = False,
app_args: str = "",
) -> InteractiveShellType:
- """Create a handler for an interactive session.
+ """Factory for interactive session handlers.
- Instantiate shell_cls according to the remote OS specifics.
+ Instantiate `shell_cls` according to the remote OS specifics.
Args:
shell_cls: The class of the shell.
- timeout: Timeout for reading output from the SSH channel. If you are
- reading from the buffer and don't receive any data within the timeout
- it will throw an error.
+ timeout: Timeout for reading output from the SSH channel. An exception is
+ raised if no data is received from the buffer within the timeout.
privileged: Whether to run the shell with administrative privileges.
app_args: The arguments to be passed to the application.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not shell_cls.dpdk_app:
shell_cls.path = self.main_session.join_remote_path(shell_cls.path)
@@ -185,14 +234,22 @@ def filter_lcores(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
) -> list[LogicalCore]:
- """
- Filter the LogicalCores found on the Node according to
- a LogicalCoreCount or a LogicalCoreList.
+ """Filter the node's logical cores that DTS can use.
+
+ Logical cores that DTS can use are the ones that are present on the node, but filtered
+ according to the test run configuration. The `filter_specifier` will filter cores from
+ those logical cores.
+
+ Args:
+ filter_specifier: Two different filters can be used, one that specifies the number
+ of logical cores per core, cores per socket and the number of sockets,
+ and another one that specifies a logical core list.
+ ascending: If :data:`True`, use cores with the lowest numerical id first and continue
+ in ascending order. If :data:`False`, start with the highest id and continue
+ in descending order. This ordering affects which sockets to consider first as well.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+ Returns:
+ The filtered logical cores.
"""
self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
return lcore_filter(
@@ -202,17 +259,14 @@ def filter_lcores(
).filter()
def _get_remote_cpus(self) -> None:
- """
- Scan CPUs in the remote OS and store a list of LogicalCores.
- """
+ """Scan CPUs in the remote OS and store a list of LogicalCores."""
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
def _setup_hugepages(self) -> None:
- """
- Setup hugepages on the Node. Different architectures can supply different
- amounts of memory for hugepages and numa-based hugepage allocation may need
- to be considered.
+ """Set up hugepages on the node.
+
+ Configure the hugepages only if they're specified in the node's test run configuration.
"""
if self.config.hugepages:
self.main_session.setup_hugepages(
@@ -220,8 +274,11 @@ def _setup_hugepages(self) -> None:
)
def configure_port_state(self, port: Port, enable: bool = True) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port`.
+
+ Args:
+ port: The port to enable/disable.
+ enable: :data:`True` to enable, :data:`False` to disable.
"""
self.main_session.configure_port_state(port, enable)
@@ -231,15 +288,17 @@ def configure_port_ip_address(
port: Port,
delete: bool = False,
) -> None:
- """
- Configure the IP address of a port on this node.
+ """Add an IP address to `port` on this node.
+
+ Args:
+ address: The IP address with mask in CIDR format. Can be either IPv4 or IPv6.
+ port: The port to which to add the address.
+ delete: If :data:`True`, will delete the address from the port instead of adding it.
"""
self.main_session.configure_port_ip_address(address, port, delete)
def close(self) -> None:
- """
- Close all connections and free other resources.
- """
+ """Close all connections and free other resources."""
if self.main_session:
self.main_session.close()
for session in self._other_sessions:
@@ -248,6 +307,11 @@ def close(self) -> None:
@staticmethod
def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+ """Skip the decorated function.
+
+ The :option:`--skip-setup` command line argument and the :envvar:`DTS_SKIP_SETUP`
+ environment variable enable the decorator.
+ """
if SETTINGS.skip_setup:
return lambda *args: None
else:
@@ -257,6 +321,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
def create_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> OSSession:
+ """Factory for OS-aware sessions.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
match node_config.os:
case OS.linux:
return LinuxSession(node_config, name, logger)
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
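[Editor's illustration] The `skip_setup` decorator documented in the patch above can be sketched roughly as follows. This is a simplified stand-in, not code from the DTS framework: `_Settings` here is a hypothetical placeholder for the real `SETTINGS` object, which DTS populates from the `--skip-setup` command line argument and the `DTS_SKIP_SETUP` environment variable.

```python
from typing import Any, Callable


class _Settings:
    """Hypothetical stand-in for the DTS SETTINGS object."""

    skip_setup: bool = False


SETTINGS = _Settings()


def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
    """Return a no-op in place of `func` when setup is being skipped."""
    if SETTINGS.skip_setup:
        # Substituted at decoration time; the original function is never called.
        return lambda *args: None
    return func


@skip_setup
def expensive_setup(node: str) -> str:
    # Runs normally because SETTINGS.skip_setup is False at decoration time.
    return f"set up {node}"
```

Because the substitution happens when the decorator runs (i.e. at function definition time), `SETTINGS.skip_setup` must be set before the decorated functions are defined.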
* Re: [PATCH v7 17/21] dts: node docstring update
2023-11-15 13:09 ` [PATCH v7 17/21] dts: node " Juraj Linkeš
@ 2023-11-22 12:18 ` Yoan Picchi
2023-11-22 13:28 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-22 12:18 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/testbed_model/node.py | 191 +++++++++++++++++++---------
> 1 file changed, 131 insertions(+), 60 deletions(-)
>
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index fa5b143cdd..f93b4acecd 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -3,8 +3,13 @@
> # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2022-2023 University of New Hampshire
>
> -"""
> -A node is a generic host that DTS connects to and manages.
> +"""Common functionality for node management.
> +
> +A node is any host/server DTS connects to.
> +
> +The base class, :class:`Node`, provides functionality common to all nodes and is supposed
> +to be extended by subclasses with functionality specific to each node type.
functionality -> functionalities
> +The decorator :func:`Node.skip_setup` can be used without subclassing.
> """
^ permalink raw reply [flat|nested] 393+ messages in thread
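[Editor's illustration] For readers following the formatting discussion, a minimal docstring in the Google/PEP 257 style this series adopts (summary line, `Args:`/`Returns:` sections, Sphinx `:data:` cross-references) looks like the sketch below. The function is a made-up example, not code from the patch:

```python
def filter_even(values: list[int], ascending: bool = True) -> list[int]:
    """Return the even numbers from `values`.

    Args:
        values: The numbers to filter.
        ascending: If :data:`True`, sort the result in ascending order;
            if :data:`False`, in descending order.

    Returns:
        The even numbers from `values`, sorted according to `ascending`.
    """
    return sorted((v for v in values if v % 2 == 0), reverse=not ascending)
```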
* Re: [PATCH v7 17/21] dts: node docstring update
2023-11-22 12:18 ` Yoan Picchi
@ 2023-11-22 13:28 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 13:28 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Wed, Nov 22, 2023 at 1:18 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/testbed_model/node.py | 191 +++++++++++++++++++---------
> > 1 file changed, 131 insertions(+), 60 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> > index fa5b143cdd..f93b4acecd 100644
> > --- a/dts/framework/testbed_model/node.py
> > +++ b/dts/framework/testbed_model/node.py
> > @@ -3,8 +3,13 @@
> > # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2022-2023 University of New Hampshire
> >
> > -"""
> > -A node is a generic host that DTS connects to and manages.
> > +"""Common functionality for node management.
> > +
> > +A node is any host/server DTS connects to.
> > +
> > +The base class, :class:`Node`, provides functionality common to all nodes and is supposed
> > +to be extended by subclasses with functionality specific to each node type.
>
> functionality -> functionalities
>
Ack.
> > def create_session(
> > node_config: NodeConfiguration, name: str, logger: DTSLOG
> > ) -> OSSession:
> > + """Factory for OS-aware sessions.
> > +
> > + Args:
> > + node_config: The test run configuration of the node to connect to.
> > + name: The name of the session.
> > + logger: The logger instance this session will use.
> > + """
> > match node_config.os:
> > case OS.linux:
> > return LinuxSession(node_config, name, logger)
>
* [PATCH v7 18/21] dts: sut and tg nodes docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (16 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 17/21] dts: node " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-22 13:12 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 19/21] dts: base traffic generators " Juraj Linkeš
` (3 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/sut_node.py | 224 ++++++++++++++++--------
dts/framework/testbed_model/tg_node.py | 42 +++--
2 files changed, 173 insertions(+), 93 deletions(-)
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 17deea06e2..123b16fee0 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -3,6 +3,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""System under test (DPDK + hardware) node.
+
+A system under test (SUT) is the combination of DPDK
+and the hardware we're testing with DPDK (NICs, crypto and other devices).
+An SUT node is where this SUT runs.
+"""
+
+
import os
import tarfile
import time
@@ -26,6 +34,11 @@
class EalParameters(object):
+ """The environment abstraction layer parameters.
+
+ The string representation can be created by converting the instance to a string.
+ """
+
def __init__(
self,
lcore_list: LogicalCoreList,
@@ -35,21 +48,23 @@ def __init__(
vdevs: list[VirtualDevice],
other_eal_param: str,
):
- """
- Generate eal parameters character string;
- :param lcore_list: the list of logical cores to use.
- :param memory_channels: the number of memory channels to use.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
+ """Initialize the parameters according to inputs.
+
+ Process the parameters into the format used on the command line.
+
+ Args:
+ lcore_list: The list of logical cores to use.
+ memory_channels: The number of memory channels to use.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: user defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``
"""
self._lcore_list = f"-l {lcore_list}"
self._memory_channels = f"-n {memory_channels}"
@@ -61,6 +76,7 @@ def __init__(
self._other_eal_param = other_eal_param
def __str__(self) -> str:
+ """Create the EAL string."""
return (
f"{self._lcore_list} "
f"{self._memory_channels} "
@@ -72,11 +88,21 @@ def __str__(self) -> str:
class SutNode(Node):
- """
- A class for managing connections to the System under Test, providing
- methods that retrieve the necessary information about the node (such as
- CPU, memory and NIC details) and configuration capabilities.
- Another key capability is building DPDK according to given build target.
+ """The system under test node.
+
+ The SUT node extends :class:`Node` with DPDK specific features:
+
+ * DPDK build,
+ * Gathering of DPDK build info,
+ * The running of DPDK apps, interactively or one-time execution,
+ * DPDK apps cleanup.
+
+ The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
+ environment variable configure the path to the DPDK tarball
+ or the git commit ID, tag ID or tree ID to test.
+
+ Attributes:
+ config: The SUT node configuration.
"""
config: SutNodeConfiguration
@@ -94,6 +120,11 @@ class SutNode(Node):
_path_to_devbind_script: PurePath | None
def __init__(self, node_config: SutNodeConfiguration):
+ """Extend the constructor with SUT node specifics.
+
+ Args:
+ node_config: The SUT node's test run configuration.
+ """
super(SutNode, self).__init__(node_config)
self._dpdk_prefix_list = []
self._build_target_config = None
@@ -113,6 +144,12 @@ def __init__(self, node_config: SutNodeConfiguration):
@property
def _remote_dpdk_dir(self) -> PurePath:
+ """The remote DPDK dir.
+
+ This internal property should be set after extracting the DPDK tarball. If it's not set,
+ that implies the DPDK setup step has been skipped, in which case we can guess where
+ a previous build was located.
+ """
if self.__remote_dpdk_dir is None:
self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
return self.__remote_dpdk_dir
@@ -123,6 +160,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
@property
def remote_dpdk_build_dir(self) -> PurePath:
+ """The remote DPDK build directory.
+
+ This is the directory where DPDK was built.
+ We assume it was built in a subdirectory of the extracted tarball.
+ """
if self._build_target_config:
return self.main_session.join_remote_path(
self._remote_dpdk_dir, self._build_target_config.name
@@ -132,6 +174,7 @@ def remote_dpdk_build_dir(self) -> PurePath:
@property
def dpdk_version(self) -> str:
+ """Last built DPDK version."""
if self._dpdk_version is None:
self._dpdk_version = self.main_session.get_dpdk_version(
self._remote_dpdk_dir
@@ -140,12 +183,14 @@ def dpdk_version(self) -> str:
@property
def node_info(self) -> NodeInfo:
+ """Additional node information."""
if self._node_info is None:
self._node_info = self.main_session.get_node_info()
return self._node_info
@property
def compiler_version(self) -> str:
+ """The node's compiler version."""
if self._compiler_version is None:
if self._build_target_config is not None:
self._compiler_version = self.main_session.get_compiler_version(
@@ -161,6 +206,7 @@ def compiler_version(self) -> str:
@property
def path_to_devbind_script(self) -> PurePath:
+ """The path to the dpdk-devbind.py script on the node."""
if self._path_to_devbind_script is None:
self._path_to_devbind_script = self.main_session.join_remote_path(
self._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
@@ -168,6 +214,11 @@ def path_to_devbind_script(self) -> PurePath:
return self._path_to_devbind_script
def get_build_target_info(self) -> BuildTargetInfo:
+ """Get additional build target information.
+
+ Returns:
+ The build target information.
+ """
return BuildTargetInfo(
dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
)
@@ -178,8 +229,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
def _set_up_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Setup DPDK on the SUT node.
+ """Setup DPDK on the SUT node.
+
+ Additional build target setup steps on top of those in :class:`Node`.
"""
# we want to ensure that dpdk_version and compiler_version is reset for new
# build targets
@@ -200,9 +252,7 @@ def _tear_down_build_target(self) -> None:
def _configure_build_target(
self, build_target_config: BuildTargetConfiguration
) -> None:
- """
- Populate common environment variables and set build target config.
- """
+ """Populate common environment variables and set build target config."""
self._env_vars = {}
self._build_target_config = build_target_config
self._env_vars.update(
@@ -217,9 +267,7 @@ def _configure_build_target(
@Node.skip_setup
def _copy_dpdk_tarball(self) -> None:
- """
- Copy to and extract DPDK tarball on the SUT node.
- """
+ """Copy to and extract DPDK tarball on the SUT node."""
self._logger.info("Copying DPDK tarball to SUT.")
self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
@@ -250,8 +298,9 @@ def _copy_dpdk_tarball(self) -> None:
@Node.skip_setup
def _build_dpdk(self) -> None:
- """
- Build DPDK. Uses the already configured target. Assumes that the tarball has
+ """Build DPDK.
+
+ Uses the already configured target. Assumes that the tarball has
already been copied to and extracted on the SUT node.
"""
self.main_session.build_dpdk(
@@ -262,15 +311,19 @@ def _build_dpdk(self) -> None:
)
def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePath:
- """
- Build one or all DPDK apps. Requires DPDK to be already built on the SUT node.
- When app_name is 'all', build all example apps.
- When app_name is any other string, tries to build that example app.
- Return the directory path of the built app. If building all apps, return
- the path to the examples directory (where all apps reside).
- The meson_dpdk_args are keyword arguments
- found in meson_option.txt in root DPDK directory. Do not use -D with them,
- for example: enable_kmods=True.
+ """Build one or all DPDK apps.
+
+ Requires DPDK to be already built on the SUT node.
+
+ Args:
+ app_name: The name of the DPDK app to build.
+ When `app_name` is ``all``, build all example apps.
+ meson_dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Returns:
+ The directory path of the built app. If building all apps, return
+ the path to the examples directory (where all apps reside).
"""
self.main_session.build_dpdk(
self._env_vars,
@@ -291,9 +344,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
)
def kill_cleanup_dpdk_apps(self) -> None:
- """
- Kill all dpdk applications on the SUT. Cleanup hugepages.
- """
+ """Kill all dpdk applications on the SUT, then clean up hugepages."""
if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
# we can use the session if it exists and responds
self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
@@ -312,33 +363,34 @@ def create_eal_parameters(
vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
- """
- Generate eal parameters character string;
- :param lcore_filter_specifier: a number of lcores/cores/sockets to use
- or a list of lcore ids to use.
- The default will select one lcore for each of two cores
- on one socket, in ascending order of core ids.
- :param ascending_cores: True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the
- highest id and continue in descending order. This ordering
- affects which sockets to consider first as well.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param append_prefix_timestamp: if True, will append a timestamp to
- DPDK file prefix.
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
- :return: eal param string, eg:
- '-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420';
- """
+ """Compose the EAL parameters.
+
+ Process the list of cores and the DPDK prefix and pass that along with
+ the rest of the arguments.
+
+ Args:
+ lcore_filter_specifier: A number of lcores/cores/sockets to use
+ or a list of lcore ids to use.
+ The default will select one lcore for each of two cores
+ on one socket, in ascending order of core ids.
+ ascending_cores: Sort cores in ascending order (lowest to highest IDs).
+ If :data:`False`, sort in descending order.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ append_prefix_timestamp: If :data:`True`, will append a timestamp to DPDK file prefix.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: user defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``.
+
+ Returns:
+ An EAL param string, such as
+ ``-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420``.
+ """
lcore_list = LogicalCoreList(
self.filter_lcores(lcore_filter_specifier, ascending_cores)
)
@@ -364,14 +416,29 @@ def create_eal_parameters(
def run_dpdk_app(
self, app_path: PurePath, eal_args: "EalParameters", timeout: float = 30
) -> CommandResult:
- """
- Run DPDK application on the remote node.
+ """Run DPDK application on the remote node.
+
+ The application is not run interactively - the command that starts the application
+ is executed and then the call waits for it to finish execution.
+
+ Args:
+ app_path: The remote path to the DPDK application.
+ eal_args: EAL parameters to run the DPDK application with.
+ timeout: Wait at most this long in seconds to execute the command.
+
+ Returns:
+ The result of the DPDK app execution.
"""
return self.main_session.send_command(
f"{app_path} {eal_args}", timeout, privileged=True, verify=True
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Enable/disable IPv4 forwarding on the node.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
+ """
self.main_session.configure_ipv4_forwarding(enable)
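On Linux, enabling IPv4 forwarding typically comes down to a sysctl. A sketch of the command the OS session might issue — the actual `configure_ipv4_forwarding` implementation may differ:

```python
def ipv4_forwarding_command(enable: bool) -> str:
    # Map the boolean to the sysctl value; 1 enables kernel IPv4 forwarding.
    state = 1 if enable else 0
    return f"sysctl -w net.ipv4.ip_forward={state}"


print(ipv4_forwarding_command(True))   # -> sysctl -w net.ipv4.ip_forward=1
print(ipv4_forwarding_command(False))  # -> sysctl -w net.ipv4.ip_forward=0
```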
def create_interactive_shell(
@@ -381,9 +448,13 @@ def create_interactive_shell(
privileged: bool = False,
eal_parameters: EalParameters | str | None = None,
) -> InteractiveShellType:
- """Factory method for creating a handler for an interactive session.
+ """Extend the factory for interactive session handlers.
+
+ The extensions are SUT node specific:
- Instantiate shell_cls according to the remote OS specifics.
+ * The default for `eal_parameters`,
+ * The interactive shell path `shell_cls.path` is prepended with the path to the remote
+ DPDK build directory for DPDK apps.
Args:
shell_cls: The class of the shell.
@@ -393,9 +464,10 @@ def create_interactive_shell(
privileged: Whether to run the shell with administrative privileges.
eal_parameters: List of EAL parameters to use to launch the app. If this
isn't provided or an empty string is passed, it will default to calling
- create_eal_parameters().
+ :meth:`create_eal_parameters`.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not eal_parameters:
eal_parameters = self.create_eal_parameters()
@@ -414,8 +486,8 @@ def bind_ports_to_driver(self, for_dpdk: bool = True) -> None:
"""Bind all ports on the SUT to a driver.
Args:
- for_dpdk: Boolean that, when True, binds ports to os_driver_for_dpdk
- or, when False, binds to os_driver. Defaults to True.
+ for_dpdk: If :data:`True`, binds ports to os_driver_for_dpdk.
+ If :data:`False`, binds to os_driver.
"""
for port in self.ports:
driver = port.os_driver_for_dpdk if for_dpdk else port.os_driver
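The per-port driver selection in `bind_ports_to_driver` can be sketched as below; `PortSketch` and the exact devbind invocation are illustrative stand-ins for the real DTS types and script call:

```python
from dataclasses import dataclass


@dataclass
class PortSketch:
    pci: str
    os_driver: str
    os_driver_for_dpdk: str


def devbind_commands(
    ports: list[PortSketch], for_dpdk: bool, script: str = "usertools/dpdk-devbind.py"
) -> list[str]:
    cmds = []
    for port in ports:
        # Same selection as bind_ports_to_driver: the DPDK-capable driver
        # when binding for DPDK, the kernel driver otherwise.
        driver = port.os_driver_for_dpdk if for_dpdk else port.os_driver
        cmds.append(f"{script} -b {driver} --force {port.pci}")
    return cmds


port = PortSketch("0000:88:00.0", "ixgbe", "vfio-pci")
print(devbind_commands([port], for_dpdk=True))
# -> ['usertools/dpdk-devbind.py -b vfio-pci --force 0000:88:00.0']
```

Binding back to `os_driver` (`for_dpdk=False`) is what returns the NICs to the kernel after a test run.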
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 166eb8430e..69eb33ccb1 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -5,13 +5,8 @@
"""Traffic generator node.
-This is the node where the traffic generator resides.
-The distinction between a node and a traffic generator is as follows:
-A node is a host that DTS connects to. It could be a baremetal server,
-a VM or a container.
-A traffic generator is software running on the node.
-A traffic generator node is a node running a traffic generator.
-A node can be a traffic generator node as well as system under test node.
+A traffic generator (TG) generates traffic that's sent towards the SUT node.
+A TG node is where the TG runs.
"""
from scapy.packet import Packet # type: ignore[import]
@@ -24,13 +19,16 @@
class TGNode(Node):
- """Manage connections to a node with a traffic generator.
+ """The traffic generator node.
- Apart from basic node management capabilities, the Traffic Generator node has
- specialized methods for handling the traffic generator running on it.
+ The TG node extends :class:`Node` with TG specific features:
- Arguments:
- node_config: The user configuration of the traffic generator node.
+ * Traffic generator initialization,
+ * The sending of traffic and receiving packets,
+ * The sending of traffic without receiving packets.
+
+ Not all traffic generators are capable of capturing traffic, which is why there
+ must be a way to send traffic without that.
Attributes:
traffic_generator: The traffic generator running on the node.
@@ -39,6 +37,13 @@ class TGNode(Node):
traffic_generator: CapturingTrafficGenerator
def __init__(self, node_config: TGNodeConfiguration):
+ """Extend the constructor with TG node specifics.
+
+ Initialize the traffic generator on the TG node.
+
+ Args:
+ node_config: The TG node's test run configuration.
+ """
super(TGNode, self).__init__(node_config)
self.traffic_generator = create_traffic_generator(
self, node_config.traffic_generator
@@ -52,17 +57,17 @@ def send_packet_and_capture(
receive_port: Port,
duration: float = 1,
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet`, return received traffic.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given duration. Also record the captured traffic
in a pcap file.
Args:
packet: The packet to send.
send_port: The egress port on the TG node.
receive_port: The ingress port in the TG node.
- duration: Capture traffic for this amount of time after sending the packet.
+ duration: Capture traffic for this amount of time after sending `packet`.
Returns:
A list of received packets. May be empty if no packets are captured.
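The send-then-collect flow of `send_packet_and_capture` can be shown without real hardware or scapy; the injected `send` and `capture` callables below are stand-ins for the capturing traffic generator (the real implementation also records a pcap):

```python
from typing import Callable


def send_packet_and_capture_sketch(
    packet: str,
    send: Callable[[str], None],
    capture: Callable[[float], list[str]],
    duration: float = 1.0,
) -> list[str]:
    # Send on the egress port, then return whatever the ingress port
    # saw within `duration` seconds (capture is simulated here).
    send(packet)
    return capture(duration)


# Simulated wire: packets sent on the egress port show up on the ingress port.
captured_wire: list[str] = []
received = send_packet_and_capture_sketch(
    "ICMP echo request", captured_wire.append, lambda duration: list(captured_wire)
)
print(received)  # -> ['ICMP echo request']
```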
@@ -72,6 +77,9 @@ def send_packet_and_capture(
)
def close(self) -> None:
- """Free all resources used by the node"""
+ """Free all resources used by the node.
+
+ This extends the superclass method with TG cleanup.
+ """
self.traffic_generator.close()
super(TGNode, self).close()
--
2.34.1
* Re: [PATCH v7 18/21] dts: sut and tg nodes docstring update
2023-11-15 13:09 ` [PATCH v7 18/21] dts: sut and tg nodes " Juraj Linkeš
@ 2023-11-22 13:12 ` Yoan Picchi
2023-11-22 13:34 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-22 13:12 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/testbed_model/sut_node.py | 224 ++++++++++++++++--------
> dts/framework/testbed_model/tg_node.py | 42 +++--
> 2 files changed, 173 insertions(+), 93 deletions(-)
>
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index 17deea06e2..123b16fee0 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -3,6 +3,14 @@
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2023 University of New Hampshire
>
> +"""System under test (DPDK + hardware) node.
> +
> +A system under test (SUT) is the combination of DPDK
> +and the hardware we're testing with DPDK (NICs, crypto and other devices).
> +An SUT node is where this SUT runs.
> +"""
> +
> +
> import os
> import tarfile
> import time
> @@ -26,6 +34,11 @@
>
>
> class EalParameters(object):
> + """The environment abstraction layer parameters.
> +
> + The string representation can be created by converting the instance to a string.
> + """
> +
> def __init__(
> self,
> lcore_list: LogicalCoreList,
> @@ -35,21 +48,23 @@ def __init__(
> vdevs: list[VirtualDevice],
> other_eal_param: str,
> ):
> - """
> - Generate eal parameters character string;
> - :param lcore_list: the list of logical cores to use.
> - :param memory_channels: the number of memory channels to use.
> - :param prefix: set file prefix string, eg:
> - prefix='vf'
> - :param no_pci: switch of disable PCI bus eg:
> - no_pci=True
> - :param vdevs: virtual device list, eg:
> - vdevs=[
> - VirtualDevice('net_ring0'),
> - VirtualDevice('net_ring1')
> - ]
> - :param other_eal_param: user defined DPDK eal parameters, eg:
> - other_eal_param='--single-file-segments'
> + """Initialize the parameters according to inputs.
> +
> + Process the parameters into the format used on the command line.
> +
> + Args:
> + lcore_list: The list of logical cores to use.
> + memory_channels: The number of memory channels to use.
> + prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
> + no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
> + vdevs: Virtual devices, e.g.::
> +
> + vdevs=[
> + VirtualDevice('net_ring0'),
> + VirtualDevice('net_ring1')
> + ]
> + other_eal_param: user defined DPDK EAL parameters, e.g.:
> + ``other_eal_param='--single-file-segments'``
> """
> self._lcore_list = f"-l {lcore_list}"
> self._memory_channels = f"-n {memory_channels}"
> @@ -61,6 +76,7 @@ def __init__(
> self._other_eal_param = other_eal_param
>
> def __str__(self) -> str:
> + """Create the EAL string."""
> return (
> f"{self._lcore_list} "
> f"{self._memory_channels} "
> @@ -72,11 +88,21 @@ def __str__(self) -> str:
>
>
> class SutNode(Node):
> - """
> - A class for managing connections to the System under Test, providing
> - methods that retrieve the necessary information about the node (such as
> - CPU, memory and NIC details) and configuration capabilities.
> - Another key capability is building DPDK according to given build target.
> + """The system under test node.
> +
> + The SUT node extends :class:`Node` with DPDK specific features:
> +
> + * DPDK build,
> + * Gathering of DPDK build info,
> + * The running of DPDK apps, interactively or one-time execution,
> + * DPDK apps cleanup.
> +
> + The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
> + environment variable configure the path to the DPDK tarball
> + or the git commit ID, tag ID or tree ID to test.
I just want to make sure. We use the --tarball option also to set a git
commit id instead of a tarball as the source?
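One way such a dual-purpose `--tarball` value could be interpreted is by checking whether it names an existing file; this is a hypothetical sketch to illustrate the question, not the actual DTS settings code:

```python
import os


def resolve_dpdk_source(value: str) -> str:
    """Classify a --tarball/DTS_DPDK_TARBALL value (hypothetical sketch).

    If the value points at an existing file, treat it as a tarball path;
    otherwise treat it as a git ref (commit, tag or tree ID) to archive.
    """
    if os.path.exists(value):
        return "tarball"
    return "git-ref"


print(resolve_dpdk_source("v23.11"))  # likely 'git-ref' unless a file named v23.11 exists
```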
> +
> + Attributes:
> + config: The SUT node configuration.
> """
>
> config: SutNodeConfiguration
> @@ -94,6 +120,11 @@ class SutNode(Node):
> _path_to_devbind_script: PurePath | None
>
> def __init__(self, node_config: SutNodeConfiguration):
> + """Extend the constructor with SUT node specifics.
> +
> + Args:
> + node_config: The SUT node's test run configuration.
> + """
> super(SutNode, self).__init__(node_config)
> self._dpdk_prefix_list = []
> self._build_target_config = None
> @@ -113,6 +144,12 @@ def __init__(self, node_config: SutNodeConfiguration):
>
> @property
> def _remote_dpdk_dir(self) -> PurePath:
> + """The remote DPDK dir.
> +
> + This internal property should be set after extracting the DPDK tarball. If it's not set,
> + that implies the DPDK setup step has been skipped, in which case we can guess where
> + a previous build was located.
> + """
> if self.__remote_dpdk_dir is None:
> self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
> return self.__remote_dpdk_dir
> @@ -123,6 +160,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
>
> @property
> def remote_dpdk_build_dir(self) -> PurePath:
> + """The remote DPDK build directory.
> +
> + This is the directory where DPDK was built.
> + We assume it was built in a subdirectory of the extracted tarball.
> + """
> if self._build_target_config:
> return self.main_session.join_remote_path(
> self._remote_dpdk_dir, self._build_target_config.name
> @@ -132,6 +174,7 @@ def remote_dpdk_build_dir(self) -> PurePath:
>
> @property
> def dpdk_version(self) -> str:
> + """Last built DPDK version."""
> if self._dpdk_version is None:
> self._dpdk_version = self.main_session.get_dpdk_version(
> self._remote_dpdk_dir
> @@ -140,12 +183,14 @@ def dpdk_version(self) -> str:
>
> @property
> def node_info(self) -> NodeInfo:
> + """Additional node information."""
> if self._node_info is None:
> self._node_info = self.main_session.get_node_info()
> return self._node_info
>
> @property
> def compiler_version(self) -> str:
> + """The node's compiler version."""
> if self._compiler_version is None:
> if self._build_target_config is not None:
> self._compiler_version = self.main_session.get_compiler_version(
> @@ -161,6 +206,7 @@ def compiler_version(self) -> str:
>
> @property
> def path_to_devbind_script(self) -> PurePath:
> + """The path to the dpdk-devbind.py script on the node."""
> if self._path_to_devbind_script is None:
> self._path_to_devbind_script = self.main_session.join_remote_path(
> self._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
> @@ -168,6 +214,11 @@ def path_to_devbind_script(self) -> PurePath:
> return self._path_to_devbind_script
>
> def get_build_target_info(self) -> BuildTargetInfo:
> + """Get additional build target information.
> +
> + Returns:
> + The build target information.
> + """
> return BuildTargetInfo(
> dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
> )
> @@ -178,8 +229,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
> def _set_up_build_target(
> self, build_target_config: BuildTargetConfiguration
> ) -> None:
> - """
> - Setup DPDK on the SUT node.
> + """Setup DPDK on the SUT node.
> +
> + Additional build target setup steps on top of those in :class:`Node`.
> """
> # we want to ensure that dpdk_version and compiler_version is reset for new
> # build targets
> @@ -200,9 +252,7 @@ def _tear_down_build_target(self) -> None:
> def _configure_build_target(
> self, build_target_config: BuildTargetConfiguration
> ) -> None:
> - """
> - Populate common environment variables and set build target config.
> - """
> + """Populate common environment variables and set build target config."""
> self._env_vars = {}
> self._build_target_config = build_target_config
> self._env_vars.update(
> @@ -217,9 +267,7 @@ def _configure_build_target(
>
> @Node.skip_setup
> def _copy_dpdk_tarball(self) -> None:
> - """
> - Copy to and extract DPDK tarball on the SUT node.
> - """
> + """Copy to and extract DPDK tarball on the SUT node."""
> self._logger.info("Copying DPDK tarball to SUT.")
> self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
>
> @@ -250,8 +298,9 @@ def _copy_dpdk_tarball(self) -> None:
>
> @Node.skip_setup
> def _build_dpdk(self) -> None:
> - """
> - Build DPDK. Uses the already configured target. Assumes that the tarball has
> + """Build DPDK.
> +
> + Uses the already configured target. Assumes that the tarball has
> already been copied to and extracted on the SUT node.
> """
> self.main_session.build_dpdk(
> @@ -262,15 +311,19 @@ def _build_dpdk(self) -> None:
> )
>
> def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePath:
> - """
> - Build one or all DPDK apps. Requires DPDK to be already built on the SUT node.
> - When app_name is 'all', build all example apps.
> - When app_name is any other string, tries to build that example app.
> - Return the directory path of the built app. If building all apps, return
> - the path to the examples directory (where all apps reside).
> - The meson_dpdk_args are keyword arguments
> - found in meson_option.txt in root DPDK directory. Do not use -D with them,
> - for example: enable_kmods=True.
> + """Build one or all DPDK apps.
> +
> + Requires DPDK to be already built on the SUT node.
> +
> + Args:
> + app_name: The name of the DPDK app to build.
> + When `app_name` is ``all``, build all example apps.
> + meson_dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
> + Do not use ``-D`` with them.
> +
> + Returns:
> + The directory path of the built app. If building all apps, return
> + the path to the examples directory (where all apps reside).
> """
> self.main_session.build_dpdk(
> self._env_vars,
> @@ -291,9 +344,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
> )
>
> def kill_cleanup_dpdk_apps(self) -> None:
> - """
> - Kill all dpdk applications on the SUT. Cleanup hugepages.
> - """
> + """Kill all dpdk applications on the SUT, then clean up hugepages."""
> if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
> # we can use the session if it exists and responds
> self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
> @@ -312,33 +363,34 @@ def create_eal_parameters(
> vdevs: list[VirtualDevice] | None = None,
> other_eal_param: str = "",
> ) -> "EalParameters":
> - """
> - Generate eal parameters character string;
> - :param lcore_filter_specifier: a number of lcores/cores/sockets to use
> - or a list of lcore ids to use.
> - The default will select one lcore for each of two cores
> - on one socket, in ascending order of core ids.
> - :param ascending_cores: True, use cores with the lowest numerical id first
> - and continue in ascending order. If False, start with the
> - highest id and continue in descending order. This ordering
> - affects which sockets to consider first as well.
> - :param prefix: set file prefix string, eg:
> - prefix='vf'
> - :param append_prefix_timestamp: if True, will append a timestamp to
> - DPDK file prefix.
> - :param no_pci: switch of disable PCI bus eg:
> - no_pci=True
> - :param vdevs: virtual device list, eg:
> - vdevs=[
> - VirtualDevice('net_ring0'),
> - VirtualDevice('net_ring1')
> - ]
> - :param other_eal_param: user defined DPDK eal parameters, eg:
> - other_eal_param='--single-file-segments'
> - :return: eal param string, eg:
> - '-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420';
> - """
> + """Compose the EAL parameters.
> +
> + Process the list of cores and the DPDK prefix and pass that along with
> + the rest of the arguments.
>
> + Args:
> + lcore_filter_specifier: A number of lcores/cores/sockets to use
> + or a list of lcore ids to use.
> + The default will select one lcore for each of two cores
> + on one socket, in ascending order of core ids.
> + ascending_cores: Sort cores in ascending order (lowest to highest IDs).
> + If :data:`False`, sort in descending order.
> + prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
> + append_prefix_timestamp: If :data:`True`, will append a timestamp to DPDK file prefix.
> + no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
> + vdevs: Virtual devices, e.g.::
> +
> + vdevs=[
> + VirtualDevice('net_ring0'),
> + VirtualDevice('net_ring1')
> + ]
> + other_eal_param: User-defined DPDK EAL parameters, e.g.:
> + ``other_eal_param='--single-file-segments'``.
> +
> + Returns:
> + An EAL param string, such as
> + ``-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420``.
> + """
> lcore_list = LogicalCoreList(
> self.filter_lcores(lcore_filter_specifier, ascending_cores)
> )
> @@ -364,14 +416,29 @@ def create_eal_parameters(
> def run_dpdk_app(
> self, app_path: PurePath, eal_args: "EalParameters", timeout: float = 30
> ) -> CommandResult:
> - """
> - Run DPDK application on the remote node.
> + """Run DPDK application on the remote node.
> +
> + The application is not run interactively - the command that starts the application
> + is executed and then the call waits for it to finish execution.
> +
> + Args:
> + app_path: The remote path to the DPDK application.
> + eal_args: EAL parameters to run the DPDK application with.
> + timeout: Wait at most this long in seconds to execute the command.
confusing timeout
> +
> + Returns:
> + The result of the DPDK app execution.
> """
> return self.main_session.send_command(
> f"{app_path} {eal_args}", timeout, privileged=True, verify=True
> )
>
> def configure_ipv4_forwarding(self, enable: bool) -> None:
> + """Enable/disable IPv4 forwarding on the node.
> +
> + Args:
> + enable: If :data:`True`, enable the forwarding, otherwise disable it.
> + """
> self.main_session.configure_ipv4_forwarding(enable)
>
> def create_interactive_shell(
> @@ -381,9 +448,13 @@ def create_interactive_shell(
> privileged: bool = False,
> eal_parameters: EalParameters | str | None = None,
> ) -> InteractiveShellType:
> - """Factory method for creating a handler for an interactive session.
> + """Extend the factory for interactive session handlers.
> +
> + The extensions are SUT node specific:
>
> - Instantiate shell_cls according to the remote OS specifics.
> + * The default for `eal_parameters`,
> + * The interactive shell path `shell_cls.path` is prepended with the path to the remote
> + DPDK build directory for DPDK apps.
>
> Args:
> shell_cls: The class of the shell.
> @@ -393,9 +464,10 @@ def create_interactive_shell(
> privileged: Whether to run the shell with administrative privileges.
> eal_parameters: List of EAL parameters to use to launch the app. If this
> isn't provided or an empty string is passed, it will default to calling
> - create_eal_parameters().
> + :meth:`create_eal_parameters`.
> +
> Returns:
> - Instance of the desired interactive application.
> + An instance of the desired interactive application shell.
> """
> if not eal_parameters:
> eal_parameters = self.create_eal_parameters()
> @@ -414,8 +486,8 @@ def bind_ports_to_driver(self, for_dpdk: bool = True) -> None:
> """Bind all ports on the SUT to a driver.
>
> Args:
> - for_dpdk: Boolean that, when True, binds ports to os_driver_for_dpdk
> - or, when False, binds to os_driver. Defaults to True.
> + for_dpdk: If :data:`True`, binds ports to os_driver_for_dpdk.
> + If :data:`False`, binds to os_driver.
> """
> for port in self.ports:
> driver = port.os_driver_for_dpdk if for_dpdk else port.os_driver
> diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
> index 166eb8430e..69eb33ccb1 100644
> --- a/dts/framework/testbed_model/tg_node.py
> +++ b/dts/framework/testbed_model/tg_node.py
> @@ -5,13 +5,8 @@
>
> """Traffic generator node.
>
> -This is the node where the traffic generator resides.
> -The distinction between a node and a traffic generator is as follows:
> -A node is a host that DTS connects to. It could be a baremetal server,
> -a VM or a container.
> -A traffic generator is software running on the node.
> -A traffic generator node is a node running a traffic generator.
> -A node can be a traffic generator node as well as system under test node.
> +A traffic generator (TG) generates traffic that's sent towards the SUT node.
> +A TG node is where the TG runs.
> """
>
> from scapy.packet import Packet # type: ignore[import]
> @@ -24,13 +19,16 @@
>
>
> class TGNode(Node):
> - """Manage connections to a node with a traffic generator.
> + """The traffic generator node.
>
> - Apart from basic node management capabilities, the Traffic Generator node has
> - specialized methods for handling the traffic generator running on it.
> + The TG node extends :class:`Node` with TG specific features:
>
> - Arguments:
> - node_config: The user configuration of the traffic generator node.
> + * Traffic generator initialization,
> + * The sending of traffic and receiving packets,
> + * The sending of traffic without receiving packets.
> +
> + Not all traffic generators are capable of capturing traffic, which is why there
> + must be a way to send traffic without that.
>
> Attributes:
> traffic_generator: The traffic generator running on the node.
> @@ -39,6 +37,13 @@ class TGNode(Node):
> traffic_generator: CapturingTrafficGenerator
>
> def __init__(self, node_config: TGNodeConfiguration):
> + """Extend the constructor with TG node specifics.
> +
> + Initialize the traffic generator on the TG node.
> +
> + Args:
> + node_config: The TG node's test run configuration.
> + """
> super(TGNode, self).__init__(node_config)
> self.traffic_generator = create_traffic_generator(
> self, node_config.traffic_generator
> @@ -52,17 +57,17 @@ def send_packet_and_capture(
> receive_port: Port,
> duration: float = 1,
> ) -> list[Packet]:
> - """Send a packet, return received traffic.
> + """Send `packet`, return received traffic.
>
> - Send a packet on the send_port and then return all traffic captured
> - on the receive_port for the given duration. Also record the captured traffic
> + Send `packet` on `send_port` and then return all traffic captured
> + on `receive_port` for the given duration. Also record the captured traffic
> in a pcap file.
>
> Args:
> packet: The packet to send.
> send_port: The egress port on the TG node.
> receive_port: The ingress port in the TG node.
> - duration: Capture traffic for this amount of time after sending the packet.
> + duration: Capture traffic for this amount of time after sending `packet`.
>
> Returns:
> A list of received packets. May be empty if no packets are captured.
> @@ -72,6 +77,9 @@ def send_packet_and_capture(
> )
>
> def close(self) -> None:
> - """Free all resources used by the node"""
> + """Free all resources used by the node.
> +
> + This extends the superclass method with TG cleanup.
> + """
> self.traffic_generator.close()
> super(TGNode, self).close()
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 18/21] dts: sut and tg nodes docstring update
2023-11-22 13:12 ` Yoan Picchi
@ 2023-11-22 13:34 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 13:34 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Wed, Nov 22, 2023 at 2:13 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/framework/testbed_model/sut_node.py | 224 ++++++++++++++++--------
> > dts/framework/testbed_model/tg_node.py | 42 +++--
> > 2 files changed, 173 insertions(+), 93 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> > index 17deea06e2..123b16fee0 100644
> > --- a/dts/framework/testbed_model/sut_node.py
> > +++ b/dts/framework/testbed_model/sut_node.py
> > @@ -3,6 +3,14 @@
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> > # Copyright(c) 2023 University of New Hampshire
> >
> > +"""System under test (DPDK + hardware) node.
> > +
> > +A system under test (SUT) is the combination of DPDK
> > +and the hardware we're testing with DPDK (NICs, crypto and other devices).
> > +An SUT node is where this SUT runs.
> > +"""
> > +
> > +
> > import os
> > import tarfile
> > import time
> > @@ -26,6 +34,11 @@
> >
> >
> > class EalParameters(object):
> > + """The environment abstraction layer parameters.
> > +
> > + The string representation can be created by converting the instance to a string.
> > + """
> > +
> > def __init__(
> > self,
> > lcore_list: LogicalCoreList,
> > @@ -35,21 +48,23 @@ def __init__(
> > vdevs: list[VirtualDevice],
> > other_eal_param: str,
> > ):
> > - """
> > - Generate eal parameters character string;
> > - :param lcore_list: the list of logical cores to use.
> > - :param memory_channels: the number of memory channels to use.
> > - :param prefix: set file prefix string, eg:
> > - prefix='vf'
> > - :param no_pci: switch of disable PCI bus eg:
> > - no_pci=True
> > - :param vdevs: virtual device list, eg:
> > - vdevs=[
> > - VirtualDevice('net_ring0'),
> > - VirtualDevice('net_ring1')
> > - ]
> > - :param other_eal_param: user defined DPDK eal parameters, eg:
> > - other_eal_param='--single-file-segments'
> > + """Initialize the parameters according to inputs.
> > +
> > + Process the parameters into the format used on the command line.
> > +
> > + Args:
> > + lcore_list: The list of logical cores to use.
> > + memory_channels: The number of memory channels to use.
> > + prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
> > + no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
> > + vdevs: Virtual devices, e.g.::
> > +
> > + vdevs=[
> > + VirtualDevice('net_ring0'),
> > + VirtualDevice('net_ring1')
> > + ]
> > + other_eal_param: User-defined DPDK EAL parameters, e.g.:
> > + ``other_eal_param='--single-file-segments'``
> > """
> > self._lcore_list = f"-l {lcore_list}"
> > self._memory_channels = f"-n {memory_channels}"
> > @@ -61,6 +76,7 @@ def __init__(
> > self._other_eal_param = other_eal_param
> >
> > def __str__(self) -> str:
> > + """Create the EAL string."""
> > return (
> > f"{self._lcore_list} "
> > f"{self._memory_channels} "
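
For reference while reviewing, the string composition can be sketched standalone like this. The attribute names mirror the EalParameters class in the patch, but the exact formatting of the prefix/no_pci/vdev pieces is my approximation based on the example output string (`-c 0xf -a ... --file-prefix=...`), not a copy of the real implementation:

```python
# Standalone sketch of EalParameters string composition (approximation).
class EalParametersSketch:
    def __init__(self, lcore_list, memory_channels, prefix="", no_pci=False,
                 vdevs=(), other_eal_param=""):
        self._lcore_list = f"-l {lcore_list}"
        self._memory_channels = f"-n {memory_channels}"
        self._prefix = f"--file-prefix={prefix}" if prefix else ""
        self._no_pci = "--no-pci" if no_pci else ""
        self._vdevs = " ".join(f"--vdev {vdev}" for vdev in vdevs)
        self._other_eal_param = other_eal_param

    def __str__(self):
        parts = (self._lcore_list, self._memory_channels, self._prefix,
                 self._no_pci, self._vdevs, self._other_eal_param)
        # Join only the pieces that are set; the real class concatenates all.
        return " ".join(part for part in parts if part)
```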
> > @@ -72,11 +88,21 @@ def __str__(self) -> str:
> >
> >
> > class SutNode(Node):
> > - """
> > - A class for managing connections to the System under Test, providing
> > - methods that retrieve the necessary information about the node (such as
> > - CPU, memory and NIC details) and configuration capabilities.
> > - Another key capability is building DPDK according to given build target.
> > + """The system under test node.
> > +
> > + The SUT node extends :class:`Node` with DPDK specific features:
> > +
> > + * DPDK build,
> > + * Gathering of DPDK build info,
> > + * The running of DPDK apps, interactively or one-time execution,
> > + * DPDK apps cleanup.
> > +
> > + The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
> > + environment variable configure the path to the DPDK tarball
> > + or the git commit ID, tag ID or tree ID to test.
>
> I just want to make sure. We use the --tarball option also to set a git
> commit id instead of a tarball as the source?
>
Yes. The purpose of the option is to specify the DPDK to test and
there are different accepted formats for the option (a tarball or a
git commit id). There are actually three aliases for the option:
--tarball, --snapshot, --git-ref, but none of the aliases capture the
dichotomous nature of the option.
> > +
> > + Attributes:
> > + config: The SUT node configuration.
> > """
> >
> > config: SutNodeConfiguration
> > @@ -94,6 +120,11 @@ class SutNode(Node):
> > _path_to_devbind_script: PurePath | None
> >
> > def __init__(self, node_config: SutNodeConfiguration):
> > + """Extend the constructor with SUT node specifics.
> > +
> > + Args:
> > + node_config: The SUT node's test run configuration.
> > + """
> > super(SutNode, self).__init__(node_config)
> > self._dpdk_prefix_list = []
> > self._build_target_config = None
> > @@ -113,6 +144,12 @@ def __init__(self, node_config: SutNodeConfiguration):
> >
> > @property
> > def _remote_dpdk_dir(self) -> PurePath:
> > + """The remote DPDK dir.
> > +
> > + This internal property should be set after extracting the DPDK tarball. If it's not set,
> > + that implies the DPDK setup step has been skipped, in which case we can guess where
> > + a previous build was located.
> > + """
> > if self.__remote_dpdk_dir is None:
> > self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
> > return self.__remote_dpdk_dir
> > @@ -123,6 +160,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
> >
> > @property
> > def remote_dpdk_build_dir(self) -> PurePath:
> > + """The remote DPDK build directory.
> > +
> > + This is the directory where DPDK was built.
> > + We assume it was built in a subdirectory of the extracted tarball.
> > + """
> > if self._build_target_config:
> > return self.main_session.join_remote_path(
> > self._remote_dpdk_dir, self._build_target_config.name
> > @@ -132,6 +174,7 @@ def remote_dpdk_build_dir(self) -> PurePath:
> >
> > @property
> > def dpdk_version(self) -> str:
> > + """Last built DPDK version."""
> > if self._dpdk_version is None:
> > self._dpdk_version = self.main_session.get_dpdk_version(
> > self._remote_dpdk_dir
> > @@ -140,12 +183,14 @@ def dpdk_version(self) -> str:
> >
> > @property
> > def node_info(self) -> NodeInfo:
> > + """Additional node information."""
> > if self._node_info is None:
> > self._node_info = self.main_session.get_node_info()
> > return self._node_info
> >
> > @property
> > def compiler_version(self) -> str:
> > + """The node's compiler version."""
> > if self._compiler_version is None:
> > if self._build_target_config is not None:
> > self._compiler_version = self.main_session.get_compiler_version(
> > @@ -161,6 +206,7 @@ def compiler_version(self) -> str:
> >
> > @property
> > def path_to_devbind_script(self) -> PurePath:
> > + """The path to the dpdk-devbind.py script on the node."""
> > if self._path_to_devbind_script is None:
> > self._path_to_devbind_script = self.main_session.join_remote_path(
> > self._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
> > @@ -168,6 +214,11 @@ def path_to_devbind_script(self) -> PurePath:
> > return self._path_to_devbind_script
> >
> > def get_build_target_info(self) -> BuildTargetInfo:
> > + """Get additional build target information.
> > +
> > + Returns:
> > + The build target information.
> > + """
> > return BuildTargetInfo(
> > dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
> > )
> > @@ -178,8 +229,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
> > def _set_up_build_target(
> > self, build_target_config: BuildTargetConfiguration
> > ) -> None:
> > - """
> > - Setup DPDK on the SUT node.
> > + """Set up DPDK on the SUT node.
> > +
> > + Additional build target setup steps on top of those in :class:`Node`.
> > """
> > # we want to ensure that dpdk_version and compiler_version is reset for new
> > # build targets
> > @@ -200,9 +252,7 @@ def _tear_down_build_target(self) -> None:
> > def _configure_build_target(
> > self, build_target_config: BuildTargetConfiguration
> > ) -> None:
> > - """
> > - Populate common environment variables and set build target config.
> > - """
> > + """Populate common environment variables and set build target config."""
> > self._env_vars = {}
> > self._build_target_config = build_target_config
> > self._env_vars.update(
> > @@ -217,9 +267,7 @@ def _configure_build_target(
> >
> > @Node.skip_setup
> > def _copy_dpdk_tarball(self) -> None:
> > - """
> > - Copy to and extract DPDK tarball on the SUT node.
> > - """
> > + """Copy to and extract DPDK tarball on the SUT node."""
> > self._logger.info("Copying DPDK tarball to SUT.")
> > self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
> >
> > @@ -250,8 +298,9 @@ def _copy_dpdk_tarball(self) -> None:
> >
> > @Node.skip_setup
> > def _build_dpdk(self) -> None:
> > - """
> > - Build DPDK. Uses the already configured target. Assumes that the tarball has
> > + """Build DPDK.
> > +
> > + Uses the already configured target. Assumes that the tarball has
> > already been copied to and extracted on the SUT node.
> > """
> > self.main_session.build_dpdk(
> > @@ -262,15 +311,19 @@ def _build_dpdk(self) -> None:
> > )
> >
> > def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePath:
> > - """
> > - Build one or all DPDK apps. Requires DPDK to be already built on the SUT node.
> > - When app_name is 'all', build all example apps.
> > - When app_name is any other string, tries to build that example app.
> > - Return the directory path of the built app. If building all apps, return
> > - the path to the examples directory (where all apps reside).
> > - The meson_dpdk_args are keyword arguments
> > - found in meson_option.txt in root DPDK directory. Do not use -D with them,
> > - for example: enable_kmods=True.
> > + """Build one or all DPDK apps.
> > +
> > + Requires DPDK to be already built on the SUT node.
> > +
> > + Args:
> > + app_name: The name of the DPDK app to build.
> > + When `app_name` is ``all``, build all example apps.
> > + meson_dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
> > + Do not use ``-D`` with them.
> > +
> > + Returns:
> > + The directory path of the built app. If building all apps, return
> > + the path to the examples directory (where all apps reside).
> > """
> > self.main_session.build_dpdk(
> > self._env_vars,
> > @@ -291,9 +344,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
> > )
> >
> > def kill_cleanup_dpdk_apps(self) -> None:
> > - """
> > - Kill all dpdk applications on the SUT. Cleanup hugepages.
> > - """
> > + """Kill all dpdk applications on the SUT, then clean up hugepages."""
> > if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
> > # we can use the session if it exists and responds
> > self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
> > @@ -312,33 +363,34 @@ def create_eal_parameters(
> > vdevs: list[VirtualDevice] | None = None,
> > other_eal_param: str = "",
> > ) -> "EalParameters":
> > - """
> > - Generate eal parameters character string;
> > - :param lcore_filter_specifier: a number of lcores/cores/sockets to use
> > - or a list of lcore ids to use.
> > - The default will select one lcore for each of two cores
> > - on one socket, in ascending order of core ids.
> > - :param ascending_cores: True, use cores with the lowest numerical id first
> > - and continue in ascending order. If False, start with the
> > - highest id and continue in descending order. This ordering
> > - affects which sockets to consider first as well.
> > - :param prefix: set file prefix string, eg:
> > - prefix='vf'
> > - :param append_prefix_timestamp: if True, will append a timestamp to
> > - DPDK file prefix.
> > - :param no_pci: switch of disable PCI bus eg:
> > - no_pci=True
> > - :param vdevs: virtual device list, eg:
> > - vdevs=[
> > - VirtualDevice('net_ring0'),
> > - VirtualDevice('net_ring1')
> > - ]
> > - :param other_eal_param: user defined DPDK eal parameters, eg:
> > - other_eal_param='--single-file-segments'
> > - :return: eal param string, eg:
> > - '-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420';
> > - """
> > + """Compose the EAL parameters.
> > +
> > + Process the list of cores and the DPDK prefix and pass that along with
> > + the rest of the arguments.
> >
> > + Args:
> > + lcore_filter_specifier: A number of lcores/cores/sockets to use
> > + or a list of lcore ids to use.
> > + The default will select one lcore for each of two cores
> > + on one socket, in ascending order of core ids.
> > + ascending_cores: Sort cores in ascending order (lowest to highest IDs).
> > + If :data:`False`, sort in descending order.
> > + prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
> > + append_prefix_timestamp: If :data:`True`, will append a timestamp to DPDK file prefix.
> > + no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
> > + vdevs: Virtual devices, e.g.::
> > +
> > + vdevs=[
> > + VirtualDevice('net_ring0'),
> > + VirtualDevice('net_ring1')
> > + ]
> > + other_eal_param: User-defined DPDK EAL parameters, e.g.:
> > + ``other_eal_param='--single-file-segments'``.
> > +
> > + Returns:
> > + An EAL param string, such as
> > + ``-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420``.
> > + """
> > lcore_list = LogicalCoreList(
> > self.filter_lcores(lcore_filter_specifier, ascending_cores)
> > )
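
The documented default ("one lcore for each of two cores on one socket, in ascending order of core ids") can be modeled in isolation. This is a toy illustration of the documented behavior only; the real logic lives in filter_lcores()/LogicalCoreList:

```python
# Toy model of the default lcore selection described in the docstring.
def default_lcore_selection(lcores):
    """lcores: iterable of (lcore_id, core_id, socket_id) tuples."""
    chosen = []
    seen_cores = set()
    for lcore_id, core_id, socket_id in sorted(lcores):
        if socket_id != 0:
            continue  # one socket only
        if core_id in seen_cores:
            continue  # one lcore per core
        seen_cores.add(core_id)
        chosen.append(lcore_id)
        if len(seen_cores) == 2:
            break  # two cores
    return chosen
```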
> > @@ -364,14 +416,29 @@ def create_eal_parameters(
> > def run_dpdk_app(
> > self, app_path: PurePath, eal_args: "EalParameters", timeout: float = 30
> > ) -> CommandResult:
> > - """
> > - Run DPDK application on the remote node.
> > + """Run DPDK application on the remote node.
> > +
> > + The application is not run interactively - the command that starts the application
> > + is executed and then the call waits for it to finish execution.
> > +
> > + Args:
> > + app_path: The remote path to the DPDK application.
> > + eal_args: EAL parameters to run the DPDK application with.
> > + timeout: Wait at most this long in seconds to execute the command.
>
> confusing timeout
>
Ack.
> > +
> > + Returns:
> > + The result of the DPDK app execution.
> > """
> > return self.main_session.send_command(
> > f"{app_path} {eal_args}", timeout, privileged=True, verify=True
> > )
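
To make the non-interactive execution concrete: the method sends a single command line built from the app path and the stringified EAL parameters. Both values below are hypothetical placeholders; in DTS the EAL arguments come from an EalParameters instance:

```python
# Hypothetical inputs illustrating the command run_dpdk_app() sends.
app_path = "/tmp/dpdk/build/examples/dpdk-helloworld"  # hypothetical remote path
eal_args = "-l 0,1 -n 4 --file-prefix=dpdk_test"       # hypothetical EAL string

# This mirrors the f-string passed to send_command() in the patch.
command = f"{app_path} {eal_args}"
```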
> >
> > def configure_ipv4_forwarding(self, enable: bool) -> None:
> > + """Enable/disable IPv4 forwarding on the node.
> > +
> > + Args:
> > + enable: If :data:`True`, enable the forwarding, otherwise disable it.
> > + """
> > self.main_session.configure_ipv4_forwarding(enable)
> >
> > def create_interactive_shell(
> > @@ -381,9 +448,13 @@ def create_interactive_shell(
> > privileged: bool = False,
> > eal_parameters: EalParameters | str | None = None,
> > ) -> InteractiveShellType:
> > - """Factory method for creating a handler for an interactive session.
> > + """Extend the factory for interactive session handlers.
> > +
> > + The extensions are SUT node specific:
> >
> > - Instantiate shell_cls according to the remote OS specifics.
> > + * The default for `eal_parameters`,
> > + * The interactive shell path `shell_cls.path` is prepended with the path to the remote
> > + DPDK build directory for DPDK apps.
> >
> > Args:
> > shell_cls: The class of the shell.
> > @@ -393,9 +464,10 @@ def create_interactive_shell(
> > privileged: Whether to run the shell with administrative privileges.
> > eal_parameters: List of EAL parameters to use to launch the app. If this
> > isn't provided or an empty string is passed, it will default to calling
> > - create_eal_parameters().
> > + :meth:`create_eal_parameters`.
> > +
> > Returns:
> > - Instance of the desired interactive application.
> > + An instance of the desired interactive application shell.
> > """
> > if not eal_parameters:
> > eal_parameters = self.create_eal_parameters()
> > @@ -414,8 +486,8 @@ def bind_ports_to_driver(self, for_dpdk: bool = True) -> None:
> > """Bind all ports on the SUT to a driver.
> >
> > Args:
> > - for_dpdk: Boolean that, when True, binds ports to os_driver_for_dpdk
> > - or, when False, binds to os_driver. Defaults to True.
> > + for_dpdk: If :data:`True`, binds ports to os_driver_for_dpdk.
> > + If :data:`False`, binds to os_driver.
> > """
> > for port in self.ports:
> > driver = port.os_driver_for_dpdk if for_dpdk else port.os_driver
> > diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
> > index 166eb8430e..69eb33ccb1 100644
> > --- a/dts/framework/testbed_model/tg_node.py
> > +++ b/dts/framework/testbed_model/tg_node.py
> > @@ -5,13 +5,8 @@
> >
> > """Traffic generator node.
> >
> > -This is the node where the traffic generator resides.
> > -The distinction between a node and a traffic generator is as follows:
> > -A node is a host that DTS connects to. It could be a baremetal server,
> > -a VM or a container.
> > -A traffic generator is software running on the node.
> > -A traffic generator node is a node running a traffic generator.
> > -A node can be a traffic generator node as well as system under test node.
> > +A traffic generator (TG) generates traffic that's sent towards the SUT node.
> > +A TG node is where the TG runs.
> > """
> >
> > from scapy.packet import Packet # type: ignore[import]
> > @@ -24,13 +19,16 @@
> >
> >
> > class TGNode(Node):
> > - """Manage connections to a node with a traffic generator.
> > + """The traffic generator node.
> >
> > - Apart from basic node management capabilities, the Traffic Generator node has
> > - specialized methods for handling the traffic generator running on it.
> > + The TG node extends :class:`Node` with TG specific features:
> >
> > - Arguments:
> > - node_config: The user configuration of the traffic generator node.
> > + * Traffic generator initialization,
> > + * The sending of traffic and receiving packets,
> > + * The sending of traffic without receiving packets.
> > +
> > + Not all traffic generators are capable of capturing traffic, which is why there
> > + must be a way to send traffic without that.
> >
> > Attributes:
> > traffic_generator: The traffic generator running on the node.
> > @@ -39,6 +37,13 @@ class TGNode(Node):
> > traffic_generator: CapturingTrafficGenerator
> >
> > def __init__(self, node_config: TGNodeConfiguration):
> > + """Extend the constructor with TG node specifics.
> > +
> > + Initialize the traffic generator on the TG node.
> > +
> > + Args:
> > + node_config: The TG node's test run configuration.
> > + """
> > super(TGNode, self).__init__(node_config)
> > self.traffic_generator = create_traffic_generator(
> > self, node_config.traffic_generator
> > @@ -52,17 +57,17 @@ def send_packet_and_capture(
> > receive_port: Port,
> > duration: float = 1,
> > ) -> list[Packet]:
> > - """Send a packet, return received traffic.
> > + """Send `packet`, return received traffic.
> >
> > - Send a packet on the send_port and then return all traffic captured
> > - on the receive_port for the given duration. Also record the captured traffic
> > + Send `packet` on `send_port` and then return all traffic captured
> > + on `receive_port` for the given duration. Also record the captured traffic
> > in a pcap file.
> >
> > Args:
> > packet: The packet to send.
> > send_port: The egress port on the TG node.
> > receive_port: The ingress port in the TG node.
> > - duration: Capture traffic for this amount of time after sending the packet.
> > + duration: Capture traffic for this amount of time after sending `packet`.
> >
> > Returns:
> > A list of received packets. May be empty if no packets are captured.
> > @@ -72,6 +77,9 @@ def send_packet_and_capture(
> > )
> >
> > def close(self) -> None:
> > - """Free all resources used by the node"""
> > + """Free all resources used by the node.
> > +
> > + This extends the superclass method with TG cleanup.
> > + """
> > self.traffic_generator.close()
> > super(TGNode, self).close()
>
* [PATCH v7 19/21] dts: base traffic generators docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (17 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 18/21] dts: sut and tg nodes " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-21 16:20 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 20/21] dts: scapy tg " Juraj Linkeš
` (2 subsequent siblings)
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../traffic_generator/__init__.py | 22 ++++++++-
.../capturing_traffic_generator.py | 46 +++++++++++--------
.../traffic_generator/traffic_generator.py | 33 +++++++------
3 files changed, 68 insertions(+), 33 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index 11bfa1ee0f..51cca77da4 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -1,6 +1,19 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""DTS traffic generators.
+
+A traffic generator is capable of generating traffic and then monitoring returning traffic.
+A traffic generator may just count the number of received packets
+and it may additionally capture individual packets.
+
+A traffic generator may be software running on generic hardware, or it may be specialized hardware.
+
+The traffic generators that only count the number of received packets are suitable only for
+performance testing. In functional testing, we need to be able to dissect each arrived packet
+and a capturing traffic generator is required.
+"""
+
from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
from framework.exception import ConfigurationError
from framework.testbed_model.node import Node
@@ -12,8 +25,15 @@
def create_traffic_generator(
tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
+ """The factory function for creating traffic generator objects from the test run configuration.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ traffic_generator_config: The traffic generator config.
+ Returns:
+ A traffic generator capable of capturing received packets.
+ """
match traffic_generator_config.traffic_generator_type:
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGenerator(tg_node, traffic_generator_config)
diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index e521211ef0..b0a43ad003 100644
--- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -23,19 +23,22 @@
def _get_default_capture_name() -> str:
- """
- This is the function used for the default implementation of capture names.
- """
return str(uuid.uuid4())
class CapturingTrafficGenerator(TrafficGenerator):
"""Capture packets after sending traffic.
- A mixin interface which enables a packet generator to declare that it can capture
+ The intermediary interface which enables a packet generator to declare that it can capture
packets and return them to the user.
+ Similarly to
+ :class:`~framework.testbed_model.traffic_generator.traffic_generator.TrafficGenerator`,
+ this class exposes the public methods specific to capturing traffic generators and defines
+ a private method that must implement the traffic generation and capturing logic in subclasses.
+
The methods of capturing traffic generators obey the following workflow:
+
1. send packets
2. capture packets
3. write the capture to a .pcap file
@@ -44,6 +47,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
@property
def is_capturing(self) -> bool:
+ """This traffic generator can capture traffic."""
return True
def send_packet_and_capture(
@@ -54,11 +58,12 @@ def send_packet_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet` and capture received traffic.
+
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ The captured traffic is recorded in the `capture_name`.pcap file.
Args:
packet: The packet to send.
@@ -68,7 +73,7 @@ def send_packet_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
return self.send_packets_and_capture(
[packet], send_port, receive_port, duration, capture_name
@@ -82,11 +87,14 @@ def send_packets_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send packets, return received traffic.
+ """Send `packets` and capture received traffic.
- Send packets on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ Send `packets` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
+
+ The captured traffic is recorded in the `capture_name`.pcap file. The target directory
+ can be configured with the :option:`--output-dir` command line argument or
+ the :envvar:`DTS_OUTPUT_DIR` environment variable.
Args:
packets: The packets to send.
@@ -96,7 +104,7 @@ def send_packets_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
self._logger.debug(get_packet_summaries(packets))
self._logger.debug(
@@ -124,10 +132,12 @@ def _send_packets_and_capture(
receive_port: Port,
duration: float,
) -> list[Packet]:
- """
- The extended classes must implement this method which
- sends packets on send_port and receives packets on the receive_port
- for the specified duration. It must be able to handle no received packets.
+ """The implementation of :method:`send_packets_and_capture`.
+
+ The subclasses must implement this method which sends `packets` on `send_port`
+ and receives packets on `receive_port` for the specified `duration`.
+
+ It must be able to handle no received packets.
"""
def _write_capture_from_packets(
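One Python subtlety in the signatures above is worth noting: `capture_name: str = _get_default_capture_name()` is evaluated once, when the function is defined, so every call that omits `capture_name` reuses the same UUID-based name (and thus the same .pcap file). A minimal sketch of that behavior (toy function names, not the framework API):

```python
import uuid


def _get_default_capture_name() -> str:
    return str(uuid.uuid4())


def send_and_capture(capture_name: str = _get_default_capture_name()) -> str:
    # Returns the capture name that would be used for the .pcap file.
    return capture_name


# The default is computed at definition time, not per call:
assert send_and_capture() == send_and_capture()
# An explicit name is used as-is:
assert send_and_capture("my-capture") == "my-capture"
```

If a fresh name per call were intended, the usual idiom is `capture_name: str | None = None` with `capture_name = capture_name or _get_default_capture_name()` in the body.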
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index ea7c3963da..ed396c6a2f 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -22,7 +22,8 @@
class TrafficGenerator(ABC):
"""The base traffic generator.
- Defines the few basic methods that each traffic generator must implement.
+ Exposes the common public methods of all traffic generators and defines private methods
+ that must implement the traffic generation logic in subclasses.
"""
_config: TrafficGeneratorConfig
@@ -30,6 +31,12 @@ class TrafficGenerator(ABC):
_logger: DTSLOG
def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ """Initialize the traffic generator.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ config: The traffic generator's test run configuration.
+ """
self._config = config
self._tg_node = tg_node
self._logger = getLogger(
@@ -37,9 +44,9 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
)
def send_packet(self, packet: Packet, port: Port) -> None:
- """Send a packet and block until it is fully sent.
+ """Send `packet` and block until it is fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packet` on `port`, then wait until `packet` is fully sent.
Args:
packet: The packet to send.
@@ -48,9 +55,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
self.send_packets([packet], port)
def send_packets(self, packets: list[Packet], port: Port) -> None:
- """Send packets and block until they are fully sent.
+ """Send `packets` and block until they are fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packets` on `port`, then wait until `packets` are fully sent.
Args:
packets: The packets to send.
@@ -62,19 +69,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
@abstractmethod
def _send_packets(self, packets: list[Packet], port: Port) -> None:
- """
- The extended classes must implement this method which
- sends packets on send_port. The method should block until all packets
- are fully sent.
+ """The implementation of :method:`send_packets`.
+
+ The subclasses must implement this method which sends `packets` on `port`.
+ The method should block until all `packets` are fully sent.
+
+ What fully sent means is defined by the traffic generator.
"""
@property
def is_capturing(self) -> bool:
- """Whether this traffic generator can capture traffic.
-
- Returns:
- True if the traffic generator can capture traffic, False otherwise.
- """
+ """This traffic generator can't capture traffic."""
return False
@abstractmethod
--
2.34.1
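The base-class design the patch describes — a public `send_packets` that does the shared work and delegates to an abstract private `_send_packets`, plus an `is_capturing` property that capturing subclasses override — can be sketched as follows (a simplified illustration, not the real framework code):

```python
from abc import ABC, abstractmethod


class TrafficGenerator(ABC):
    def send_packets(self, packets: list, port: int) -> None:
        # Public method: shared bookkeeping/logging would go here,
        # then delegate to the subclass implementation.
        self._send_packets(packets, port)

    @abstractmethod
    def _send_packets(self, packets: list, port: int) -> None:
        """Subclasses implement the actual traffic generation."""

    @property
    def is_capturing(self) -> bool:
        # Base traffic generators only count packets; they can't capture them.
        return False


class LoopbackTrafficGenerator(TrafficGenerator):
    """Toy subclass: records what it was asked to send."""

    def __init__(self) -> None:
        self.sent: list = []

    def _send_packets(self, packets: list, port: int) -> None:
        self.sent.extend(packets)


tg = LoopbackTrafficGenerator()
tg.send_packets(["p1", "p2"], port=0)
assert tg.sent == ["p1", "p2"]
assert not tg.is_capturing
```

A capturing subclass would additionally override `is_capturing` to return `True`, mirroring the `CapturingTrafficGenerator` property in the patch.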
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 19/21] dts: base traffic generators docstring update
2023-11-15 13:09 ` [PATCH v7 19/21] dts: base traffic generators " Juraj Linkeš
@ 2023-11-21 16:20 ` Yoan Picchi
2023-11-22 11:38 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-21 16:20 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> .../traffic_generator/__init__.py | 22 ++++++++-
> .../capturing_traffic_generator.py | 46 +++++++++++--------
> .../traffic_generator/traffic_generator.py | 33 +++++++------
> 3 files changed, 68 insertions(+), 33 deletions(-)
>
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> index 11bfa1ee0f..51cca77da4 100644
> --- a/dts/framework/testbed_model/traffic_generator/__init__.py
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -1,6 +1,19 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> +"""DTS traffic generators.
> +
> +A traffic generator is capable of generating traffic and then monitor returning traffic.
> +A traffic generator may just count the number of received packets
> +and it may additionally capture individual packets.
The sentence feels odd. Isn't it supposed to be "or" here? And there's no need
for that early a line break.
> +
> +A traffic generator may be software running on generic hardware or it could be specialized hardware.
> +
> +The traffic generators that only count the number of received packets are suitable only for
> +performance testing. In functional testing, we need to be able to dissect each arrived packet
> +and a capturing traffic generator is required.
> +"""
> +
> from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
> from framework.exception import ConfigurationError
> from framework.testbed_model.node import Node
> @@ -12,8 +25,15 @@
> def create_traffic_generator(
> tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
> ) -> CapturingTrafficGenerator:
> - """A factory function for creating traffic generator object from user config."""
> + """The factory function for creating traffic generator objects from the test run configuration.
> +
> + Args:
> + tg_node: The traffic generator node where the created traffic generator will be running.
> + traffic_generator_config: The traffic generator config.
>
> + Returns:
> + A traffic generator capable of capturing received packets.
> + """
> match traffic_generator_config.traffic_generator_type:
> case TrafficGeneratorType.SCAPY:
> return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> index e521211ef0..b0a43ad003 100644
> --- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> @@ -23,19 +23,22 @@
>
>
> def _get_default_capture_name() -> str:
> - """
> - This is the function used for the default implementation of capture names.
> - """
> return str(uuid.uuid4())
>
>
> class CapturingTrafficGenerator(TrafficGenerator):
> """Capture packets after sending traffic.
>
> - A mixin interface which enables a packet generator to declare that it can capture
> + The intermediary interface which enables a packet generator to declare that it can capture
> packets and return them to the user.
>
> + Similarly to
> + :class:`~framework.testbed_model.traffic_generator.traffic_generator.TrafficGenerator`,
> + this class exposes the public methods specific to capturing traffic generators and defines
> + a private method that must implement the traffic generation and capturing logic in subclasses.
> +
> The methods of capturing traffic generators obey the following workflow:
> +
> 1. send packets
> 2. capture packets
> 3. write the capture to a .pcap file
> @@ -44,6 +47,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
>
> @property
> def is_capturing(self) -> bool:
> + """This traffic generator can capture traffic."""
> return True
>
> def send_packet_and_capture(
> @@ -54,11 +58,12 @@ def send_packet_and_capture(
> duration: float,
> capture_name: str = _get_default_capture_name(),
> ) -> list[Packet]:
> - """Send a packet, return received traffic.
> + """Send `packet` and capture received traffic.
> +
> + Send `packet` on `send_port` and then return all traffic captured
> + on `receive_port` for the given `duration`.
>
> - Send a packet on the send_port and then return all traffic captured
> - on the receive_port for the given duration. Also record the captured traffic
> - in a pcap file.
> + The captured traffic is recorded in the `capture_name`.pcap file.
>
> Args:
> packet: The packet to send.
> @@ -68,7 +73,7 @@ def send_packet_and_capture(
> capture_name: The name of the .pcap file where to store the capture.
>
> Returns:
> - A list of received packets. May be empty if no packets are captured.
> + The received packets. May be empty if no packets are captured.
> """
> return self.send_packets_and_capture(
> [packet], send_port, receive_port, duration, capture_name
> @@ -82,11 +87,14 @@ def send_packets_and_capture(
> duration: float,
> capture_name: str = _get_default_capture_name(),
> ) -> list[Packet]:
> - """Send packets, return received traffic.
> + """Send `packets` and capture received traffic.
>
> - Send packets on the send_port and then return all traffic captured
> - on the receive_port for the given duration. Also record the captured traffic
> - in a pcap file.
> + Send `packets` on `send_port` and then return all traffic captured
> + on `receive_port` for the given `duration`.
> +
> + The captured traffic is recorded in the `capture_name`.pcap file. The target directory
> + can be configured with the :option:`--output-dir` command line argument or
> + the :envvar:`DTS_OUTPUT_DIR` environment variable.
>
> Args:
> packets: The packets to send.
> @@ -96,7 +104,7 @@ def send_packets_and_capture(
> capture_name: The name of the .pcap file where to store the capture.
>
> Returns:
> - A list of received packets. May be empty if no packets are captured.
> + The received packets. May be empty if no packets are captured.
> """
> self._logger.debug(get_packet_summaries(packets))
> self._logger.debug(
> @@ -124,10 +132,12 @@ def _send_packets_and_capture(
> receive_port: Port,
> duration: float,
> ) -> list[Packet]:
> - """
> - The extended classes must implement this method which
> - sends packets on send_port and receives packets on the receive_port
> - for the specified duration. It must be able to handle no received packets.
> + """The implementation of :method:`send_packets_and_capture`.
> +
> + The subclasses must implement this method which sends `packets` on `send_port`
> + and receives packets on `receive_port` for the specified `duration`.
> +
> + It must be able to handle no received packets.
This sentence feels odd too. Maybe "It must be able to handle receiving
no packets."
> """
>
> def _write_capture_from_packets(
> diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index ea7c3963da..ed396c6a2f 100644
> --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -22,7 +22,8 @@
> class TrafficGenerator(ABC):
> """The base traffic generator.
>
> - Defines the few basic methods that each traffic generator must implement.
> + Exposes the common public methods of all traffic generators and defines private methods
> + that must implement the traffic generation logic in subclasses.
> """
>
> _config: TrafficGeneratorConfig
> @@ -30,6 +31,12 @@ class TrafficGenerator(ABC):
> _logger: DTSLOG
>
> def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> + """Initialize the traffic generator.
> +
> + Args:
> + tg_node: The traffic generator node where the created traffic generator will be running.
> + config: The traffic generator's test run configuration.
> + """
> self._config = config
> self._tg_node = tg_node
> self._logger = getLogger(
> @@ -37,9 +44,9 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> )
>
> def send_packet(self, packet: Packet, port: Port) -> None:
> - """Send a packet and block until it is fully sent.
> + """Send `packet` and block until it is fully sent.
>
> - What fully sent means is defined by the traffic generator.
> + Send `packet` on `port`, then wait until `packet` is fully sent.
>
> Args:
> packet: The packet to send.
> @@ -48,9 +55,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
> self.send_packets([packet], port)
>
> def send_packets(self, packets: list[Packet], port: Port) -> None:
> - """Send packets and block until they are fully sent.
> + """Send `packets` and block until they are fully sent.
>
> - What fully sent means is defined by the traffic generator.
> + Send `packets` on `port`, then wait until `packets` are fully sent.
>
> Args:
> packets: The packets to send.
> @@ -62,19 +69,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
>
> @abstractmethod
> def _send_packets(self, packets: list[Packet], port: Port) -> None:
> - """
> - The extended classes must implement this method which
> - sends packets on send_port. The method should block until all packets
> - are fully sent.
> + """The implementation of :method:`send_packets`.
> +
> + The subclasses must implement this method which sends `packets` on `port`.
> + The method should block until all `packets` are fully sent.
> +
> + What full sent means is defined by the traffic generator.
full -> fully
> """
>
> @property
> def is_capturing(self) -> bool:
> - """Whether this traffic generator can capture traffic.
> -
> - Returns:
> - True if the traffic generator can capture traffic, False otherwise.
> - """
> + """This traffic generator can't capture traffic."""
> return False
>
> @abstractmethod
* Re: [PATCH v7 19/21] dts: base traffic generators docstring update
2023-11-21 16:20 ` Yoan Picchi
@ 2023-11-22 11:38 ` Juraj Linkeš
2023-11-22 11:56 ` Yoan Picchi
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 11:38 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Tue, Nov 21, 2023 at 5:20 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > .../traffic_generator/__init__.py | 22 ++++++++-
> > .../capturing_traffic_generator.py | 46 +++++++++++--------
> > .../traffic_generator/traffic_generator.py | 33 +++++++------
> > 3 files changed, 68 insertions(+), 33 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> > index 11bfa1ee0f..51cca77da4 100644
> > --- a/dts/framework/testbed_model/traffic_generator/__init__.py
> > +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> > @@ -1,6 +1,19 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >
> > +"""DTS traffic generators.
> > +
> > +A traffic generator is capable of generating traffic and then monitor returning traffic.
> > +A traffic generator may just count the number of received packets
> > +and it may additionally capture individual packets.
>
> The sentence feels odd. Isn't it supposed to be "or" here? and no need
> for that early of a line break
>
There are two mays, so there probably should be an or. But I'd like to
reword it to this:
All traffic generators count the number of received packets, and they
may additionally capture individual packets.
What do you think?
> > +
> > +A traffic generator may be software running on generic hardware or it could be specialized hardware.
> > +
> > +The traffic generators that only count the number of received packets are suitable only for
> > +performance testing. In functional testing, we need to be able to dissect each arrived packet
> > +and a capturing traffic generator is required.
> > +"""
> > +
> > from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
> > from framework.exception import ConfigurationError
> > from framework.testbed_model.node import Node
> > @@ -12,8 +25,15 @@
> > def create_traffic_generator(
> > tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
> > ) -> CapturingTrafficGenerator:
> > - """A factory function for creating traffic generator object from user config."""
> > + """The factory function for creating traffic generator objects from the test run configuration.
> > +
> > + Args:
> > + tg_node: The traffic generator node where the created traffic generator will be running.
> > + traffic_generator_config: The traffic generator config.
> >
> > + Returns:
> > + A traffic generator capable of capturing received packets.
> > + """
> > match traffic_generator_config.traffic_generator_type:
> > case TrafficGeneratorType.SCAPY:
> > return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> > diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> > index e521211ef0..b0a43ad003 100644
> > --- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> > +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> > @@ -23,19 +23,22 @@
> >
> >
> > def _get_default_capture_name() -> str:
> > - """
> > - This is the function used for the default implementation of capture names.
> > - """
> > return str(uuid.uuid4())
> >
> >
> > class CapturingTrafficGenerator(TrafficGenerator):
> > """Capture packets after sending traffic.
> >
> > - A mixin interface which enables a packet generator to declare that it can capture
> > + The intermediary interface which enables a packet generator to declare that it can capture
> > packets and return them to the user.
> >
> > + Similarly to
> > + :class:`~framework.testbed_model.traffic_generator.traffic_generator.TrafficGenerator`,
> > + this class exposes the public methods specific to capturing traffic generators and defines
> > + a private method that must implement the traffic generation and capturing logic in subclasses.
> > +
> > The methods of capturing traffic generators obey the following workflow:
> > +
> > 1. send packets
> > 2. capture packets
> > 3. write the capture to a .pcap file
> > @@ -44,6 +47,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
> >
> > @property
> > def is_capturing(self) -> bool:
> > + """This traffic generator can capture traffic."""
> > return True
> >
> > def send_packet_and_capture(
> > @@ -54,11 +58,12 @@ def send_packet_and_capture(
> > duration: float,
> > capture_name: str = _get_default_capture_name(),
> > ) -> list[Packet]:
> > - """Send a packet, return received traffic.
> > + """Send `packet` and capture received traffic.
> > +
> > + Send `packet` on `send_port` and then return all traffic captured
> > + on `receive_port` for the given `duration`.
> >
> > - Send a packet on the send_port and then return all traffic captured
> > - on the receive_port for the given duration. Also record the captured traffic
> > - in a pcap file.
> > + The captured traffic is recorded in the `capture_name`.pcap file.
> >
> > Args:
> > packet: The packet to send.
> > @@ -68,7 +73,7 @@ def send_packet_and_capture(
> > capture_name: The name of the .pcap file where to store the capture.
> >
> > Returns:
> > - A list of received packets. May be empty if no packets are captured.
> > + The received packets. May be empty if no packets are captured.
> > """
> > return self.send_packets_and_capture(
> > [packet], send_port, receive_port, duration, capture_name
> > @@ -82,11 +87,14 @@ def send_packets_and_capture(
> > duration: float,
> > capture_name: str = _get_default_capture_name(),
> > ) -> list[Packet]:
> > - """Send packets, return received traffic.
> > + """Send `packets` and capture received traffic.
> >
> > - Send packets on the send_port and then return all traffic captured
> > - on the receive_port for the given duration. Also record the captured traffic
> > - in a pcap file.
> > + Send `packets` on `send_port` and then return all traffic captured
> > + on `receive_port` for the given `duration`.
> > +
> > + The captured traffic is recorded in the `capture_name`.pcap file. The target directory
> > + can be configured with the :option:`--output-dir` command line argument or
> > + the :envvar:`DTS_OUTPUT_DIR` environment variable.
> >
> > Args:
> > packets: The packets to send.
> > @@ -96,7 +104,7 @@ def send_packets_and_capture(
> > capture_name: The name of the .pcap file where to store the capture.
> >
> > Returns:
> > - A list of received packets. May be empty if no packets are captured.
> > + The received packets. May be empty if no packets are captured.
> > """
> > self._logger.debug(get_packet_summaries(packets))
> > self._logger.debug(
> > @@ -124,10 +132,12 @@ def _send_packets_and_capture(
> > receive_port: Port,
> > duration: float,
> > ) -> list[Packet]:
> > - """
> > - The extended classes must implement this method which
> > - sends packets on send_port and receives packets on the receive_port
> > - for the specified duration. It must be able to handle no received packets.
> > + """The implementation of :method:`send_packets_and_capture`.
> > +
> > + The subclasses must implement this method which sends `packets` on `send_port`
> > + and receives packets on `receive_port` for the specified `duration`.
> > +
> > + It must be able to handle no received packets.
>
> This sentence feels odd too. Maybe "It must be able to handle receiving
> no packets."
>
Right, your suggestion is better.
> > """
> >
> > def _write_capture_from_packets(
> > diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> > index ea7c3963da..ed396c6a2f 100644
> > --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> > +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> > @@ -22,7 +22,8 @@
> > class TrafficGenerator(ABC):
> > """The base traffic generator.
> >
> > - Defines the few basic methods that each traffic generator must implement.
> > + Exposes the common public methods of all traffic generators and defines private methods
> > + that must implement the traffic generation logic in subclasses.
> > """
> >
> > _config: TrafficGeneratorConfig
> > @@ -30,6 +31,12 @@ class TrafficGenerator(ABC):
> > _logger: DTSLOG
> >
> > def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> > + """Initialize the traffic generator.
> > +
> > + Args:
> > + tg_node: The traffic generator node where the created traffic generator will be running.
> > + config: The traffic generator's test run configuration.
> > + """
> > self._config = config
> > self._tg_node = tg_node
> > self._logger = getLogger(
> > @@ -37,9 +44,9 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> > )
> >
> > def send_packet(self, packet: Packet, port: Port) -> None:
> > - """Send a packet and block until it is fully sent.
> > + """Send `packet` and block until it is fully sent.
> >
> > - What fully sent means is defined by the traffic generator.
> > + Send `packet` on `port`, then wait until `packet` is fully sent.
> >
> > Args:
> > packet: The packet to send.
> > @@ -48,9 +55,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
> > self.send_packets([packet], port)
> >
> > def send_packets(self, packets: list[Packet], port: Port) -> None:
> > - """Send packets and block until they are fully sent.
> > + """Send `packets` and block until they are fully sent.
> >
> > - What fully sent means is defined by the traffic generator.
> > + Send `packets` on `port`, then wait until `packets` are fully sent.
> >
> > Args:
> > packets: The packets to send.
> > @@ -62,19 +69,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
> >
> > @abstractmethod
> > def _send_packets(self, packets: list[Packet], port: Port) -> None:
> > - """
> > - The extended classes must implement this method which
> > - sends packets on send_port. The method should block until all packets
> > - are fully sent.
> > + """The implementation of :method:`send_packets`.
> > +
> > + The subclasses must implement this method which sends `packets` on `port`.
> > + The method should block until all `packets` are fully sent.
> > +
> > + What full sent means is defined by the traffic generator.
>
> full -> fully
>
> > """
> >
> > @property
> > def is_capturing(self) -> bool:
> > - """Whether this traffic generator can capture traffic.
> > -
> > - Returns:
> > - True if the traffic generator can capture traffic, False otherwise.
> > - """
> > + """This traffic generator can't capture traffic."""
> > return False
> >
> > @abstractmethod
>
* Re: [PATCH v7 19/21] dts: base traffic generators docstring update
2023-11-22 11:38 ` Juraj Linkeš
@ 2023-11-22 11:56 ` Yoan Picchi
2023-11-22 13:11 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-22 11:56 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On 11/22/23 11:38, Juraj Linkeš wrote:
> On Tue, Nov 21, 2023 at 5:20 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>>
>> On 11/15/23 13:09, Juraj Linkeš wrote:
>>> Format according to the Google format and PEP257, with slight
>>> deviations.
>>>
>>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>>> ---
>>> .../traffic_generator/__init__.py | 22 ++++++++-
>>> .../capturing_traffic_generator.py | 46 +++++++++++--------
>>> .../traffic_generator/traffic_generator.py | 33 +++++++------
>>> 3 files changed, 68 insertions(+), 33 deletions(-)
>>>
>>> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
>>> index 11bfa1ee0f..51cca77da4 100644
>>> --- a/dts/framework/testbed_model/traffic_generator/__init__.py
>>> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
>>> @@ -1,6 +1,19 @@
>>> # SPDX-License-Identifier: BSD-3-Clause
>>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>>>
>>> +"""DTS traffic generators.
>>> +
>>> +A traffic generator is capable of generating traffic and then monitor returning traffic.
>>> +A traffic generator may just count the number of received packets
>>> +and it may additionally capture individual packets.
>>
>> The sentence feels odd. Isn't it supposed to be "or" here? and no need
>> for that early of a line break
>>
>
> There are two mays, so there probably should be an or. But I'd like to
> reword it to this:
>
> All traffic generators count the number of received packets, and they
> may additionally
> capture individual packets.
>
> What do you think?
I think it's better with the new sentence, but it'd be even better to
split it into two sentences to highlight the must/may:
All traffic generators must count the number of received packets. Some
may additionally capture individual packets.
>
>>> +
>>> +A traffic generator may be software running on generic hardware or it could be specialized hardware.
>>> +
>>> +The traffic generators that only count the number of received packets are suitable only for
>>> +performance testing. In functional testing, we need to be able to dissect each arrived packet
>>> +and a capturing traffic generator is required.
>>> +"""
>>> +
>>> from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
>>> from framework.exception import ConfigurationError
>>> from framework.testbed_model.node import Node
>>> @@ -12,8 +25,15 @@
>>> def create_traffic_generator(
>>> tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
>>> ) -> CapturingTrafficGenerator:
>>> - """A factory function for creating traffic generator object from user config."""
>>> + """The factory function for creating traffic generator objects from the test run configuration.
>>> +
>>> + Args:
>>> + tg_node: The traffic generator node where the created traffic generator will be running.
>>> + traffic_generator_config: The traffic generator config.
>>>
>>> + Returns:
>>> + A traffic generator capable of capturing received packets.
>>> + """
>>> match traffic_generator_config.traffic_generator_type:
>>> case TrafficGeneratorType.SCAPY:
>>> return ScapyTrafficGenerator(tg_node, traffic_generator_config)
>>> diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
>>> index e521211ef0..b0a43ad003 100644
>>> --- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
>>> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
>>> @@ -23,19 +23,22 @@
>>>
>>>
>>> def _get_default_capture_name() -> str:
>>> - """
>>> - This is the function used for the default implementation of capture names.
>>> - """
>>> return str(uuid.uuid4())
>>>
>>>
>>> class CapturingTrafficGenerator(TrafficGenerator):
>>> """Capture packets after sending traffic.
>>>
>>> - A mixin interface which enables a packet generator to declare that it can capture
>>> + The intermediary interface which enables a packet generator to declare that it can capture
>>> packets and return them to the user.
>>>
>>> + Similarly to
>>> + :class:`~framework.testbed_model.traffic_generator.traffic_generator.TrafficGenerator`,
>>> + this class exposes the public methods specific to capturing traffic generators and defines
>>> + a private method that must implement the traffic generation and capturing logic in subclasses.
>>> +
>>> The methods of capturing traffic generators obey the following workflow:
>>> +
>>> 1. send packets
>>> 2. capture packets
>>> 3. write the capture to a .pcap file
>>> @@ -44,6 +47,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
>>>
>>> @property
>>> def is_capturing(self) -> bool:
>>> + """This traffic generator can capture traffic."""
>>> return True
>>>
>>> def send_packet_and_capture(
>>> @@ -54,11 +58,12 @@ def send_packet_and_capture(
>>> duration: float,
>>> capture_name: str = _get_default_capture_name(),
>>> ) -> list[Packet]:
>>> - """Send a packet, return received traffic.
>>> + """Send `packet` and capture received traffic.
>>> +
>>> + Send `packet` on `send_port` and then return all traffic captured
>>> + on `receive_port` for the given `duration`.
>>>
>>> - Send a packet on the send_port and then return all traffic captured
>>> - on the receive_port for the given duration. Also record the captured traffic
>>> - in a pcap file.
>>> + The captured traffic is recorded in the `capture_name`.pcap file.
>>>
>>> Args:
>>> packet: The packet to send.
>>> @@ -68,7 +73,7 @@ def send_packet_and_capture(
>>> capture_name: The name of the .pcap file where to store the capture.
>>>
>>> Returns:
>>> - A list of received packets. May be empty if no packets are captured.
>>> + The received packets. May be empty if no packets are captured.
>>> """
>>> return self.send_packets_and_capture(
>>> [packet], send_port, receive_port, duration, capture_name
>>> @@ -82,11 +87,14 @@ def send_packets_and_capture(
>>> duration: float,
>>> capture_name: str = _get_default_capture_name(),
>>> ) -> list[Packet]:
>>> - """Send packets, return received traffic.
>>> + """Send `packets` and capture received traffic.
>>>
>>> - Send packets on the send_port and then return all traffic captured
>>> - on the receive_port for the given duration. Also record the captured traffic
>>> - in a pcap file.
>>> + Send `packets` on `send_port` and then return all traffic captured
>>> + on `receive_port` for the given `duration`.
>>> +
>>> + The captured traffic is recorded in the `capture_name`.pcap file. The target directory
>>> + can be configured with the :option:`--output-dir` command line argument or
>>> + the :envvar:`DTS_OUTPUT_DIR` environment variable.
>>>
>>> Args:
>>> packets: The packets to send.
>>> @@ -96,7 +104,7 @@ def send_packets_and_capture(
>>> capture_name: The name of the .pcap file where to store the capture.
>>>
>>> Returns:
>>> - A list of received packets. May be empty if no packets are captured.
>>> + The received packets. May be empty if no packets are captured.
>>> """
>>> self._logger.debug(get_packet_summaries(packets))
>>> self._logger.debug(
>>> @@ -124,10 +132,12 @@ def _send_packets_and_capture(
>>> receive_port: Port,
>>> duration: float,
>>> ) -> list[Packet]:
>>> - """
>>> - The extended classes must implement this method which
>>> - sends packets on send_port and receives packets on the receive_port
>>> - for the specified duration. It must be able to handle no received packets.
>>> + """The implementation of :method:`send_packets_and_capture`.
>>> +
>>> + The subclasses must implement this method which sends `packets` on `send_port`
>>> + and receives packets on `receive_port` for the specified `duration`.
>>> +
>>> + It must be able to handle no received packets.
>>
>> This sentence feels odd too. Maybe "It must be able to handle receiving
>> no packets."
>>
>
> Right, your suggestion is better.
>
>>> """
>>>
>>> def _write_capture_from_packets(
>>> diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
>>> index ea7c3963da..ed396c6a2f 100644
>>> --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
>>> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
>>> @@ -22,7 +22,8 @@
>>> class TrafficGenerator(ABC):
>>> """The base traffic generator.
>>>
>>> - Defines the few basic methods that each traffic generator must implement.
>>> + Exposes the common public methods of all traffic generators and defines private methods
>>> + that must implement the traffic generation logic in subclasses.
>>> """
>>>
>>> _config: TrafficGeneratorConfig
>>> @@ -30,6 +31,12 @@ class TrafficGenerator(ABC):
>>> _logger: DTSLOG
>>>
>>> def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
>>> + """Initialize the traffic generator.
>>> +
>>> + Args:
>>> + tg_node: The traffic generator node where the created traffic generator will be running.
>>> + config: The traffic generator's test run configuration.
>>> + """
>>> self._config = config
>>> self._tg_node = tg_node
>>> self._logger = getLogger(
>>> @@ -37,9 +44,9 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
>>> )
>>>
>>> def send_packet(self, packet: Packet, port: Port) -> None:
>>> - """Send a packet and block until it is fully sent.
>>> + """Send `packet` and block until it is fully sent.
>>>
>>> - What fully sent means is defined by the traffic generator.
>>> + Send `packet` on `port`, then wait until `packet` is fully sent.
>>>
>>> Args:
>>> packet: The packet to send.
>>> @@ -48,9 +55,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
>>> self.send_packets([packet], port)
>>>
>>> def send_packets(self, packets: list[Packet], port: Port) -> None:
>>> - """Send packets and block until they are fully sent.
>>> + """Send `packets` and block until they are fully sent.
>>>
>>> - What fully sent means is defined by the traffic generator.
>>> + Send `packets` on `port`, then wait until `packets` are fully sent.
>>>
>>> Args:
>>> packets: The packets to send.
>>> @@ -62,19 +69,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
>>>
>>> @abstractmethod
>>> def _send_packets(self, packets: list[Packet], port: Port) -> None:
>>> - """
>>> - The extended classes must implement this method which
>>> - sends packets on send_port. The method should block until all packets
>>> - are fully sent.
>>> + """The implementation of :method:`send_packets`.
>>> +
>>> + The subclasses must implement this method which sends `packets` on `port`.
>>> + The method should block until all `packets` are fully sent.
>>> +
>>> + What full sent means is defined by the traffic generator.
>>
>> full -> fully
>>
>>> """
>>>
>>> @property
>>> def is_capturing(self) -> bool:
>>> - """Whether this traffic generator can capture traffic.
>>> -
>>> - Returns:
>>> - True if the traffic generator can capture traffic, False otherwise.
>>> - """
>>> + """This traffic generator can't capture traffic."""
>>> return False
>>>
>>> @abstractmethod
>>
^ permalink raw reply [flat|nested] 393+ messages in thread
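The split described in the quoted docstrings — the ABC exposing the public methods and a private abstract hook that subclasses must implement — can be sketched as follows. This is a simplified stand-in with made-up names, not the actual DTS classes:

```python
from abc import ABC, abstractmethod


class TrafficGenerator(ABC):
    """Simplified stand-in for the DTS base traffic generator."""

    def send_packets(self, packets: list[str], port: str) -> None:
        # Public entry point: shared bookkeeping (logging, etc.) lives here,
        # then the private hook does the actual work.
        print(f"sending {len(packets)} packet(s) on {port}")
        self._send_packets(packets, port)

    @abstractmethod
    def _send_packets(self, packets: list[str], port: str) -> None:
        """Subclasses implement the traffic generation logic here."""


class DummyTrafficGenerator(TrafficGenerator):
    """A fake subclass that only records what it was asked to send."""

    def __init__(self) -> None:
        self.sent: list[str] = []

    def _send_packets(self, packets: list[str], port: str) -> None:
        self.sent.extend(packets)


tg = DummyTrafficGenerator()
tg.send_packets(["pkt1", "pkt2"], "port0")
print(tg.sent)  # -> ['pkt1', 'pkt2']
```

The point of the pattern is that callers only ever touch `send_packets`, so the docstrings on the public methods are the user-facing API documentation, while the `_send_packets` docstring documents the subclassing contract.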
* Re: [PATCH v7 19/21] dts: base traffic generators docstring update
2023-11-22 11:56 ` Yoan Picchi
@ 2023-11-22 13:11 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 13:11 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Wed, Nov 22, 2023 at 1:05 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/22/23 11:38, Juraj Linkeš wrote:
> > On Tue, Nov 21, 2023 at 5:20 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
> >>
> >> On 11/15/23 13:09, Juraj Linkeš wrote:
> >>> Format according to the Google format and PEP257, with slight
> >>> deviations.
> >>>
> >>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> >>> ---
> >>> .../traffic_generator/__init__.py | 22 ++++++++-
> >>> .../capturing_traffic_generator.py | 46 +++++++++++--------
> >>> .../traffic_generator/traffic_generator.py | 33 +++++++------
> >>> 3 files changed, 68 insertions(+), 33 deletions(-)
> >>>
> >>> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> >>> index 11bfa1ee0f..51cca77da4 100644
> >>> --- a/dts/framework/testbed_model/traffic_generator/__init__.py
> >>> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> >>> @@ -1,6 +1,19 @@
> >>> # SPDX-License-Identifier: BSD-3-Clause
> >>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >>>
> >>> +"""DTS traffic generators.
> >>> +
> >>> +A traffic generator is capable of generating traffic and then monitor returning traffic.
> >>> +A traffic generator may just count the number of received packets
> >>> +and it may additionally capture individual packets.
> >>
> >> The sentence feels odd. Isn't it supposed to be "or" here? And there's
> >> no need for such an early line break.
> >>
> >
> > There are two mays, so there probably should be an or. But I'd like to
> > reword it to this:
> >
> > All traffic generators count the number of received packets, and they
> > may additionally
> > capture individual packets.
> >
> > What do you think?
>
> I think it's better with the new sentence. But I think it'd be even
> better to split into two sentences to highlight the must/may:
> All traffic generators must count the number of received packets. Some
> may additionally capture individual packets.
>
I like this, I'll reword it.
> >
> >>> +
> >>> +A traffic generator may be software running on generic hardware or it could be specialized hardware.
> >>> +
> >>> +The traffic generators that only count the number of received packets are suitable only for
> >>> +performance testing. In functional testing, we need to be able to dissect each arrived packet
> >>> +and a capturing traffic generator is required.
> >>> +"""
> >>> +
> >>> from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
> >>> from framework.exception import ConfigurationError
> >>> from framework.testbed_model.node import Node
> >>> @@ -12,8 +25,15 @@
> >>> def create_traffic_generator(
> >>> tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
> >>> ) -> CapturingTrafficGenerator:
> >>> - """A factory function for creating traffic generator object from user config."""
> >>> + """The factory function for creating traffic generator objects from the test run configuration.
> >>> +
> >>> + Args:
> >>> + tg_node: The traffic generator node where the created traffic generator will be running.
> >>> + traffic_generator_config: The traffic generator config.
> >>>
> >>> + Returns:
> >>> + A traffic generator capable of capturing received packets.
> >>> + """
> >>> match traffic_generator_config.traffic_generator_type:
> >>> case TrafficGeneratorType.SCAPY:
> >>> return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> >>> diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> >>> index e521211ef0..b0a43ad003 100644
> >>> --- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> >>> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> >>> @@ -23,19 +23,22 @@
> >>>
> >>>
> >>> def _get_default_capture_name() -> str:
> >>> - """
> >>> - This is the function used for the default implementation of capture names.
> >>> - """
> >>> return str(uuid.uuid4())
> >>>
> >>>
> >>> class CapturingTrafficGenerator(TrafficGenerator):
> >>> """Capture packets after sending traffic.
> >>>
> >>> - A mixin interface which enables a packet generator to declare that it can capture
> >>> + The intermediary interface which enables a packet generator to declare that it can capture
> >>> packets and return them to the user.
> >>>
> >>> + Similarly to
> >>> + :class:`~framework.testbed_model.traffic_generator.traffic_generator.TrafficGenerator`,
> >>> + this class exposes the public methods specific to capturing traffic generators and defines
> >>> + a private method that must implement the traffic generation and capturing logic in subclasses.
> >>> +
> >>> The methods of capturing traffic generators obey the following workflow:
> >>> +
> >>> 1. send packets
> >>> 2. capture packets
> >>> 3. write the capture to a .pcap file
> >>> @@ -44,6 +47,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
> >>>
> >>> @property
> >>> def is_capturing(self) -> bool:
> >>> + """This traffic generator can capture traffic."""
> >>> return True
> >>>
> >>> def send_packet_and_capture(
> >>> @@ -54,11 +58,12 @@ def send_packet_and_capture(
> >>> duration: float,
> >>> capture_name: str = _get_default_capture_name(),
> >>> ) -> list[Packet]:
> >>> - """Send a packet, return received traffic.
> >>> + """Send `packet` and capture received traffic.
> >>> +
> >>> + Send `packet` on `send_port` and then return all traffic captured
> >>> + on `receive_port` for the given `duration`.
> >>>
> >>> - Send a packet on the send_port and then return all traffic captured
> >>> - on the receive_port for the given duration. Also record the captured traffic
> >>> - in a pcap file.
> >>> + The captured traffic is recorded in the `capture_name`.pcap file.
> >>>
> >>> Args:
> >>> packet: The packet to send.
> >>> @@ -68,7 +73,7 @@ def send_packet_and_capture(
> >>> capture_name: The name of the .pcap file where to store the capture.
> >>>
> >>> Returns:
> >>> - A list of received packets. May be empty if no packets are captured.
> >>> + The received packets. May be empty if no packets are captured.
> >>> """
> >>> return self.send_packets_and_capture(
> >>> [packet], send_port, receive_port, duration, capture_name
> >>> @@ -82,11 +87,14 @@ def send_packets_and_capture(
> >>> duration: float,
> >>> capture_name: str = _get_default_capture_name(),
> >>> ) -> list[Packet]:
> >>> - """Send packets, return received traffic.
> >>> + """Send `packets` and capture received traffic.
> >>>
> >>> - Send packets on the send_port and then return all traffic captured
> >>> - on the receive_port for the given duration. Also record the captured traffic
> >>> - in a pcap file.
> >>> + Send `packets` on `send_port` and then return all traffic captured
> >>> + on `receive_port` for the given `duration`.
> >>> +
> >>> + The captured traffic is recorded in the `capture_name`.pcap file. The target directory
> >>> + can be configured with the :option:`--output-dir` command line argument or
> >>> + the :envvar:`DTS_OUTPUT_DIR` environment variable.
> >>>
> >>> Args:
> >>> packets: The packets to send.
> >>> @@ -96,7 +104,7 @@ def send_packets_and_capture(
> >>> capture_name: The name of the .pcap file where to store the capture.
> >>>
> >>> Returns:
> >>> - A list of received packets. May be empty if no packets are captured.
> >>> + The received packets. May be empty if no packets are captured.
> >>> """
> >>> self._logger.debug(get_packet_summaries(packets))
> >>> self._logger.debug(
> >>> @@ -124,10 +132,12 @@ def _send_packets_and_capture(
> >>> receive_port: Port,
> >>> duration: float,
> >>> ) -> list[Packet]:
> >>> - """
> >>> - The extended classes must implement this method which
> >>> - sends packets on send_port and receives packets on the receive_port
> >>> - for the specified duration. It must be able to handle no received packets.
> >>> + """The implementation of :method:`send_packets_and_capture`.
> >>> +
> >>> + The subclasses must implement this method which sends `packets` on `send_port`
> >>> + and receives packets on `receive_port` for the specified `duration`.
> >>> +
> >>> + It must be able to handle no received packets.
> >>
> >> This sentence feels odd too. Maybe "It must be able to handle receiving
> >> no packets."
> >>
> >
> > Right, your suggestion is better.
> >
> >>> """
> >>>
> >>> def _write_capture_from_packets(
> >>> diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> >>> index ea7c3963da..ed396c6a2f 100644
> >>> --- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> >>> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> >>> @@ -22,7 +22,8 @@
> >>> class TrafficGenerator(ABC):
> >>> """The base traffic generator.
> >>>
> >>> - Defines the few basic methods that each traffic generator must implement.
> >>> + Exposes the common public methods of all traffic generators and defines private methods
> >>> + that must implement the traffic generation logic in subclasses.
> >>> """
> >>>
> >>> _config: TrafficGeneratorConfig
> >>> @@ -30,6 +31,12 @@ class TrafficGenerator(ABC):
> >>> _logger: DTSLOG
> >>>
> >>> def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> >>> + """Initialize the traffic generator.
> >>> +
> >>> + Args:
> >>> + tg_node: The traffic generator node where the created traffic generator will be running.
> >>> + config: The traffic generator's test run configuration.
> >>> + """
> >>> self._config = config
> >>> self._tg_node = tg_node
> >>> self._logger = getLogger(
> >>> @@ -37,9 +44,9 @@ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> >>> )
> >>>
> >>> def send_packet(self, packet: Packet, port: Port) -> None:
> >>> - """Send a packet and block until it is fully sent.
> >>> + """Send `packet` and block until it is fully sent.
> >>>
> >>> - What fully sent means is defined by the traffic generator.
> >>> + Send `packet` on `port`, then wait until `packet` is fully sent.
> >>>
> >>> Args:
> >>> packet: The packet to send.
> >>> @@ -48,9 +55,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
> >>> self.send_packets([packet], port)
> >>>
> >>> def send_packets(self, packets: list[Packet], port: Port) -> None:
> >>> - """Send packets and block until they are fully sent.
> >>> + """Send `packets` and block until they are fully sent.
> >>>
> >>> - What fully sent means is defined by the traffic generator.
> >>> + Send `packets` on `port`, then wait until `packets` are fully sent.
> >>>
> >>> Args:
> >>> packets: The packets to send.
> >>> @@ -62,19 +69,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
> >>>
> >>> @abstractmethod
> >>> def _send_packets(self, packets: list[Packet], port: Port) -> None:
> >>> - """
> >>> - The extended classes must implement this method which
> >>> - sends packets on send_port. The method should block until all packets
> >>> - are fully sent.
> >>> + """The implementation of :method:`send_packets`.
> >>> +
> >>> + The subclasses must implement this method which sends `packets` on `port`.
> >>> + The method should block until all `packets` are fully sent.
> >>> +
> >>> + What full sent means is defined by the traffic generator.
> >>
> >> full -> fully
> >>
> >>> """
> >>>
> >>> @property
> >>> def is_capturing(self) -> bool:
> >>> - """Whether this traffic generator can capture traffic.
> >>> -
> >>> - Returns:
> >>> - True if the traffic generator can capture traffic, False otherwise.
> >>> - """
> >>> + """This traffic generator can't capture traffic."""
> >>> return False
> >>>
> >>> @abstractmethod
> >>
>
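One Python detail worth keeping in mind with the `capture_name: str = _get_default_capture_name()` signatures quoted above: a default-argument expression is evaluated once, when the `def` statement runs, not on every call. A small illustration (hypothetical names, not the DTS code):

```python
import uuid


def _get_default_capture_name() -> str:
    return str(uuid.uuid4())


# The default below is computed once, at definition time, and then reused
# for every call that omits the argument.
def send_and_capture(capture_name: str = _get_default_capture_name()) -> str:
    return capture_name


assert send_and_capture() == send_and_capture()  # same name on every call
assert send_and_capture("run1") == "run1"        # explicit argument overrides it
```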
* [PATCH v7 20/21] dts: scapy tg docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (18 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 19/21] dts: base traffic generators " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-21 16:33 ` Yoan Picchi
2023-11-15 13:09 ` [PATCH v7 21/21] dts: test suites " Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
1 file changed, 54 insertions(+), 37 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
index 51864b6e6b..ed4f879925 100644
--- a/dts/framework/testbed_model/traffic_generator/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -2,14 +2,15 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""Scapy traffic generator.
+"""The Scapy traffic generator.
-Traffic generator used for functional testing, implemented using the Scapy library.
+A traffic generator used for functional testing, implemented with
+`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
-The XML-RPC server runs in an interactive remote SSH session running Python console,
-where we start the server. The communication with the server is facilitated with
-a local server proxy.
+The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
+in an interactive remote Python SSH session. The communication with the server is facilitated
+with a local server proxy from the :mod:`xmlrpc.client` module.
"""
import inspect
@@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
recv_iface: str,
duration: float,
) -> list[bytes]:
- """RPC function to send and capture packets.
+ """The RPC function to send and capture packets.
- The function is meant to be executed on the remote TG node.
+ The function is meant to be executed on the remote TG node via the server proxy.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
recv_iface: The logical name of the ingress interface.
duration: Capture for this amount of time, in seconds.
Returns:
A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
+ to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
sniffer = scapy.all.AsyncSniffer(
@@ -98,19 +99,15 @@ def scapy_send_packets_and_capture(
def scapy_send_packets(
xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str
) -> None:
- """RPC function to send packets.
+ """The RPC function to send packets.
- The function is meant to be executed on the remote TG node.
- It doesn't return anything, only sends packets.
+ The function is meant to be executed on the remote TG node via the server proxy.
+ It only sends `xmlrpc_packets`, without capturing them.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
-
- Returns:
- A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
@@ -130,11 +127,19 @@ def scapy_send_packets(
class QuittableXMLRPCServer(SimpleXMLRPCServer):
- """Basic XML-RPC server that may be extended
- by functions serializable by the marshal module.
+ r"""Basic XML-RPC server.
+
+ The server may be augmented by functions serializable by the :mod:`marshal` module.
"""
def __init__(self, *args, **kwargs):
+ """Extend the XML-RPC server initialization.
+
+ Args:
+ args: The positional arguments that will be passed to the superclass's constructor.
+ kwargs: The keyword arguments that will be passed to the superclass's constructor.
+ The `allow_none` argument will be set to :data:`True`.
+ """
kwargs["allow_none"] = True
super().__init__(*args, **kwargs)
self.register_introspection_functions()
@@ -142,13 +147,12 @@ def __init__(self, *args, **kwargs):
self.register_function(self.add_rpc_function)
def quit(self) -> None:
+ """Quit the server."""
self._BaseServer__shutdown_request = True
return None
def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
- """Add a function to the server.
-
- This is meant to be executed remotely.
+ """Add a function to the server from the local server proxy.
Args:
name: The name of the function.
@@ -159,6 +163,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
self.register_function(function)
def serve_forever(self, poll_interval: float = 0.5) -> None:
+ """Extend the superclass method with an additional print.
+
+ Once executed in the local server proxy, the print gives us a clear string to expect
+ when starting the server. The print means the function was executed on the XML-RPC server.
+ """
print("XMLRPC OK")
super().serve_forever(poll_interval)
@@ -166,19 +175,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
class ScapyTrafficGenerator(CapturingTrafficGenerator):
"""Provides access to scapy functions via an RPC interface.
- The traffic generator first starts an XML-RPC on the remote TG node.
- Then it populates the server with functions which use the Scapy library
- to send/receive traffic.
-
- Any packets sent to the remote server are first converted to bytes.
- They are received as xmlrpc.client.Binary objects on the server side.
- When the server sends the packets back, they are also received as
- xmlrpc.client.Binary object on the client side, are converted back to Scapy
- packets and only then returned from the methods.
+ The class extends the base with remote execution of scapy functions.
- Arguments:
- tg_node: The node where the traffic generator resides.
- config: The user configuration of the traffic generator.
+ Any packets sent to the remote server are first converted to bytes. They are received as
+ :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
+ back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
+ converted back to :class:`scapy.packet.Packet` objects and only then returned from the methods.
Attributes:
session: The exclusive interactive remote session created by the Scapy
@@ -192,6 +194,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
_config: ScapyTrafficGeneratorConfig
def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ """Extend the constructor with Scapy TG specifics.
+
+ The traffic generator first starts an XML-RPC server on the remote `tg_node`.
+ Then it populates the server with functions which use the Scapy library
+ to send/receive traffic:
+
+ * :func:`scapy_send_packets_and_capture`
+ * :func:`scapy_send_packets`
+
+ To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
+ command line argument or the :envvar:`DTS_VERBOSE` environment variable.
+
+ Args:
+ tg_node: The node where the traffic generator resides.
+ config: The traffic generator's test run configuration.
+ """
super().__init__(tg_node, config)
assert (
@@ -237,10 +255,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
[line for line in src.splitlines() if not line.isspace() and line != ""]
)
- spacing = "\n" * 4
-
# execute it in the python terminal
- self.session.send_command(spacing + src + spacing)
+ self.session.send_command(src + "\n")
self.session.send_command(
f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));"
f"server.serve_forever()",
@@ -274,6 +290,7 @@ def _send_packets_and_capture(
return scapy_packets
def close(self) -> None:
+ """Close the traffic generator."""
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
--
2.34.1
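The `add_rpc_function`/`marshal` mechanism described in the patch — shipping a function to the remote server as serialized bytecode — can be sketched like this. It is a simplified sketch, not the actual DTS code; note that `marshal` carries only the code object, so the shipped function must be self-contained:

```python
import marshal
import types


def greet(name: str) -> str:
    return "hello " + name


# Client side: serialize only the code object. Globals and closures are not
# included, which is why only self-contained functions can be shipped.
function_bytes = marshal.dumps(greet.__code__)

# Server side: rebuild a callable from the received bytes and register it.
code = marshal.loads(function_bytes)
rebuilt = types.FunctionType(code, globals(), "greet")
print(rebuilt("world"))  # -> hello world
```

In the patch, `function_bytes` travels inside an `xmlrpc.client.Binary` object, and the rebuilt function is then passed to `register_function` on the server.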
* Re: [PATCH v7 20/21] dts: scapy tg docstring update
2023-11-15 13:09 ` [PATCH v7 20/21] dts: scapy tg " Juraj Linkeš
@ 2023-11-21 16:33 ` Yoan Picchi
2023-11-22 13:18 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-21 16:33 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> .../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
> 1 file changed, 54 insertions(+), 37 deletions(-)
>
> diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
> index 51864b6e6b..ed4f879925 100644
> --- a/dts/framework/testbed_model/traffic_generator/scapy.py
> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> @@ -2,14 +2,15 @@
> # Copyright(c) 2022 University of New Hampshire
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> -"""Scapy traffic generator.
> +"""The Scapy traffic generator.
>
> -Traffic generator used for functional testing, implemented using the Scapy library.
> +A traffic generator used for functional testing, implemented with
> +`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
> The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
>
> -The XML-RPC server runs in an interactive remote SSH session running Python console,
> -where we start the server. The communication with the server is facilitated with
> -a local server proxy.
> +The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
> +in an interactive remote Python SSH session. The communication with the server is facilitated
> +with a local server proxy from the :mod:`xmlrpc.client` module.
> """
>
> import inspect
> @@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
> recv_iface: str,
> duration: float,
> ) -> list[bytes]:
> - """RPC function to send and capture packets.
> + """The RPC function to send and capture packets.
>
> - The function is meant to be executed on the remote TG node.
> + The function is meant to be executed on the remote TG node via the server proxy.
>
> Args:
> xmlrpc_packets: The packets to send. These need to be converted to
> - xmlrpc.client.Binary before sending to the remote server.
> + :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
The string is not raw and has no \s. As per your explanation a few commits
earlier, might the tilde cause an issue here?
Looking around, I see it also happens several times here and in the
previous commit.
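For reference, `~` is an ordinary character in Python strings, so the tilde in the Sphinx cross-references is safe without a raw string; raw strings only matter when a docstring contains a backslash sequence such as `\s`. A quick sketch:

```python
# The tilde needs no escaping; this plain (non-raw) docstring is fine.
doc_with_tilde = """See :class:`~xmlrpc.client.Binary`."""
assert "~" in doc_with_tilde

# A backslash sequence like \s is an invalid escape in a plain string
# (a SyntaxWarning today, an error in future Python); use r"..." or \\.
raw = r"matches \s whitespace"
escaped = "matches \\s whitespace"
assert raw == escaped
```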
> send_iface: The logical name of the egress interface.
> recv_iface: The logical name of the ingress interface.
> duration: Capture for this amount of time, in seconds.
>
> Returns:
> A list of bytes. Each item in the list represents one packet, which needs
> - to be converted back upon transfer from the remote node.
> + to be converted back upon transfer from the remote node.
> """
> scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
> sniffer = scapy.all.AsyncSniffer(
> @@ -98,19 +99,15 @@ def scapy_send_packets_and_capture(
> def scapy_send_packets(
> xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str
> ) -> None:
> - """RPC function to send packets.
> + """The RPC function to send packets.
>
> - The function is meant to be executed on the remote TG node.
> - It doesn't return anything, only sends packets.
> + The function is meant to be executed on the remote TG node via the server proxy.
> + It only sends `xmlrpc_packets`, without capturing them.
>
> Args:
> xmlrpc_packets: The packets to send. These need to be converted to
> - xmlrpc.client.Binary before sending to the remote server.
> + :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
> send_iface: The logical name of the egress interface.
> -
> - Returns:
> - A list of bytes. Each item in the list represents one packet, which needs
> - to be converted back upon transfer from the remote node.
> """
> scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
> scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
> @@ -130,11 +127,19 @@ def scapy_send_packets(
>
>
> class QuittableXMLRPCServer(SimpleXMLRPCServer):
> - """Basic XML-RPC server that may be extended
> - by functions serializable by the marshal module.
> + r"""Basic XML-RPC server.
But you have a raw string here, and I don't see why it's needed.
> +
> + The server may be augmented by functions serializable by the :mod:`marshal` module.
> """
>
> def __init__(self, *args, **kwargs):
> + """Extend the XML-RPC server initialization.
> +
> + Args:
> + args: The positional arguments that will be passed to the superclass's constructor.
> + kwargs: The keyword arguments that will be passed to the superclass's constructor.
> + The `allow_none` argument will be set to :data:`True`.
> + """
> kwargs["allow_none"] = True
> super().__init__(*args, **kwargs)
> self.register_introspection_functions()
> @@ -142,13 +147,12 @@ def __init__(self, *args, **kwargs):
> self.register_function(self.add_rpc_function)
>
> def quit(self) -> None:
> + """Quit the server."""
> self._BaseServer__shutdown_request = True
> return None
>
> def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
> - """Add a function to the server.
> -
> - This is meant to be executed remotely.
> + """Add a function to the server from the local server proxy.
>
> Args:
> name: The name of the function.
> @@ -159,6 +163,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
> self.register_function(function)
>
> def serve_forever(self, poll_interval: float = 0.5) -> None:
> + """Extend the superclass method with an additional print.
> +
> + Once executed in the local server proxy, the print gives us a clear string to expect
> + when starting the server. The print means the function was executed on the XML-RPC server.
> + """
> print("XMLRPC OK")
> super().serve_forever(poll_interval)
>
> @@ -166,19 +175,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
> class ScapyTrafficGenerator(CapturingTrafficGenerator):
> """Provides access to scapy functions via an RPC interface.
>
> - The traffic generator first starts an XML-RPC on the remote TG node.
> - Then it populates the server with functions which use the Scapy library
> - to send/receive traffic.
> -
> - Any packets sent to the remote server are first converted to bytes.
> - They are received as xmlrpc.client.Binary objects on the server side.
> - When the server sends the packets back, they are also received as
> - xmlrpc.client.Binary object on the client side, are converted back to Scapy
> - packets and only then returned from the methods.
> + The class extends the base with remote execution of scapy functions.
>
> - Arguments:
> - tg_node: The node where the traffic generator resides.
> - config: The user configuration of the traffic generator.
> + Any packets sent to the remote server are first converted to bytes. They are received as
> + :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
> + back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
> + converted back to :class:`scapy.packet.Packet` objects and only then returned from the methods.
>
> Attributes:
> session: The exclusive interactive remote session created by the Scapy
> @@ -192,6 +194,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
> _config: ScapyTrafficGeneratorConfig
>
> def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
> + """Extend the constructor with Scapy TG specifics.
> +
> + The traffic generator first starts an XML-RPC on the remote `tg_node`.
> + Then it populates the server with functions which use the Scapy library
> + to send/receive traffic:
> +
> + * :func:`scapy_send_packets_and_capture`
> + * :func:`scapy_send_packets`
> +
> + To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
> + command line argument or the :envvar:`DTS_VERBOSE` environment variable.
> +
> + Args:
> + tg_node: The node where the traffic generator resides.
> + config: The traffic generator's test run configuration.
> + """
> super().__init__(tg_node, config)
>
> assert (
> @@ -237,10 +255,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
> [line for line in src.splitlines() if not line.isspace() and line != ""]
> )
>
> - spacing = "\n" * 4
> -
> # execute it in the python terminal
> - self.session.send_command(spacing + src + spacing)
> + self.session.send_command(src + "\n")
> self.session.send_command(
> f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));"
> f"server.serve_forever()",
> @@ -274,6 +290,7 @@ def _send_packets_and_capture(
> return scapy_packets
>
> def close(self) -> None:
> + """Close the traffic generator."""
> try:
> self.rpc_server_proxy.quit()
> except ConnectionRefusedError:
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v7 20/21] dts: scapy tg docstring update
2023-11-21 16:33 ` Yoan Picchi
@ 2023-11-22 13:18 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 13:18 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Tue, Nov 21, 2023 at 5:33 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > .../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
> > 1 file changed, 54 insertions(+), 37 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
> > index 51864b6e6b..ed4f879925 100644
> > --- a/dts/framework/testbed_model/traffic_generator/scapy.py
> > +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> > @@ -2,14 +2,15 @@
> > # Copyright(c) 2022 University of New Hampshire
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >
> > -"""Scapy traffic generator.
> > +"""The Scapy traffic generator.
> >
> > -Traffic generator used for functional testing, implemented using the Scapy library.
> > +A traffic generator used for functional testing, implemented with
> > +`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
> > The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
> >
> > -The XML-RPC server runs in an interactive remote SSH session running Python console,
> > -where we start the server. The communication with the server is facilitated with
> > -a local server proxy.
> > +The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
> > +in an interactive remote Python SSH session. The communication with the server is facilitated
> > +with a local server proxy from the :mod:`xmlrpc.client` module.
> > """
> >
> > import inspect
> > @@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
> > recv_iface: str,
> > duration: float,
> > ) -> list[bytes]:
> > - """RPC function to send and capture packets.
> > + """The RPC function to send and capture packets.
> >
> > - The function is meant to be executed on the remote TG node.
> > + The function is meant to be executed on the remote TG node via the server proxy.
> >
> > Args:
> > xmlrpc_packets: The packets to send. These need to be converted to
> > - xmlrpc.client.Binary before sending to the remote server.
> > + :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
>
> The string is not raw and there are no \s escapes. As per your explanation
> a few commits earlier, might this cause an issue with the tilde?
> Looking around, I see it also happens several times here and in the
> previous commit.
>
The issue is not with the tilde, but with backticks. When backticks
are followed by certain characters (I mentioned alphanumeric
characters, but there may be others), the character right after the
backtick must be escaped, and the raw string results in the right
escaping for Sphinx.
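To illustrate (a minimal sketch with hypothetical docstrings, not code from the patch): reST requires a backslash escape when inline markup like `tg_node` is immediately followed by a letter, and a raw docstring guarantees that such backslashes reach Sphinx literally, since Python never interprets escape sequences in it:

```python
# Hypothetical docstrings showing why a raw string can matter for Sphinx.
# In the plain version, Python interprets "\n" as a newline, so the text
# Sphinx sees differs from what was written; in the raw version every
# backslash stays literal, including the reST escape after the backtick.

def plain() -> None:
    """escaped plural: `packet`\\ s, and a \n sequence"""

def raw() -> None:
    r"""escaped plural: `packet`\ s, and a \n sequence"""

# "\n" became a real newline only in the non-raw docstring:
assert "\n" in plain.__doc__
assert "\n" not in raw.__doc__
assert "\\n" in raw.__doc__
```

So the raw prefix only matters once a docstring actually contains backslashes; for plain ``:class:`~...``` references followed by a space, it is not needed.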
> > send_iface: The logical name of the egress interface.
> > recv_iface: The logical name of the ingress interface.
> > duration: Capture for this amount of time, in seconds.
> >
> > Returns:
> > A list of bytes. Each item in the list represents one packet, which needs
> > - to be converted back upon transfer from the remote node.
> > + to be converted back upon transfer from the remote node.
> > """
> > scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
> > sniffer = scapy.all.AsyncSniffer(
> > @@ -98,19 +99,15 @@ def scapy_send_packets_and_capture(
> > def scapy_send_packets(
> > xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str
> > ) -> None:
> > - """RPC function to send packets.
> > + """The RPC function to send packets.
> >
> > - The function is meant to be executed on the remote TG node.
> > - It doesn't return anything, only sends packets.
> > + The function is meant to be executed on the remote TG node via the server proxy.
> > + It only sends `xmlrpc_packets`, without capturing them.
> >
> > Args:
> > xmlrpc_packets: The packets to send. These need to be converted to
> > - xmlrpc.client.Binary before sending to the remote server.
> > + :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
> > send_iface: The logical name of the egress interface.
> > -
> > - Returns:
> > - A list of bytes. Each item in the list represents one packet, which needs
> > - to be converted back upon transfer from the remote node.
> > """
> > scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
> > scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
> > @@ -130,11 +127,19 @@ def scapy_send_packets(
> >
> >
> > class QuittableXMLRPCServer(SimpleXMLRPCServer):
> > - """Basic XML-RPC server that may be extended
> > - by functions serializable by the marshal module.
> > + r"""Basic XML-RPC server.
>
> But you have a raw string here, and I don't see why it's needed.
>
There is no need here, I'll remove it. It's a remnant from a much
bigger docstring which caused issues when sending the code to the TG
node. I've talked to Jeremy and we'll fix it in a separate patch which
will introduce the full docstring.
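For context, a sketch (with an assumed stand-in function, not the patch's actual scapy code) of the marshal round-trip this server relies on when functions are added from the local proxy: the client serializes a function's code object to bytes, and the server rebuilds a callable from them, as in add_rpc_function:

```python
import marshal
import types

# Stand-in for one of the scapy RPC functions sent to the remote server.
def send_stub(packets, iface):
    return f"sending {len(packets)} packets on {iface}"

# Client side: code object -> bytes (these are what get wrapped
# in xmlrpc.client.Binary before the XML-RPC call).
payload = marshal.dumps(send_stub.__code__)

# Server side: bytes -> code object -> callable, then registered
# with the XML-RPC server.
rebuilt = types.FunctionType(marshal.loads(payload), globals())
assert rebuilt(["p1", "p2"], "eth0") == "sending 2 packets on eth0"
```

Only the code object travels, not the closure or globals, which is why the transferred functions must be self-contained and serializable by marshal.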
> > +
> > + The server may be augmented by functions serializable by the :mod:`marshal` module.
> > """
> >
> > def __init__(self, *args, **kwargs):
> > + """Extend the XML-RPC server initialization.
> > +
> > + Args:
> > + args: The positional arguments that will be passed to the superclass's constructor.
> > + kwargs: The keyword arguments that will be passed to the superclass's constructor.
> > + The `allow_none` argument will be set to :data:`True`.
> > + """
> > kwargs["allow_none"] = True
> > super().__init__(*args, **kwargs)
> > self.register_introspection_functions()
> > @@ -142,13 +147,12 @@ def __init__(self, *args, **kwargs):
> > self.register_function(self.add_rpc_function)
> >
> > def quit(self) -> None:
> > + """Quit the server."""
> > self._BaseServer__shutdown_request = True
> > return None
> >
> > def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
> > - """Add a function to the server.
> > -
> > - This is meant to be executed remotely.
> > + """Add a function to the server from the local server proxy.
> >
> > Args:
> > name: The name of the function.
> > @@ -159,6 +163,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
> > self.register_function(function)
> >
> > def serve_forever(self, poll_interval: float = 0.5) -> None:
> > + """Extend the superclass method with an additional print.
> > +
> > + Once executed in the local server proxy, the print gives us a clear string to expect
> > + when starting the server. The print means the function was executed on the XML-RPC server.
> > + """
> > print("XMLRPC OK")
> > super().serve_forever(poll_interval)
> >
> > @@ -166,19 +175,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
> > class ScapyTrafficGenerator(CapturingTrafficGenerator):
> > """Provides access to scapy functions via an RPC interface.
> >
> > - The traffic generator first starts an XML-RPC on the remote TG node.
> > - Then it populates the server with functions which use the Scapy library
> > - to send/receive traffic.
> > -
> > - Any packets sent to the remote server are first converted to bytes.
> > - They are received as xmlrpc.client.Binary objects on the server side.
> > - When the server sends the packets back, they are also received as
> > - xmlrpc.client.Binary object on the client side, are converted back to Scapy
> > - packets and only then returned from the methods.
> > + The class extends the base with remote execution of scapy functions.
> >
> > - Arguments:
> > - tg_node: The node where the traffic generator resides.
> > - config: The user configuration of the traffic generator.
> > + Any packets sent to the remote server are first converted to bytes. They are received as
> > + :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
> > + back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
> > + converted back to :class:`scapy.packet.Packet` objects and only then returned from the methods.
> >
> > Attributes:
> > session: The exclusive interactive remote session created by the Scapy
> > @@ -192,6 +194,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
> > _config: ScapyTrafficGeneratorConfig
> >
> > def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
> > + """Extend the constructor with Scapy TG specifics.
> > +
> > + The traffic generator first starts an XML-RPC on the remote `tg_node`.
> > + Then it populates the server with functions which use the Scapy library
> > + to send/receive traffic:
> > +
> > + * :func:`scapy_send_packets_and_capture`
> > + * :func:`scapy_send_packets`
> > +
> > + To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
> > + command line argument or the :envvar:`DTS_VERBOSE` environment variable.
> > +
> > + Args:
> > + tg_node: The node where the traffic generator resides.
> > + config: The traffic generator's test run configuration.
> > + """
> > super().__init__(tg_node, config)
> >
> > assert (
> > @@ -237,10 +255,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
> > [line for line in src.splitlines() if not line.isspace() and line != ""]
> > )
> >
> > - spacing = "\n" * 4
> > -
> > # execute it in the python terminal
> > - self.session.send_command(spacing + src + spacing)
> > + self.session.send_command(src + "\n")
> > self.session.send_command(
> > f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));"
> > f"server.serve_forever()",
> > @@ -274,6 +290,7 @@ def _send_packets_and_capture(
> > return scapy_packets
> >
> > def close(self) -> None:
> > + """Close the traffic generator."""
> > try:
> > self.rpc_server_proxy.quit()
> > except ConnectionRefusedError:
>
* [PATCH v7 21/21] dts: test suites docstring update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (19 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 20/21] dts: scapy tg " Juraj Linkeš
@ 2023-11-15 13:09 ` Juraj Linkeš
2023-11-16 17:36 ` Yoan Picchi
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:09 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/tests/TestSuite_hello_world.py | 16 +++++----
dts/tests/TestSuite_os_udp.py | 19 +++++++----
dts/tests/TestSuite_smoke_tests.py | 53 +++++++++++++++++++++++++++---
3 files changed, 70 insertions(+), 18 deletions(-)
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 7e3d95c0cf..662a8f8726 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
-"""
+"""The DPDK hello world app test suite.
+
Run the helloworld example app and verify it prints a message for each used core.
No other EAL parameters apart from cores are used.
"""
@@ -15,22 +16,25 @@
class TestHelloWorld(TestSuite):
+ """DPDK hello world app test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Build the app we're about to test - helloworld.
"""
self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
def test_hello_world_single_core(self) -> None:
- """
+ """Single core test case.
+
Steps:
Run the helloworld app on the first usable logical core.
Verify:
The app prints a message from the used core:
"hello from core <core_id>"
"""
-
# get the first usable core
lcore_amount = LogicalCoreCount(1, 1, 1)
lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
@@ -44,14 +48,14 @@ def test_hello_world_single_core(self) -> None:
)
def test_hello_world_all_cores(self) -> None:
- """
+ """All cores test case.
+
Steps:
Run the helloworld app on all usable logical cores.
Verify:
The app prints a message from all used cores:
"hello from core <core_id>"
"""
-
# get the maximum logical core number
eal_para = self.sut_node.create_eal_parameters(
lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
index bf6b93deb5..e0c5239612 100644
--- a/dts/tests/TestSuite_os_udp.py
+++ b/dts/tests/TestSuite_os_udp.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
+"""Basic IPv4 OS routing test suite.
+
Configure SUT node to route traffic from if1 to if2.
Send a packet to the SUT node, verify it comes back on the second port on the TG node.
"""
@@ -13,24 +14,27 @@
class TestOSUdp(TestSuite):
+ """IPv4 UDP OS routing test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
- Configure SUT ports and SUT to route traffic from if1 to if2.
+ Bind the SUT ports to the OS driver, configure the ports and configure the SUT
+ to route traffic from if1 to if2.
"""
- # This test uses kernel drivers
self.sut_node.bind_ports_to_driver(for_dpdk=False)
self.configure_testbed_ipv4()
def test_os_udp(self) -> None:
- """
+ """Basic UDP IPv4 traffic test case.
+
Steps:
Send a UDP packet.
Verify:
The packet with proper addresses arrives at the other TG port.
"""
-
packet = Ether() / IP() / UDP()
received_packets = self.send_packet_and_capture(packet)
@@ -40,7 +44,8 @@ def test_os_udp(self) -> None:
self.verify_packets(expected_packet, received_packets)
def tear_down_suite(self) -> None:
- """
+ """Tear down the test suite.
+
Teardown:
Remove the SUT port configuration configured in setup.
"""
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index e8016d1b54..6fae099a0e 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -1,6 +1,17 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Smoke test suite.
+
+Smoke tests are a class of tests which are used for validating a minimal set of important features.
+These are the most important features without which (or when they're faulty) the software wouldn't
+work properly. Thus, if any failure occurs while testing these features,
+there isn't that much of a reason to continue testing, as the software is fundamentally broken.
+
+These tests don't have to include only DPDK tests, as the reason for failures could be
+in the infrastructure (a faulty link between NICs or a misconfiguration).
+"""
+
import re
from framework.config import PortConfig
@@ -11,13 +22,25 @@
class SmokeTests(TestSuite):
+ """DPDK and infrastructure smoke test suite.
+
+ The test cases validate the most basic DPDK functionality needed for all other test suites.
+ The infrastructure also needs to be tested, as that is also used by all other test suites.
+
+ Attributes:
+ is_blocking: This test suite will block the execution of all other test suites
+ in the build target after it.
+ nics_in_node: The NICs present on the SUT node.
+ """
+
is_blocking = True
# dicts in this list are expected to have two keys:
# "pci_address" and "current_driver"
nics_in_node: list[PortConfig] = []
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Set the build directory path and generate a list of NICs in the SUT node.
"""
@@ -25,7 +48,13 @@ def set_up_suite(self) -> None:
self.nics_in_node = self.sut_node.config.ports
def test_unit_tests(self) -> None:
- """
+ """DPDK meson fast-tests unit tests.
+
+ The DPDK unit tests are basic tests that indicate regressions and other critical failures.
+ These need to be addressed before other testing.
+
+ The fast-tests unit tests are a subset with only the most basic tests.
+
Test:
Run the fast-test unit-test suite through meson.
"""
@@ -37,7 +66,14 @@ def test_unit_tests(self) -> None:
)
def test_driver_tests(self) -> None:
- """
+ """DPDK meson driver-tests unit tests.
+
+ The DPDK unit tests are basic tests that indicate regressions and other critical failures.
+ These need to be addressed before other testing.
+
+ The driver-tests unit tests are a subset that test only drivers. These may be run
+ with virtual devices as well.
+
Test:
Run the driver-test unit-test suite through meson.
"""
@@ -63,7 +99,10 @@ def test_driver_tests(self) -> None:
)
def test_devices_listed_in_testpmd(self) -> None:
- """
+ """Testpmd device discovery.
+
+ If the configured devices can't be found in testpmd, they can't be tested.
+
Test:
Uses testpmd driver to verify that devices have been found by testpmd.
"""
@@ -79,7 +118,11 @@ def test_devices_listed_in_testpmd(self) -> None:
)
def test_device_bound_to_driver(self) -> None:
- """
+ """Device driver in OS.
+
+ The devices must be bound to the proper driver, otherwise they can't be used by DPDK
+ or the traffic generators.
+
Test:
Ensure that all drivers listed in the config are bound to the correct
driver.
--
2.34.1
* Re: [PATCH v7 21/21] dts: test suites docstring update
2023-11-15 13:09 ` [PATCH v7 21/21] dts: test suites " Juraj Linkeš
@ 2023-11-16 17:36 ` Yoan Picchi
2023-11-20 10:17 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-16 17:36 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek
Cc: dev
On 11/15/23 13:09, Juraj Linkeš wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/tests/TestSuite_hello_world.py | 16 +++++----
> dts/tests/TestSuite_os_udp.py | 19 +++++++----
> dts/tests/TestSuite_smoke_tests.py | 53 +++++++++++++++++++++++++++---
> 3 files changed, 70 insertions(+), 18 deletions(-)
>
> diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
> index 7e3d95c0cf..662a8f8726 100644
> --- a/dts/tests/TestSuite_hello_world.py
> +++ b/dts/tests/TestSuite_hello_world.py
> @@ -1,7 +1,8 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2010-2014 Intel Corporation
>
> -"""
> +"""The DPDK hello world app test suite.
> +
> Run the helloworld example app and verify it prints a message for each used core.
> No other EAL parameters apart from cores are used.
> """
> @@ -15,22 +16,25 @@
>
>
> class TestHelloWorld(TestSuite):
> + """DPDK hello world app test suite."""
> +
> def set_up_suite(self) -> None:
> - """
> + """Set up the test suite.
> +
> Setup:
> Build the app we're about to test - helloworld.
> """
> self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
>
> def test_hello_world_single_core(self) -> None:
> - """
> + """Single core test case.
> +
> Steps:
> Run the helloworld app on the first usable logical core.
> Verify:
> The app prints a message from the used core:
> "hello from core <core_id>"
> """
> -
> # get the first usable core
> lcore_amount = LogicalCoreCount(1, 1, 1)
> lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
> @@ -44,14 +48,14 @@ def test_hello_world_single_core(self) -> None:
> )
>
> def test_hello_world_all_cores(self) -> None:
> - """
> + """All cores test case.
> +
> Steps:
> Run the helloworld app on all usable logical cores.
> Verify:
> The app prints a message from all used cores:
> "hello from core <core_id>"
> """
> -
> # get the maximum logical core number
> eal_para = self.sut_node.create_eal_parameters(
> lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
> diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
> index bf6b93deb5..e0c5239612 100644
> --- a/dts/tests/TestSuite_os_udp.py
> +++ b/dts/tests/TestSuite_os_udp.py
> @@ -1,7 +1,8 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> -"""
> +"""Basic IPv4 OS routing test suite.
> +
> Configure SUT node to route traffic from if1 to if2.
> Send a packet to the SUT node, verify it comes back on the second port on the TG node.
> """
> @@ -13,24 +14,27 @@
>
>
> class TestOSUdp(TestSuite):
> + """IPv4 UDP OS routing test suite."""
> +
> def set_up_suite(self) -> None:
> - """
> + """Set up the test suite.
> +
> Setup:
> - Configure SUT ports and SUT to route traffic from if1 to if2.
> + Bind the SUT ports to the OS driver, configure the ports and configure the SUT
> + to route traffic from if1 to if2.
> """
>
> - # This test uses kernel drivers
> self.sut_node.bind_ports_to_driver(for_dpdk=False)
> self.configure_testbed_ipv4()
>
> def test_os_udp(self) -> None:
> - """
> + """Basic UDP IPv4 traffic test case.
> +
> Steps:
> Send a UDP packet.
> Verify:
> The packet with proper addresses arrives at the other TG port.
> """
> -
> packet = Ether() / IP() / UDP()
>
> received_packets = self.send_packet_and_capture(packet)
> @@ -40,7 +44,8 @@ def test_os_udp(self) -> None:
> self.verify_packets(expected_packet, received_packets)
>
> def tear_down_suite(self) -> None:
> - """
> + """Tear down the test suite.
> +
> Teardown:
> Remove the SUT port configuration configured in setup.
> """
> diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
> index e8016d1b54..6fae099a0e 100644
> --- a/dts/tests/TestSuite_smoke_tests.py
> +++ b/dts/tests/TestSuite_smoke_tests.py
> @@ -1,6 +1,17 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 University of New Hampshire
>
> +"""Smoke test suite.
> +
> +Smoke tests are a class of tests which are used for validating a minimal set of important features.
> +These are the most important features without which (or when they're faulty) the software wouldn't
> +work properly. Thus, if any failure occurs while testing these features,
> +there isn't that much of a reason to continue testing, as the software is fundamentally broken.
> +
> +These tests don't have to include only DPDK tests, as the reason for failures could be
> +in the infrastructure (a faulty link between NICs or a misconfiguration).
> +"""
> +
> import re
>
> from framework.config import PortConfig
> @@ -11,13 +22,25 @@
>
>
> class SmokeTests(TestSuite):
> + """DPDK and infrastructure smoke test suite.
> +
> + The test cases validate the most basic DPDK functionality needed for all other test suites.
> + The infrastructure also needs to be tested, as that is also used by all other test suites.
> +
> + Attributes:
> + is_blocking: This test suite will block the execution of all other test suites
> + in the build target after it.
> + nics_in_node: The NICs present on the SUT node.
> + """
> +
> is_blocking = True
> # dicts in this list are expected to have two keys:
> # "pci_address" and "current_driver"
> nics_in_node: list[PortConfig] = []
>
> def set_up_suite(self) -> None:
> - """
> + """Set up the test suite.
> +
> Setup:
> Set the build directory path and generate a list of NICs in the SUT node.
> """
> @@ -25,7 +48,13 @@ def set_up_suite(self) -> None:
> self.nics_in_node = self.sut_node.config.ports
>
> def test_unit_tests(self) -> None:
> - """
> + """DPDK meson fast-tests unit tests.
> +
> + The DPDK unit tests are basic tests that indicate regressions and other critical failures.
> + These need to be addressed before other testing.
> +
> + The fast-tests unit tests are a subset with only the most basic tests.
> +
> Test:
> Run the fast-test unit-test suite through meson.
> """
> @@ -37,7 +66,14 @@ def test_unit_tests(self) -> None:
> )
>
> def test_driver_tests(self) -> None:
> - """
> + """DPDK meson driver-tests unit tests.
> +
This looks like a copy-paste from the previous unit test into the driver
tests. If it's intentional because both are considered unit tests, note
that the previous function is test_unit_tests and deals with fast-tests.
> +
> + The driver-tests unit tests are a subset that test only drivers. These may be run
> + with virtual devices as well.
> +
> Test:
> Run the driver-test unit-test suite through meson.
> """
> @@ -63,7 +99,10 @@ def test_driver_tests(self) -> None:
> )
>
> def test_devices_listed_in_testpmd(self) -> None:
> - """
> + """Testpmd device discovery.
> +
> + If the configured devices can't be found in testpmd, they can't be tested.
Maybe a bit nitpicky. This is more of a statement as to why the test
exists than a description of the test. Suggestion: "Tests that the
configured devices can be found in testpmd. If they aren't, the
configuration might be wrong and tests might be skipped."
> +
> Test:
> Uses testpmd driver to verify that devices have been found by testpmd.
> """
> @@ -79,7 +118,11 @@ def test_devices_listed_in_testpmd(self) -> None:
> )
>
> def test_device_bound_to_driver(self) -> None:
> - """
> + """Device driver in OS.
> +
> + The devices must be bound to the proper driver, otherwise they can't be used by DPDK
> + or the traffic generators.
Same as the previous comment: it is more of a statement as to why the
test exists than a description of the test.
> +
> Test:
> Ensure that all drivers listed in the config are bound to the correct
> driver.
* Re: [PATCH v7 21/21] dts: test suites docstring update
2023-11-16 17:36 ` Yoan Picchi
@ 2023-11-20 10:17 ` Juraj Linkeš
2023-11-20 12:50 ` Yoan Picchi
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-20 10:17 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Thu, Nov 16, 2023 at 6:36 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/15/23 13:09, Juraj Linkeš wrote:
> > Format according to the Google format and PEP257, with slight
> > deviations.
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> > dts/tests/TestSuite_hello_world.py | 16 +++++----
> > dts/tests/TestSuite_os_udp.py | 19 +++++++----
> > dts/tests/TestSuite_smoke_tests.py | 53 +++++++++++++++++++++++++++---
> > 3 files changed, 70 insertions(+), 18 deletions(-)
> >
> > diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
> > index 7e3d95c0cf..662a8f8726 100644
> > --- a/dts/tests/TestSuite_hello_world.py
> > +++ b/dts/tests/TestSuite_hello_world.py
> > @@ -1,7 +1,8 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2010-2014 Intel Corporation
> >
> > -"""
> > +"""The DPDK hello world app test suite.
> > +
> > Run the helloworld example app and verify it prints a message for each used core.
> > No other EAL parameters apart from cores are used.
> > """
> > @@ -15,22 +16,25 @@
> >
> >
> > class TestHelloWorld(TestSuite):
> > + """DPDK hello world app test suite."""
> > +
> > def set_up_suite(self) -> None:
> > - """
> > + """Set up the test suite.
> > +
> > Setup:
> > Build the app we're about to test - helloworld.
> > """
> > self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
> >
> > def test_hello_world_single_core(self) -> None:
> > - """
> > + """Single core test case.
> > +
> > Steps:
> > Run the helloworld app on the first usable logical core.
> > Verify:
> > The app prints a message from the used core:
> > "hello from core <core_id>"
> > """
> > -
> > # get the first usable core
> > lcore_amount = LogicalCoreCount(1, 1, 1)
> > lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
> > @@ -44,14 +48,14 @@ def test_hello_world_single_core(self) -> None:
> > )
> >
> > def test_hello_world_all_cores(self) -> None:
> > - """
> > + """All cores test case.
> > +
> > Steps:
> > Run the helloworld app on all usable logical cores.
> > Verify:
> > The app prints a message from all used cores:
> > "hello from core <core_id>"
> > """
> > -
> > # get the maximum logical core number
> > eal_para = self.sut_node.create_eal_parameters(
> > lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
> > diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
> > index bf6b93deb5..e0c5239612 100644
> > --- a/dts/tests/TestSuite_os_udp.py
> > +++ b/dts/tests/TestSuite_os_udp.py
> > @@ -1,7 +1,8 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >
> > -"""
> > +"""Basic IPv4 OS routing test suite.
> > +
> > Configure SUT node to route traffic from if1 to if2.
> > Send a packet to the SUT node, verify it comes back on the second port on the TG node.
> > """
> > @@ -13,24 +14,27 @@
> >
> >
> > class TestOSUdp(TestSuite):
> > + """IPv4 UDP OS routing test suite."""
> > +
> > def set_up_suite(self) -> None:
> > - """
> > + """Set up the test suite.
> > +
> > Setup:
> > - Configure SUT ports and SUT to route traffic from if1 to if2.
> > + Bind the SUT ports to the OS driver, configure the ports and configure the SUT
> > + to route traffic from if1 to if2.
> > """
> >
> > - # This test uses kernel drivers
> > self.sut_node.bind_ports_to_driver(for_dpdk=False)
> > self.configure_testbed_ipv4()
> >
> > def test_os_udp(self) -> None:
> > - """
> > + """Basic UDP IPv4 traffic test case.
> > +
> > Steps:
> > Send a UDP packet.
> > Verify:
> > The packet with proper addresses arrives at the other TG port.
> > """
> > -
> > packet = Ether() / IP() / UDP()
> >
> > received_packets = self.send_packet_and_capture(packet)
> > @@ -40,7 +44,8 @@ def test_os_udp(self) -> None:
> > self.verify_packets(expected_packet, received_packets)
> >
> > def tear_down_suite(self) -> None:
> > - """
> > + """Tear down the test suite.
> > +
> > Teardown:
> > Remove the SUT port configuration configured in setup.
> > """
> > diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
> > index e8016d1b54..6fae099a0e 100644
> > --- a/dts/tests/TestSuite_smoke_tests.py
> > +++ b/dts/tests/TestSuite_smoke_tests.py
> > @@ -1,6 +1,17 @@
> > # SPDX-License-Identifier: BSD-3-Clause
> > # Copyright(c) 2023 University of New Hampshire
> >
> > +"""Smoke test suite.
> > +
> > +Smoke tests are a class of tests which are used for validating a minimal set of important features.
> > +These are the most important features without which (or when they're faulty) the software wouldn't
> > +work properly. Thus, if any failure occurs while testing these features,
> > +there isn't that much of a reason to continue testing, as the software is fundamentally broken.
> > +
> > +These tests don't have to include only DPDK tests, as the reason for failures could be
> > +in the infrastructure (a faulty link between NICs or a misconfiguration).
> > +"""
> > +
> > import re
> >
> > from framework.config import PortConfig
> > @@ -11,13 +22,25 @@
> >
> >
> > class SmokeTests(TestSuite):
> > + """DPDK and infrastructure smoke test suite.
> > +
> > + The test cases validate the most basic DPDK functionality needed for all other test suites.
> > + The infrastructure also needs to be tested, as that is also used by all other test suites.
> > +
> > + Attributes:
> > + is_blocking: This test suite will block the execution of all other test suites
> > + in the build target after it.
> > + nics_in_node: The NICs present on the SUT node.
> > + """
> > +
> > is_blocking = True
> > # dicts in this list are expected to have two keys:
> > # "pci_address" and "current_driver"
> > nics_in_node: list[PortConfig] = []
> >
> > def set_up_suite(self) -> None:
> > - """
> > + """Set up the test suite.
> > +
> > Setup:
> > Set the build directory path and generate a list of NICs in the SUT node.
> > """
> > @@ -25,7 +48,13 @@ def set_up_suite(self) -> None:
> > self.nics_in_node = self.sut_node.config.ports
> >
> > def test_unit_tests(self) -> None:
> > - """
> > + """DPDK meson fast-tests unit tests.
> > +
> > + The DPDK unit tests are basic tests that indicate regressions and other critical failures.
> > + These need to be addressed before other testing.
> > +
> > + The fast-tests unit tests are a subset with only the most basic tests.
> > +
> > Test:
> > Run the fast-test unit-test suite through meson.
> > """
> > @@ -37,7 +66,14 @@ def test_unit_tests(self) -> None:
> > )
> >
> > def test_driver_tests(self) -> None:
> > - """
> > + """DPDK meson driver-tests unit tests.
> > +
>
> Copy paste from the previous unit test in the driver tests. If it is on
> purpose as both are considered unit tests, then the previous function is
> test_unit_tests and deal with fast-tests
>
I'm not sure what you mean. The two are separate tests (one with the
fast-test, the other with the driver-test unit test suites)
and the docstrings do capture the differences.
> > +
> > + The driver-tests unit tests are a subset that test only drivers. These may be run
> > + with virtual devices as well.
> > +
> > Test:
> > Run the driver-test unit-test suite through meson.
> > """
> > @@ -63,7 +99,10 @@ def test_driver_tests(self) -> None:
> > )
> >
> > def test_devices_listed_in_testpmd(self) -> None:
> > - """
> > + """Testpmd device discovery.
> > +
> > + If the configured devices can't be found in testpmd, they can't be tested.
>
> Maybe a bit nitpicky. This is more of a statement as to why the test
> exist than a description of the test. Suggestion: "Tests that the
> configured devices can be found in testpmd. If they aren't, the
> configuration might be wrong and tests might be skipped"
>
This is more of a reason for why this particular test is a smoke test.
Since a smoke test failure results in all test suites being blocked,
this seemed like key information.
We also don't have an exact format of what should be included in a
test case/suite documentation. We should use this opportunity to
document what we deem important in these test cases at this point in
time and improve the docs as we continue adding test cases. We can add
more custom sections (such as the "Setup:" and "Test:" sections,
which can be added to Sphinx); I like the idea of adding a section with an
explanation of why a test is a particular type of test (in this case,
a smoke test). The regular body could contain a description as you
suggested. What do you think?
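For context, custom sections like "Setup:" and "Test:" can be registered with Sphinx's napoleon extension. A minimal conf.py sketch (an assumption for illustration, not part of this patch series; the exact section names used by DTS may differ):

```python
# Sketch of a Sphinx conf.py fragment showing how custom docstring
# sections could be registered with the napoleon extension so they
# render in the generated API docs.

extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.napoleon",
]

# Each entry adds a section header that napoleon will recognize in
# Google-style docstrings; a plain string renders the section verbatim.
napoleon_custom_sections = [
    "Setup",
    "Steps",
    "Verify",
    "Test",
    "Teardown",
]
```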
> > +
> > Test:
> > Uses testpmd driver to verify that devices have been found by testpmd.
> > """
> > @@ -79,7 +118,11 @@ def test_devices_listed_in_testpmd(self) -> None:
> > )
> >
> > def test_device_bound_to_driver(self) -> None:
> > - """
> > + """Device driver in OS.
> > +
> > + The devices must be bound to the proper driver, otherwise they can't be used by DPDK
> > + or the traffic generators.
>
> Same as the previous comment. It is more of a statement as to why the
> test exist than a description of the test
>
Ack.
> > +
> > Test:
> > Ensure that all drivers listed in the config are bound to the correct
> > driver.
* Re: [PATCH v7 21/21] dts: test suites docstring update
2023-11-20 10:17 ` Juraj Linkeš
@ 2023-11-20 12:50 ` Yoan Picchi
2023-11-22 13:40 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-11-20 12:50 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On 11/20/23 10:17, Juraj Linkeš wrote:
> On Thu, Nov 16, 2023 at 6:36 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>>
>> On 11/15/23 13:09, Juraj Linkeš wrote:
>>> Format according to the Google format and PEP257, with slight
>>> deviations.
>>>
>>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>>> ---
>>> dts/tests/TestSuite_hello_world.py | 16 +++++----
>>> dts/tests/TestSuite_os_udp.py | 19 +++++++----
>>> dts/tests/TestSuite_smoke_tests.py | 53 +++++++++++++++++++++++++++---
>>> 3 files changed, 70 insertions(+), 18 deletions(-)
>>>
>>> diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
>>> index 7e3d95c0cf..662a8f8726 100644
>>> --- a/dts/tests/TestSuite_hello_world.py
>>> +++ b/dts/tests/TestSuite_hello_world.py
>>> @@ -1,7 +1,8 @@
>>> # SPDX-License-Identifier: BSD-3-Clause
>>> # Copyright(c) 2010-2014 Intel Corporation
>>>
>>> -"""
>>> +"""The DPDK hello world app test suite.
>>> +
>>> Run the helloworld example app and verify it prints a message for each used core.
>>> No other EAL parameters apart from cores are used.
>>> """
>>> @@ -15,22 +16,25 @@
>>>
>>>
>>> class TestHelloWorld(TestSuite):
>>> + """DPDK hello world app test suite."""
>>> +
>>> def set_up_suite(self) -> None:
>>> - """
>>> + """Set up the test suite.
>>> +
>>> Setup:
>>> Build the app we're about to test - helloworld.
>>> """
>>> self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
>>>
>>> def test_hello_world_single_core(self) -> None:
>>> - """
>>> + """Single core test case.
>>> +
>>> Steps:
>>> Run the helloworld app on the first usable logical core.
>>> Verify:
>>> The app prints a message from the used core:
>>> "hello from core <core_id>"
>>> """
>>> -
>>> # get the first usable core
>>> lcore_amount = LogicalCoreCount(1, 1, 1)
>>> lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
>>> @@ -44,14 +48,14 @@ def test_hello_world_single_core(self) -> None:
>>> )
>>>
>>> def test_hello_world_all_cores(self) -> None:
>>> - """
>>> + """All cores test case.
>>> +
>>> Steps:
>>> Run the helloworld app on all usable logical cores.
>>> Verify:
>>> The app prints a message from all used cores:
>>> "hello from core <core_id>"
>>> """
>>> -
>>> # get the maximum logical core number
>>> eal_para = self.sut_node.create_eal_parameters(
>>> lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
>>> diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
>>> index bf6b93deb5..e0c5239612 100644
>>> --- a/dts/tests/TestSuite_os_udp.py
>>> +++ b/dts/tests/TestSuite_os_udp.py
>>> @@ -1,7 +1,8 @@
>>> # SPDX-License-Identifier: BSD-3-Clause
>>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>>>
>>> -"""
>>> +"""Basic IPv4 OS routing test suite.
>>> +
>>> Configure SUT node to route traffic from if1 to if2.
>>> Send a packet to the SUT node, verify it comes back on the second port on the TG node.
>>> """
>>> @@ -13,24 +14,27 @@
>>>
>>>
>>> class TestOSUdp(TestSuite):
>>> + """IPv4 UDP OS routing test suite."""
>>> +
>>> def set_up_suite(self) -> None:
>>> - """
>>> + """Set up the test suite.
>>> +
>>> Setup:
>>> - Configure SUT ports and SUT to route traffic from if1 to if2.
>>> + Bind the SUT ports to the OS driver, configure the ports and configure the SUT
>>> + to route traffic from if1 to if2.
>>> """
>>>
>>> - # This test uses kernel drivers
>>> self.sut_node.bind_ports_to_driver(for_dpdk=False)
>>> self.configure_testbed_ipv4()
>>>
>>> def test_os_udp(self) -> None:
>>> - """
>>> + """Basic UDP IPv4 traffic test case.
>>> +
>>> Steps:
>>> Send a UDP packet.
>>> Verify:
>>> The packet with proper addresses arrives at the other TG port.
>>> """
>>> -
>>> packet = Ether() / IP() / UDP()
>>>
>>> received_packets = self.send_packet_and_capture(packet)
>>> @@ -40,7 +44,8 @@ def test_os_udp(self) -> None:
>>> self.verify_packets(expected_packet, received_packets)
>>>
>>> def tear_down_suite(self) -> None:
>>> - """
>>> + """Tear down the test suite.
>>> +
>>> Teardown:
>>> Remove the SUT port configuration configured in setup.
>>> """
>>> diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
>>> index e8016d1b54..6fae099a0e 100644
>>> --- a/dts/tests/TestSuite_smoke_tests.py
>>> +++ b/dts/tests/TestSuite_smoke_tests.py
>>> @@ -1,6 +1,17 @@
>>> # SPDX-License-Identifier: BSD-3-Clause
>>> # Copyright(c) 2023 University of New Hampshire
>>>
>>> +"""Smoke test suite.
>>> +
>>> +Smoke tests are a class of tests which are used for validating a minimal set of important features.
>>> +These are the most important features without which (or when they're faulty) the software wouldn't
>>> +work properly. Thus, if any failure occurs while testing these features,
>>> +there isn't that much of a reason to continue testing, as the software is fundamentally broken.
>>> +
>>> +These tests don't have to include only DPDK tests, as the reason for failures could be
>>> +in the infrastructure (a faulty link between NICs or a misconfiguration).
>>> +"""
>>> +
>>> import re
>>>
>>> from framework.config import PortConfig
>>> @@ -11,13 +22,25 @@
>>>
>>>
>>> class SmokeTests(TestSuite):
>>> + """DPDK and infrastructure smoke test suite.
>>> +
>>> + The test cases validate the most basic DPDK functionality needed for all other test suites.
>>> + The infrastructure also needs to be tested, as that is also used by all other test suites.
>>> +
>>> + Attributes:
>>> + is_blocking: This test suite will block the execution of all other test suites
>>> + in the build target after it.
>>> + nics_in_node: The NICs present on the SUT node.
>>> + """
>>> +
>>> is_blocking = True
>>> # dicts in this list are expected to have two keys:
>>> # "pci_address" and "current_driver"
>>> nics_in_node: list[PortConfig] = []
>>>
>>> def set_up_suite(self) -> None:
>>> - """
>>> + """Set up the test suite.
>>> +
>>> Setup:
>>> Set the build directory path and generate a list of NICs in the SUT node.
>>> """
>>> @@ -25,7 +48,13 @@ def set_up_suite(self) -> None:
>>> self.nics_in_node = self.sut_node.config.ports
>>>
>>> def test_unit_tests(self) -> None:
>>> - """
>>> + """DPDK meson fast-tests unit tests.
>>> +
>>> + The DPDK unit tests are basic tests that indicate regressions and other critical failures.
>>> + These need to be addressed before other testing.
>>> +
>>> + The fast-tests unit tests are a subset with only the most basic tests.
>>> +
>>> Test:
>>> Run the fast-test unit-test suite through meson.
>>> """
>>> @@ -37,7 +66,14 @@ def test_unit_tests(self) -> None:
>>> )
>>>
>>> def test_driver_tests(self) -> None:
>>> - """
>>> + """DPDK meson driver-tests unit tests.
>>> +
>>
>> Copy paste from the previous unit test in the driver tests. If it is on
>> purpose as both are considered unit tests, then the previous function is
>> test_unit_tests and deal with fast-tests
>>
>
> I'm not sure what you mean. The two are separate tests (one with the
> fast-test, the other one with the driver-test unit test test suites)
> and the docstring do capture the differences.
I am a little bit confused as to how I deleted it in my reply, but I was
referring to this sentence in the patch:
"The DPDK unit tests are basic tests that indicate regressions and other
critical failures.
These need to be addressed before other testing."
But in any case, reading it again, I agree with you.
>
>>> +
>>> + The driver-tests unit tests are a subset that test only drivers. These may be run
>>> + with virtual devices as well.
>>> +
>>> Test:
>>> Run the driver-test unit-test suite through meson.
>>> """
>>> @@ -63,7 +99,10 @@ def test_driver_tests(self) -> None:
>>> )
>>>
>>> def test_devices_listed_in_testpmd(self) -> None:
>>> - """
>>> + """Testpmd device discovery.
>>> +
>>> + If the configured devices can't be found in testpmd, they can't be tested.
>>
>> Maybe a bit nitpicky. This is more of a statement as to why the test
>> exist than a description of the test. Suggestion: "Tests that the
>> configured devices can be found in testpmd. If they aren't, the
>> configuration might be wrong and tests might be skipped"
>>
>
> This is more of a reason for why this particular test is a smoke test.
> Since a smoke test failure results in all test suites being blocked,
> this seemed like key information.
>
> We also don't have an exact format of what should be included in a
> test case/suite documentation. We should use this opportunity to
> document what we deem important in these test cases at this point in
> time and improve the docs as we continue adding test cases. We can add
> more custom sections (such as the "Setup:" and" "Test:" sections,
> which can be added to Sphinx); I like adding a section with
> explanation for why a test is a particular type of test (in this case,
> a smoke test). The regular body could contain a description as you
> suggested. What do you think?
>
I'm not really sure what way to go here. The thing I noticed here was
mainly the lack of consistency between this test's description and the
previous one. I agree that making it clear it's a smoke test is good,
but compare it to test_device_bound_to_driver's description for
instance. Both state clearly that they are smoke tests, but the
formulation is quite different.
I'm not entirely sure about adding more custom sections. I fear it might
be more hassle than it's worth. A short guideline on how to write the
doc and what section to use could be handy though.
Reading the previous test, I think I see what you mean by having a
section to describe the test type and another for the description.
In short, the type would be: DPDK unit tests, testing critical failures,
needing to run first, followed by the test's description.
But I think this type is redundant with the test suite's description? If
so, only a description would be needed.
>>> +
>>> Test:
>>> Uses testpmd driver to verify that devices have been found by testpmd.
>>> """
>>> @@ -79,7 +118,11 @@ def test_devices_listed_in_testpmd(self) -> None:
>>> )
>>>
>>> def test_device_bound_to_driver(self) -> None:
>>> - """
>>> + """Device driver in OS.
>>> +
>>> + The devices must be bound to the proper driver, otherwise they can't be used by DPDK
>>> + or the traffic generators.
>>
>> Same as the previous comment. It is more of a statement as to why the
>> test exist than a description of the test
>>
>
> Ack.
>
>>> +
>>> Test:
>>> Ensure that all drivers listed in the config are bound to the correct
>>> driver.
* Re: [PATCH v7 21/21] dts: test suites docstring update
2023-11-20 12:50 ` Yoan Picchi
@ 2023-11-22 13:40 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-22 13:40 UTC (permalink / raw)
To: Yoan Picchi
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek, dev
On Mon, Nov 20, 2023 at 1:50 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 11/20/23 10:17, Juraj Linkeš wrote:
> > On Thu, Nov 16, 2023 at 6:36 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
> >>
> >> On 11/15/23 13:09, Juraj Linkeš wrote:
> >>> Format according to the Google format and PEP257, with slight
> >>> deviations.
> >>>
> >>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> >>> ---
> >>> dts/tests/TestSuite_hello_world.py | 16 +++++----
> >>> dts/tests/TestSuite_os_udp.py | 19 +++++++----
> >>> dts/tests/TestSuite_smoke_tests.py | 53 +++++++++++++++++++++++++++---
> >>> 3 files changed, 70 insertions(+), 18 deletions(-)
> >>>
> >>> diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
> >>> index 7e3d95c0cf..662a8f8726 100644
> >>> --- a/dts/tests/TestSuite_hello_world.py
> >>> +++ b/dts/tests/TestSuite_hello_world.py
> >>> @@ -1,7 +1,8 @@
> >>> # SPDX-License-Identifier: BSD-3-Clause
> >>> # Copyright(c) 2010-2014 Intel Corporation
> >>>
> >>> -"""
> >>> +"""The DPDK hello world app test suite.
> >>> +
> >>> Run the helloworld example app and verify it prints a message for each used core.
> >>> No other EAL parameters apart from cores are used.
> >>> """
> >>> @@ -15,22 +16,25 @@
> >>>
> >>>
> >>> class TestHelloWorld(TestSuite):
> >>> + """DPDK hello world app test suite."""
> >>> +
> >>> def set_up_suite(self) -> None:
> >>> - """
> >>> + """Set up the test suite.
> >>> +
> >>> Setup:
> >>> Build the app we're about to test - helloworld.
> >>> """
> >>> self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
> >>>
> >>> def test_hello_world_single_core(self) -> None:
> >>> - """
> >>> + """Single core test case.
> >>> +
> >>> Steps:
> >>> Run the helloworld app on the first usable logical core.
> >>> Verify:
> >>> The app prints a message from the used core:
> >>> "hello from core <core_id>"
> >>> """
> >>> -
> >>> # get the first usable core
> >>> lcore_amount = LogicalCoreCount(1, 1, 1)
> >>> lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
> >>> @@ -44,14 +48,14 @@ def test_hello_world_single_core(self) -> None:
> >>> )
> >>>
> >>> def test_hello_world_all_cores(self) -> None:
> >>> - """
> >>> + """All cores test case.
> >>> +
> >>> Steps:
> >>> Run the helloworld app on all usable logical cores.
> >>> Verify:
> >>> The app prints a message from all used cores:
> >>> "hello from core <core_id>"
> >>> """
> >>> -
> >>> # get the maximum logical core number
> >>> eal_para = self.sut_node.create_eal_parameters(
> >>> lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
> >>> diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
> >>> index bf6b93deb5..e0c5239612 100644
> >>> --- a/dts/tests/TestSuite_os_udp.py
> >>> +++ b/dts/tests/TestSuite_os_udp.py
> >>> @@ -1,7 +1,8 @@
> >>> # SPDX-License-Identifier: BSD-3-Clause
> >>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> >>>
> >>> -"""
> >>> +"""Basic IPv4 OS routing test suite.
> >>> +
> >>> Configure SUT node to route traffic from if1 to if2.
> >>> Send a packet to the SUT node, verify it comes back on the second port on the TG node.
> >>> """
> >>> @@ -13,24 +14,27 @@
> >>>
> >>>
> >>> class TestOSUdp(TestSuite):
> >>> + """IPv4 UDP OS routing test suite."""
> >>> +
> >>> def set_up_suite(self) -> None:
> >>> - """
> >>> + """Set up the test suite.
> >>> +
> >>> Setup:
> >>> - Configure SUT ports and SUT to route traffic from if1 to if2.
> >>> + Bind the SUT ports to the OS driver, configure the ports and configure the SUT
> >>> + to route traffic from if1 to if2.
> >>> """
> >>>
> >>> - # This test uses kernel drivers
> >>> self.sut_node.bind_ports_to_driver(for_dpdk=False)
> >>> self.configure_testbed_ipv4()
> >>>
> >>> def test_os_udp(self) -> None:
> >>> - """
> >>> + """Basic UDP IPv4 traffic test case.
> >>> +
> >>> Steps:
> >>> Send a UDP packet.
> >>> Verify:
> >>> The packet with proper addresses arrives at the other TG port.
> >>> """
> >>> -
> >>> packet = Ether() / IP() / UDP()
> >>>
> >>> received_packets = self.send_packet_and_capture(packet)
> >>> @@ -40,7 +44,8 @@ def test_os_udp(self) -> None:
> >>> self.verify_packets(expected_packet, received_packets)
> >>>
> >>> def tear_down_suite(self) -> None:
> >>> - """
> >>> + """Tear down the test suite.
> >>> +
> >>> Teardown:
> >>> Remove the SUT port configuration configured in setup.
> >>> """
> >>> diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
> >>> index e8016d1b54..6fae099a0e 100644
> >>> --- a/dts/tests/TestSuite_smoke_tests.py
> >>> +++ b/dts/tests/TestSuite_smoke_tests.py
> >>> @@ -1,6 +1,17 @@
> >>> # SPDX-License-Identifier: BSD-3-Clause
> >>> # Copyright(c) 2023 University of New Hampshire
> >>>
> >>> +"""Smoke test suite.
> >>> +
> >>> +Smoke tests are a class of tests which are used for validating a minimal set of important features.
> >>> +These are the most important features without which (or when they're faulty) the software wouldn't
> >>> +work properly. Thus, if any failure occurs while testing these features,
> >>> +there isn't that much of a reason to continue testing, as the software is fundamentally broken.
> >>> +
> >>> +These tests don't have to include only DPDK tests, as the reason for failures could be
> >>> +in the infrastructure (a faulty link between NICs or a misconfiguration).
> >>> +"""
> >>> +
> >>> import re
> >>>
> >>> from framework.config import PortConfig
> >>> @@ -11,13 +22,25 @@
> >>>
> >>>
> >>> class SmokeTests(TestSuite):
> >>> + """DPDK and infrastructure smoke test suite.
> >>> +
> >>> + The test cases validate the most basic DPDK functionality needed for all other test suites.
> >>> + The infrastructure also needs to be tested, as that is also used by all other test suites.
> >>> +
> >>> + Attributes:
> >>> + is_blocking: This test suite will block the execution of all other test suites
> >>> + in the build target after it.
> >>> + nics_in_node: The NICs present on the SUT node.
> >>> + """
> >>> +
> >>> is_blocking = True
> >>> # dicts in this list are expected to have two keys:
> >>> # "pci_address" and "current_driver"
> >>> nics_in_node: list[PortConfig] = []
> >>>
> >>> def set_up_suite(self) -> None:
> >>> - """
> >>> + """Set up the test suite.
> >>> +
> >>> Setup:
> >>> Set the build directory path and generate a list of NICs in the SUT node.
> >>> """
> >>> @@ -25,7 +48,13 @@ def set_up_suite(self) -> None:
> >>> self.nics_in_node = self.sut_node.config.ports
> >>>
> >>> def test_unit_tests(self) -> None:
> >>> - """
> >>> + """DPDK meson fast-tests unit tests.
> >>> +
> >>> + The DPDK unit tests are basic tests that indicate regressions and other critical failures.
> >>> + These need to be addressed before other testing.
> >>> +
> >>> + The fast-tests unit tests are a subset with only the most basic tests.
> >>> +
> >>> Test:
> >>> Run the fast-test unit-test suite through meson.
> >>> """
> >>> @@ -37,7 +66,14 @@ def test_unit_tests(self) -> None:
> >>> )
> >>>
> >>> def test_driver_tests(self) -> None:
> >>> - """
> >>> + """DPDK meson driver-tests unit tests.
> >>> +
> >>
> >> Copy paste from the previous unit test in the driver tests. If it is on
> >> purpose as both are considered unit tests, then the previous function is
> >> test_unit_tests and deal with fast-tests
> >>
> >
> > I'm not sure what you mean. The two are separate tests (one with the
> > fast-test, the other one with the driver-test unit test test suites)
> > and the docstring do capture the differences.
>
> I am a little bit confused as to how I deleted it in my reply, but I was
> referencing to this sentence in the patch:
> "The DPDK unit tests are basic tests that indicate regressions and other
> critical failures.
> These need to be addressed before other testing."
> But in any case, reading it again, I agree with you.
>
> >
> >>> +
> >>> + The driver-tests unit tests are a subset that test only drivers. These may be run
> >>> + with virtual devices as well.
> >>> +
> >>> Test:
> >>> Run the driver-test unit-test suite through meson.
> >>> """
> >>> @@ -63,7 +99,10 @@ def test_driver_tests(self) -> None:
> >>> )
> >>>
> >>> def test_devices_listed_in_testpmd(self) -> None:
> >>> - """
> >>> + """Testpmd device discovery.
> >>> +
> >>> + If the configured devices can't be found in testpmd, they can't be tested.
> >>
> >> Maybe a bit nitpicky. This is more of a statement as to why the test
> >> exist than a description of the test. Suggestion: "Tests that the
> >> configured devices can be found in testpmd. If they aren't, the
> >> configuration might be wrong and tests might be skipped"
> >>
> >
> > This is more of a reason for why this particular test is a smoke test.
> > Since a smoke test failure results in all test suites being blocked,
> > this seemed like key information.
> >
> > We also don't have an exact format of what should be included in a
> > test case/suite documentation. We should use this opportunity to
> > document what we deem important in these test cases at this point in
> > time and improve the docs as we continue adding test cases. We can add
> > more custom sections (such as the "Setup:" and" "Test:" sections,
> > which can be added to Sphinx); I like adding a section with
> > explanation for why a test is a particular type of test (in this case,
> > a smoke test). The regular body could contain a description as you
> > suggested. What do you think?
> >
>
> I'm not really sure what way to go here. The thing I noticed here was
> mainly the lack of consistency between this test's description and the
> previous one. I agree that making it clear it's a smoke test is good,
> but compare it to test_device_bound_to_driver's description for
> instance. Both states clearly that they are a smoke test, but the
> formulation is quite different.
>
> I'm not entirely sure about adding more custom sections. I fear it might
> be more hassle than it's worth. A short guideline on how to write the
> doc and what section to use could be handy though.
>
> Reading the previous test, I think I see what you mean by having a
> section to describe the test type and another for the description.
> In short the type is: DPDK unit tests, test critical failures, needs to
> run first
> and then followed by the test's description
> But I think this type is redundant with the test suite's description? If
> so, only a description would be needed.
>
Ok, let's not complicate this any further. I thought a bit more about
this and I've also come to the conclusion that the test suite
description is enough (because of the redundancy). I'll just try to
make the test case short descriptions more consistent.
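A hypothetical sketch of what more consistent short descriptions could look like: each test case opens with a noun phrase naming what is verified, and the reasoning for why the check belongs in a smoke test lives in the suite docstring instead (the wording below is an assumption, not the final patch):

```python
# Illustrative only: consistent one-line summaries across test cases,
# with the "why is this a smoke test" rationale kept out of the bodies.

class SmokeTests:
    """DPDK and infrastructure smoke test suite."""

    def test_devices_listed_in_testpmd(self) -> None:
        """Testpmd device discovery.

        Test:
            Verify that the configured devices are found by testpmd.
        """

    def test_device_bound_to_driver(self) -> None:
        """Device driver in OS.

        Test:
            Verify that the configured devices are bound to the proper driver.
        """
```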
> >>> +
> >>> Test:
> >>> Uses testpmd driver to verify that devices have been found by testpmd.
> >>> """
> >>> @@ -79,7 +118,11 @@ def test_devices_listed_in_testpmd(self) -> None:
> >>> )
> >>>
> >>> def test_device_bound_to_driver(self) -> None:
> >>> - """
> >>> + """Device driver in OS.
> >>> +
> >>> + The devices must be bound to the proper driver, otherwise they can't be used by DPDK
> >>> + or the traffic generators.
> >>
> >> Same as the previous comment. It is more of a statement as to why the
> >> test exist than a description of the test
> >>
> >
> > Ack.
> >
> >>> +
> >>> Test:
> >>> Ensure that all drivers listed in the config are bound to the correct
> >>> driver.
>
* [PATCH v8 00/21] dts: docstrings update
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
` (20 preceding siblings ...)
2023-11-15 13:09 ` [PATCH v7 21/21] dts: test suites " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 01/21] dts: code adjustments for doc generation Juraj Linkeš
` (22 more replies)
21 siblings, 23 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The first commit makes changes to the code. These code changes mainly
change the structure of the code so that the actual API docs generation
works. There are also some code changes which get reflected in the
documentation, such as making functions/methods/attributes private or
public.
The rest of the commits deal with the actual docstring documentation
(from which the API docs are generated). The format of the docstrings
is the Google format [0] with PEP257 [1] and some guidelines captured
in the last commit of this group covering what the Google format
doesn't.
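For illustration, a minimal sketch of the Google docstring format with PEP257 conventions that the series adopts (the function and its parameters are hypothetical, not taken from the DTS code):

```python
def scale_frequency(hertz: float, factor: int = 2) -> float:
    """Scale a frequency value.

    A hypothetical function illustrating the Google docstring format:
    a one-line summary, a blank line, then the structured sections.

    Args:
        hertz: The input frequency in Hz.
        factor: The multiplier to apply.

    Returns:
        The scaled frequency in Hz.

    Raises:
        ValueError: If the frequency is negative.
    """
    if hertz < 0:
        raise ValueError("frequency must be non-negative")
    return hertz * factor
```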
The docstring updates are split into many commits to make review
possible. When accepted, they may be squashed.
The docstrings have been composed in anticipation of [2], adhering to a
maximum line length of 100. We don't have a tool for automatic docstring
formatting, hence the usage of 100 right away to save time.
NOTE: The logger.py module is not fully documented, as it's being
refactored and the refactor will be submitted in the near future.
Documenting it now seems unnecessary.
[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
[1] https://peps.python.org/pep-0257/
[2] https://patches.dpdk.org/project/dpdk/list/?series=29844
v7:
Split the series into docstrings and api docs generation and addressed
comments.
v8:
Addressed review comments, all of which were pretty minor: small
grammatical changes, a little bit of rewording to remove confusion here
and there, additional explanations and so on.
Juraj Linkeš (21):
dts: code adjustments for doc generation
dts: add docstring checker
dts: add basic developer docs
dts: exceptions docstring update
dts: settings docstring update
dts: logger and utils docstring update
dts: dts runner and main docstring update
dts: test suite docstring update
dts: test result docstring update
dts: config docstring update
dts: remote session docstring update
dts: interactive remote session docstring update
dts: port and virtual device docstring update
dts: cpu docstring update
dts: os session docstring update
dts: posix and linux sessions docstring update
dts: node docstring update
dts: sut and tg nodes docstring update
dts: base traffic generators docstring update
dts: scapy tg docstring update
dts: test suites docstring update
doc/guides/tools/dts.rst | 73 +++
dts/framework/__init__.py | 12 +-
dts/framework/config/__init__.py | 375 +++++++++++++---
dts/framework/config/types.py | 132 ++++++
dts/framework/dts.py | 162 +++++--
dts/framework/exception.py | 156 ++++---
dts/framework/logger.py | 72 ++-
dts/framework/remote_session/__init__.py | 80 ++--
.../interactive_remote_session.py | 36 +-
.../remote_session/interactive_shell.py | 150 +++++++
dts/framework/remote_session/os_session.py | 284 ------------
dts/framework/remote_session/python_shell.py | 32 ++
.../remote_session/remote/__init__.py | 27 --
.../remote/interactive_shell.py | 131 ------
.../remote_session/remote/python_shell.py | 12 -
.../remote_session/remote/remote_session.py | 168 -------
.../remote_session/remote/testpmd_shell.py | 45 --
.../remote_session/remote_session.py | 230 ++++++++++
.../{remote => }/ssh_session.py | 28 +-
dts/framework/remote_session/testpmd_shell.py | 83 ++++
dts/framework/settings.py | 188 ++++++--
dts/framework/test_result.py | 301 ++++++++++---
dts/framework/test_suite.py | 236 +++++++---
dts/framework/testbed_model/__init__.py | 29 +-
dts/framework/testbed_model/{hw => }/cpu.py | 209 ++++++---
dts/framework/testbed_model/hw/__init__.py | 27 --
dts/framework/testbed_model/hw/port.py | 60 ---
.../testbed_model/hw/virtual_device.py | 16 -
.../linux_session.py | 70 ++-
dts/framework/testbed_model/node.py | 214 ++++++---
dts/framework/testbed_model/os_session.py | 422 ++++++++++++++++++
dts/framework/testbed_model/port.py | 93 ++++
.../posix_session.py | 85 +++-
dts/framework/testbed_model/sut_node.py | 238 ++++++----
dts/framework/testbed_model/tg_node.py | 69 ++-
.../testbed_model/traffic_generator.py | 72 ---
.../traffic_generator/__init__.py | 43 ++
.../capturing_traffic_generator.py | 49 +-
.../{ => traffic_generator}/scapy.py | 110 +++--
.../traffic_generator/traffic_generator.py | 85 ++++
dts/framework/testbed_model/virtual_device.py | 29 ++
dts/framework/utils.py | 122 ++---
dts/main.py | 19 +-
dts/poetry.lock | 12 +-
dts/pyproject.toml | 6 +-
dts/tests/TestSuite_hello_world.py | 16 +-
dts/tests/TestSuite_os_udp.py | 20 +-
dts/tests/TestSuite_smoke_tests.py | 61 ++-
48 files changed, 3506 insertions(+), 1683 deletions(-)
create mode 100644 dts/framework/config/types.py
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (76%)
create mode 100644 dts/framework/remote_session/interactive_shell.py
delete mode 100644 dts/framework/remote_session/os_session.py
create mode 100644 dts/framework/remote_session/python_shell.py
delete mode 100644 dts/framework/remote_session/remote/__init__.py
delete mode 100644 dts/framework/remote_session/remote/interactive_shell.py
delete mode 100644 dts/framework/remote_session/remote/python_shell.py
delete mode 100644 dts/framework/remote_session/remote/remote_session.py
delete mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
create mode 100644 dts/framework/remote_session/remote_session.py
rename dts/framework/remote_session/{remote => }/ssh_session.py (82%)
create mode 100644 dts/framework/remote_session/testpmd_shell.py
rename dts/framework/testbed_model/{hw => }/cpu.py (50%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
delete mode 100644 dts/framework/testbed_model/hw/port.py
delete mode 100644 dts/framework/testbed_model/hw/virtual_device.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (77%)
create mode 100644 dts/framework/testbed_model/os_session.py
create mode 100644 dts/framework/testbed_model/port.py
rename dts/framework/{remote_session => testbed_model}/posix_session.py (73%)
delete mode 100644 dts/framework/testbed_model/traffic_generator.py
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (68%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (71%)
create mode 100644 dts/framework/testbed_model/traffic_generator/traffic_generator.py
create mode 100644 dts/framework/testbed_model/virtual_device.py
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v8 01/21] dts: code adjustments for doc generation
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 02/21] dts: add docstring checker Juraj Linkeš
` (21 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The standard Python tool for generating API documentation, Sphinx,
imports modules one-by-one when generating the documentation. This
requires code changes:
* properly guarding argument parsing in the if __name__ == '__main__'
block,
* the logger used by DTS runner underwent the same treatment so that it
doesn't create log files outside of a DTS run,
* however, DTS uses the arguments to construct an object holding global
variables. The defaults for the global variables needed to be moved
from argument parsing elsewhere,
* importing the remote_session module from framework resulted in
circular imports because of one module trying to import another
module. This is fixed by reorganizing the code,
* some code reorganization was done because the resulting structure
makes more sense, improving documentation clarity.
There are some other changes which are documentation-related:
* added missing type annotations so they appear in the generated docs,
* reordered arguments in some methods,
* removed superfluous arguments and attributes,
* changed some functions/methods/attributes from public to private and vice-versa.
All of the above appear in the generated documentation, improving its
quality.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 8 +-
dts/framework/dts.py | 31 +++++--
dts/framework/exception.py | 54 +++++-------
dts/framework/remote_session/__init__.py | 41 +++++----
.../interactive_remote_session.py | 0
.../{remote => }/interactive_shell.py | 0
.../{remote => }/python_shell.py | 0
.../remote_session/remote/__init__.py | 27 ------
.../{remote => }/remote_session.py | 0
.../{remote => }/ssh_session.py | 12 +--
.../{remote => }/testpmd_shell.py | 0
dts/framework/settings.py | 85 +++++++++++--------
dts/framework/test_result.py | 4 +-
dts/framework/test_suite.py | 7 +-
dts/framework/testbed_model/__init__.py | 12 +--
dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
dts/framework/testbed_model/hw/__init__.py | 27 ------
.../linux_session.py | 6 +-
dts/framework/testbed_model/node.py | 23 +++--
.../os_session.py | 22 ++---
dts/framework/testbed_model/{hw => }/port.py | 0
.../posix_session.py | 4 +-
dts/framework/testbed_model/sut_node.py | 8 +-
dts/framework/testbed_model/tg_node.py | 29 +------
.../traffic_generator/__init__.py | 23 +++++
.../capturing_traffic_generator.py | 4 +-
.../{ => traffic_generator}/scapy.py | 19 ++---
.../traffic_generator.py | 14 ++-
.../testbed_model/{hw => }/virtual_device.py | 0
dts/framework/utils.py | 40 +++------
dts/main.py | 9 +-
31 files changed, 244 insertions(+), 278 deletions(-)
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
delete mode 100644 dts/framework/remote_session/remote/__init__.py
rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
rename dts/framework/testbed_model/{hw => }/port.py (100%)
rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (98%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (81%)
rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 9b32cf0532..ef25a463c0 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -17,6 +17,7 @@
import warlock # type: ignore[import]
import yaml
+from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict):
+ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
# This looks useless now, but is designed to allow expansion to traffic
# generators that require more configuration later.
match TrafficGeneratorType(d["type"]):
@@ -97,6 +98,8 @@ def from_dict(d: dict):
return ScapyTrafficGeneratorConfig(
traffic_generator_type=TrafficGeneratorType.SCAPY
)
+ case _:
+ raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
@dataclass(slots=True, frozen=True)
@@ -314,6 +317,3 @@ def load_config() -> Configuration:
config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
config_obj: Configuration = Configuration.from_dict(dict(config))
return config_obj
-
-
-CONFIGURATION = load_config()
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 25d6942d81..356368ef10 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -6,19 +6,19 @@
import sys
from .config import (
- CONFIGURATION,
BuildTargetConfiguration,
ExecutionConfiguration,
TestSuiteConfig,
+ load_config,
)
from .exception import BlockingTestSuiteError
from .logger import DTSLOG, getLogger
from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
from .test_suite import get_test_suites
from .testbed_model import SutNode, TGNode
-from .utils import check_dts_python_version
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG = None # type: ignore[assignment]
result: DTSResult = DTSResult(dts_logger)
@@ -30,14 +30,18 @@ def run_all() -> None:
global dts_logger
global result
+ # create a regular DTS logger and create a new result with it
+ dts_logger = getLogger("DTSRunner")
+ result = DTSResult(dts_logger)
+
# check the python version of the server that run dts
- check_dts_python_version()
+ _check_dts_python_version()
sut_nodes: dict[str, SutNode] = {}
tg_nodes: dict[str, TGNode] = {}
try:
# for all Execution sections
- for execution in CONFIGURATION.executions:
+ for execution in load_config().executions:
sut_node = sut_nodes.get(execution.system_under_test_node.name)
tg_node = tg_nodes.get(execution.traffic_generator_node.name)
@@ -82,6 +86,23 @@ def run_all() -> None:
_exit_dts()
+def _check_dts_python_version() -> None:
+ def RED(text: str) -> str:
+ return f"\u001B[31;1m{str(text)}\u001B[0m"
+
+ if sys.version_info.major < 3 or (sys.version_info.major == 3 and sys.version_info.minor < 10):
+ print(
+ RED(
+ (
+ "WARNING: DTS execution node's python version is lower than"
+ "python 3.10, is deprecated and will not work in future releases."
+ )
+ ),
+ file=sys.stderr,
+ )
+ print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+
+
def _run_execution(
sut_node: SutNode,
tg_node: TGNode,
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index b362e42924..151e4d3aa9 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
Command execution timeout.
"""
- command: str
- output: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _command: str
- def __init__(self, command: str, output: str):
- self.command = command
- self.output = output
+ def __init__(self, command: str):
+ self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self.command}"
-
- def get_output(self) -> str:
- return self.output
+ return f"TIMEOUT on {self._command}"
class SSHConnectionError(DTSError):
@@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
SSH connection error.
"""
- host: str
- errors: list[str]
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
+ _errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
- self.host = host
- self.errors = [] if errors is None else errors
+ self._host = host
+ self._errors = [] if errors is None else errors
def __str__(self) -> str:
- message = f"Error trying to connect with {self.host}."
- if self.errors:
- message += f" Errors encountered while retrying: {', '.join(self.errors)}"
+ message = f"Error trying to connect with {self._host}."
+ if self._errors:
+ message += f" Errors encountered while retrying: {', '.join(self._errors)}"
return message
@@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
It can no longer be used.
"""
- host: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
def __init__(self, host: str):
- self.host = host
+ self._host = host
def __str__(self) -> str:
- return f"SSH session with {self.host} has died"
+ return f"SSH session with {self._host} has died"
class ConfigurationError(DTSError):
@@ -107,16 +102,16 @@ class RemoteCommandExecutionError(DTSError):
Raised when a command executed on a Node returns a non-zero exit status.
"""
- command: str
- command_return_code: int
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ command: str
+ _command_return_code: int
def __init__(self, command: str, command_return_code: int):
self.command = command
- self.command_return_code = command_return_code
+ self._command_return_code = command_return_code
def __str__(self) -> str:
- return f"Command {self.command} returned a non-zero exit code: {self.command_return_code}"
+ return f"Command {self.command} returned a non-zero exit code: {self._command_return_code}"
class RemoteDirectoryExistsError(DTSError):
@@ -140,22 +135,15 @@ class TestCaseVerifyError(DTSError):
Used in test cases to verify the expected behavior.
"""
- value: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
- def __init__(self, value: str):
- self.value = value
-
- def __str__(self) -> str:
- return repr(self.value)
-
class BlockingTestSuiteError(DTSError):
- suite_name: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
+ _suite_name: str
def __init__(self, suite_name: str) -> None:
- self.suite_name = suite_name
+ self._suite_name = suite_name
def __str__(self) -> str:
- return f"Blocking suite {self.suite_name} failed."
+ return f"Blocking suite {self._suite_name} failed."
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 6124417bd7..5e7ddb2b05 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -12,27 +12,24 @@
# pylama:ignore=W0611
-from framework.config import OS, NodeConfiguration
-from framework.exception import ConfigurationError
+from framework.config import NodeConfiguration
from framework.logger import DTSLOG
-from .linux_session import LinuxSession
-from .os_session import InteractiveShellType, OSSession
-from .remote import (
- CommandResult,
- InteractiveRemoteSession,
- InteractiveShell,
- PythonShell,
- RemoteSession,
- SSHSession,
- TestPmdDevice,
- TestPmdShell,
-)
-
-
-def create_session(node_config: NodeConfiguration, name: str, logger: DTSLOG) -> OSSession:
- match node_config.os:
- case OS.linux:
- return LinuxSession(node_config, name, logger)
- case _:
- raise ConfigurationError(f"Unsupported OS {node_config.os}")
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
+from .python_shell import PythonShell
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
+
+
+def create_remote_session(
+ node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> RemoteSession:
+ return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+ node_config: NodeConfiguration, logger: DTSLOG
+) -> InteractiveRemoteSession:
+ return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_remote_session.py
rename to dts/framework/remote_session/interactive_remote_session.py
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_shell.py
rename to dts/framework/remote_session/interactive_shell.py
diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/python_shell.py
rename to dts/framework/remote_session/python_shell.py
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 06403691a5..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-# Copyright(c) 2023 University of New Hampshire
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .interactive_remote_session import InteractiveRemoteSession
-from .interactive_shell import InteractiveShell
-from .python_shell import PythonShell
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-from .testpmd_shell import TestPmdDevice, TestPmdShell
-
-
-def create_remote_session(
- node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
- return SSHSession(node_config, name, logger)
-
-
-def create_interactive_session(
- node_config: NodeConfiguration, logger: DTSLOG
-) -> InteractiveRemoteSession:
- return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 91%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
index 1a7ee649ab..a467033a13 100644
--- a/dts/framework/remote_session/remote/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -18,9 +18,7 @@
SSHException,
)
-from framework.config import NodeConfiguration
from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
-from framework.logger import DTSLOG
from .remote_session import CommandResult, RemoteSession
@@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
session: Connection
- def __init__(
- self,
- node_config: NodeConfiguration,
- session_name: str,
- logger: DTSLOG,
- ):
- super(SSHSession, self).__init__(node_config, session_name, logger)
-
def _connect(self) -> None:
errors = []
retry_attempts = 10
@@ -111,7 +101,7 @@ def _send_command(self, command: str, timeout: float, env: dict | None) -> Comma
except CommandTimedOut as e:
self._logger.exception(e)
- raise SSHTimeoutError(command, e.result.stderr) from e
+ raise SSHTimeoutError(command) from e
return CommandResult(self.name, command, output.stdout, output.stderr, output.return_code)
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/testpmd_shell.py
rename to dts/framework/remote_session/testpmd_shell.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 974793a11a..25b5dcff22 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, TypeVar
@@ -22,8 +22,8 @@ def __init__(
option_strings: Sequence[str],
dest: str,
nargs: str | int | None = None,
- const: str | None = None,
- default: str = None,
+ const: bool | None = None,
+ default: Any = None,
type: Callable[[str], _T | argparse.FileType | None] = None,
choices: Iterable[_T] | None = None,
required: bool = False,
@@ -32,6 +32,12 @@ def __init__(
) -> None:
env_var_value = os.environ.get(env_var)
default = env_var_value or default
+ if const is not None:
+ nargs = 0
+ default = const if env_var_value else default
+ type = None
+ choices = None
+ metavar = None
super(_EnvironmentArgument, self).__init__(
option_strings,
dest,
@@ -52,22 +58,28 @@ def __call__(
values: Any,
option_string: str = None,
) -> None:
- setattr(namespace, self.dest, values)
+ if self.const is not None:
+ setattr(namespace, self.dest, self.const)
+ else:
+ setattr(namespace, self.dest, values)
return _EnvironmentArgument
-@dataclass(slots=True, frozen=True)
-class _Settings:
- config_file_path: str
- output_dir: str
- timeout: float
- verbose: bool
- skip_setup: bool
- dpdk_tarball_path: Path
- compile_timeout: float
- test_cases: list
- re_run: int
+@dataclass(slots=True)
+class Settings:
+ config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ output_dir: str = "output"
+ timeout: float = 15
+ verbose: bool = False
+ skip_setup: bool = False
+ dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ compile_timeout: float = 1200
+ test_cases: list[str] = field(default_factory=list)
+ re_run: int = 0
+
+
+SETTINGS: Settings = Settings()
def _get_parser() -> argparse.ArgumentParser:
@@ -80,7 +92,8 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--config-file",
action=_env_arg("DTS_CFG_FILE"),
- default="conf.yaml",
+ default=SETTINGS.config_file_path,
+ type=Path,
help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs and targets.",
)
@@ -88,7 +101,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--output-dir",
"--output",
action=_env_arg("DTS_OUTPUT_DIR"),
- default="output",
+ default=SETTINGS.output_dir,
help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
)
@@ -96,7 +109,7 @@ def _get_parser() -> argparse.ArgumentParser:
"-t",
"--timeout",
action=_env_arg("DTS_TIMEOUT"),
- default=15,
+ default=SETTINGS.timeout,
type=float,
help="[DTS_TIMEOUT] The default timeout for all DTS operations except for compiling DPDK.",
)
@@ -105,8 +118,9 @@ def _get_parser() -> argparse.ArgumentParser:
"-v",
"--verbose",
action=_env_arg("DTS_VERBOSE"),
- default="N",
- help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
+ default=SETTINGS.verbose,
+ const=True,
+ help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
"to the console.",
)
@@ -114,8 +128,8 @@ def _get_parser() -> argparse.ArgumentParser:
"-s",
"--skip-setup",
action=_env_arg("DTS_SKIP_SETUP"),
- default="N",
- help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
+ const=True,
+ help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
)
parser.add_argument(
@@ -123,7 +137,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--snapshot",
"--git-ref",
action=_env_arg("DTS_DPDK_TARBALL"),
- default="dpdk.tar.xz",
+ default=SETTINGS.dpdk_tarball_path,
type=Path,
help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
"tag ID or tree ID to test. To test local changes, first commit them, "
@@ -133,7 +147,7 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--compile-timeout",
action=_env_arg("DTS_COMPILE_TIMEOUT"),
- default=1200,
+ default=SETTINGS.compile_timeout,
type=float,
help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
)
@@ -150,7 +164,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--re-run",
"--re_run",
action=_env_arg("DTS_RERUN"),
- default=0,
+ default=SETTINGS.re_run,
type=int,
help="[DTS_RERUN] Re-run each test case the specified amount of times "
"if a test failure occurs",
@@ -159,21 +173,20 @@ def _get_parser() -> argparse.ArgumentParser:
return parser
-def _get_settings() -> _Settings:
+def get_settings() -> Settings:
parsed_args = _get_parser().parse_args()
- return _Settings(
+ return Settings(
config_file_path=parsed_args.config_file,
output_dir=parsed_args.output_dir,
timeout=parsed_args.timeout,
- verbose=(parsed_args.verbose == "Y"),
- skip_setup=(parsed_args.skip_setup == "Y"),
- dpdk_tarball_path=Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
- if not os.path.exists(parsed_args.tarball)
- else Path(parsed_args.tarball),
+ verbose=parsed_args.verbose,
+ skip_setup=parsed_args.skip_setup,
+ dpdk_tarball_path=Path(
+ Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
+ if not os.path.exists(parsed_args.tarball)
+ else Path(parsed_args.tarball)
+ ),
compile_timeout=parsed_args.compile_timeout,
- test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+ test_cases=(parsed_args.test_cases.split(",") if parsed_args.test_cases else []),
re_run=parsed_args.re_run,
)
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 4c2e7e2418..57090feb04 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -246,7 +246,7 @@ def add_build_target(self, build_target: BuildTargetConfiguration) -> BuildTarge
self._inner_results.append(build_target_result)
return build_target_result
- def add_sut_info(self, sut_info: NodeInfo):
+ def add_sut_info(self, sut_info: NodeInfo) -> None:
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
@@ -289,7 +289,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
self._inner_results.append(execution_result)
return execution_result
- def add_error(self, error) -> None:
+ def add_error(self, error: Exception) -> None:
self._errors.append(error)
def process(self) -> None:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 4a7907ec33..f9e66e814a 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -11,7 +11,7 @@
import re
from ipaddress import IPv4Interface, IPv6Interface, ip_interface
from types import MethodType
-from typing import Union
+from typing import Any, Union
from scapy.layers.inet import IP # type: ignore[import]
from scapy.layers.l2 import Ether # type: ignore[import]
@@ -26,8 +26,7 @@
from .logger import DTSLOG, getLogger
from .settings import SETTINGS
from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
-from .testbed_model import SutNode, TGNode
-from .testbed_model.hw.port import Port, PortLink
+from .testbed_model import Port, PortLink, SutNode, TGNode
from .utils import get_packet_summaries
@@ -426,7 +425,7 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
- def is_test_suite(object) -> bool:
+ def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
return True
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 5cbb859e47..8ced05653b 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,15 +9,9 @@
# pylama:ignore=W0611
-from .hw import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
- VirtualDevice,
- lcore_filter,
-)
+from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
from .node import Node
+from .port import Port, PortLink
from .sut_node import SutNode
from .tg_node import TGNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/testbed_model/cpu.py
index cbc5fe7fff..1b392689f5 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -262,3 +262,16 @@ def filter(self) -> list[LogicalCore]:
)
return filtered_lcores
+
+
+def lcore_filter(
+ core_list: list[LogicalCore],
+ filter_specifier: LogicalCoreCount | LogicalCoreList,
+ ascending: bool,
+) -> LogicalCoreFilter:
+ if isinstance(filter_specifier, LogicalCoreList):
+ return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+ elif isinstance(filter_specifier, LogicalCoreCount):
+ return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+ else:
+ raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
- core_list: list[LogicalCore],
- filter_specifier: LogicalCoreCount | LogicalCoreList,
- ascending: bool,
-) -> LogicalCoreFilter:
- if isinstance(filter_specifier, LogicalCoreList):
- return LogicalCoreListFilter(core_list, filter_specifier, ascending)
- elif isinstance(filter_specifier, LogicalCoreCount):
- return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
- else:
- raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
similarity index 97%
rename from dts/framework/remote_session/linux_session.py
rename to dts/framework/testbed_model/linux_session.py
index fd877fbfae..055765ba2d 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -9,10 +9,10 @@
from typing_extensions import NotRequired
from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
from framework.utils import expand_range
+from .cpu import LogicalCore
+from .port import Port
from .posix_session import PosixSession
@@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
lcores.append(LogicalCore(lcore, core, socket, node))
return lcores
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return dpdk_prefix
def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index ef700d8114..b313b5ad54 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,23 +12,26 @@
from typing import Any, Callable, Type, Union
from framework.config import (
+ OS,
BuildTargetConfiguration,
ExecutionConfiguration,
NodeConfiguration,
)
+from framework.exception import ConfigurationError
from framework.logger import DTSLOG, getLogger
-from framework.remote_session import InteractiveShellType, OSSession, create_session
from framework.settings import SETTINGS
-from .hw import (
+from .cpu import (
LogicalCore,
LogicalCoreCount,
LogicalCoreList,
LogicalCoreListFilter,
- VirtualDevice,
lcore_filter,
)
-from .hw.port import Port
+from .linux_session import LinuxSession
+from .os_session import InteractiveShellType, OSSession
+from .port import Port
+from .virtual_device import VirtualDevice
class Node(ABC):
@@ -168,9 +171,9 @@ def create_interactive_shell(
return self.main_session.create_interactive_shell(
shell_cls,
- app_args,
timeout,
privileged,
+ app_args,
)
def filter_lcores(
@@ -201,7 +204,7 @@ def _get_remote_cpus(self) -> None:
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
- def _setup_hugepages(self):
+ def _setup_hugepages(self) -> None:
"""
Setup hugepages on the Node. Different architectures can supply different
amounts of memory for hugepages and numa-based hugepage allocation may need
@@ -245,3 +248,11 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
return lambda *args: None
else:
return func
+
+
+def create_session(node_config: NodeConfiguration, name: str, logger: DTSLOG) -> OSSession:
+ match node_config.os:
+ case OS.linux:
+ return LinuxSession(node_config, name, logger)
+ case _:
+ raise ConfigurationError(f"Unsupported OS {node_config.os}")
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
similarity index 95%
rename from dts/framework/remote_session/os_session.py
rename to dts/framework/testbed_model/os_session.py
index 8a709eac1c..76e595a518 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -10,19 +10,19 @@
from framework.config import Architecture, NodeConfiguration, NodeInfo
from framework.logger import DTSLOG
-from framework.remote_session.remote import InteractiveShell
-from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
-from framework.utils import MesonArgs
-
-from .remote import (
+from framework.remote_session import (
CommandResult,
InteractiveRemoteSession,
+ InteractiveShell,
RemoteSession,
create_interactive_session,
create_remote_session,
)
+from framework.settings import SETTINGS
+from framework.utils import MesonArgs
+
+from .cpu import LogicalCore
+from .port import Port
InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
@@ -85,9 +85,9 @@ def send_command(
def create_interactive_shell(
self,
shell_cls: Type[InteractiveShellType],
- eal_parameters: str,
timeout: float,
privileged: bool,
+ app_args: str,
) -> InteractiveShellType:
"""
See "create_interactive_shell" in SutNode
@@ -96,7 +96,7 @@ def create_interactive_shell(
self.interactive_session.session,
self._logger,
self._get_privileged_command if privileged else None,
- eal_parameters,
+ app_args,
timeout,
)
@@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
"""
@abstractmethod
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
"""
Try to find DPDK remote dir in remote_dir.
"""
@@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
"""
@abstractmethod
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
"""
Get the DPDK file prefix that will be used when running DPDK apps.
"""
diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
similarity index 100%
rename from dts/framework/testbed_model/hw/port.py
rename to dts/framework/testbed_model/port.py
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
similarity index 98%
rename from dts/framework/remote_session/posix_session.py
rename to dts/framework/testbed_model/posix_session.py
index a29e2e8280..5657cc0bc9 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
@@ -207,7 +207,7 @@ def _remove_dpdk_runtime_dirs(self, dpdk_runtime_dirs: Iterable[str | PurePath])
for dpdk_runtime_dir in dpdk_runtime_dirs:
self.remove_remote_dir(dpdk_runtime_dir)
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return ""
def get_compiler_version(self, compiler_name: str) -> str:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 7f75043bd3..5ce9446dba 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,12 +15,14 @@
NodeInfo,
SutNodeConfiguration,
)
-from framework.remote_session import CommandResult, InteractiveShellType, OSSession
+from framework.remote_session import CommandResult
from framework.settings import SETTINGS
from framework.utils import MesonArgs
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
+from .cpu import LogicalCoreCount, LogicalCoreList
from .node import Node
+from .os_session import InteractiveShellType, OSSession
+from .virtual_device import VirtualDevice
class EalParameters(object):
@@ -293,7 +295,7 @@ def create_eal_parameters(
prefix: str = "dpdk",
append_prefix_timestamp: bool = True,
no_pci: bool = False,
- vdevs: list[VirtualDevice] = None,
+ vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
"""
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 79a55663b5..8a8f0019f3 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -16,16 +16,11 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.config import (
- ScapyTrafficGeneratorConfig,
- TGNodeConfiguration,
- TrafficGeneratorType,
-)
-from framework.exception import ConfigurationError
-
-from .capturing_traffic_generator import CapturingTrafficGenerator
-from .hw.port import Port
+from framework.config import TGNodeConfiguration
+
from .node import Node
+from .port import Port
+from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
class TGNode(Node):
@@ -78,19 +73,3 @@ def close(self) -> None:
"""Free all resources used by the node"""
self.traffic_generator.close()
super(TGNode, self).close()
-
-
-def create_traffic_generator(
- tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
-) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
-
- from .scapy import ScapyTrafficGenerator
-
- match traffic_generator_config.traffic_generator_type:
- case TrafficGeneratorType.SCAPY:
- return ScapyTrafficGenerator(tg_node, traffic_generator_config)
- case _:
- raise ConfigurationError(
- f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
- )
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
new file mode 100644
index 0000000000..52888d03fa
--- /dev/null
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -0,0 +1,23 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
+from framework.exception import ConfigurationError
+from framework.testbed_model.node import Node
+
+from .capturing_traffic_generator import CapturingTrafficGenerator
+from .scapy import ScapyTrafficGenerator
+
+
+def create_traffic_generator(
+ tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
+) -> CapturingTrafficGenerator:
+ """A factory function for creating traffic generator object from user config."""
+
+ match traffic_generator_config.traffic_generator_type:
+ case TrafficGeneratorType.SCAPY:
+ return ScapyTrafficGenerator(tg_node, traffic_generator_config)
+ case _:
+ raise ConfigurationError(
+ f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
+ )
diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
similarity index 98%
rename from dts/framework/testbed_model/capturing_traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index e6512061d7..1fc7f98c05 100644
--- a/dts/framework/testbed_model/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -16,9 +16,9 @@
from scapy.packet import Packet # type: ignore[import]
from framework.settings import SETTINGS
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
from .traffic_generator import TrafficGenerator
@@ -127,7 +127,7 @@ def _send_packets_and_capture(
for the specified duration. It must be able to handle no received packets.
"""
- def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
+ def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]) -> None:
file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
self._logger.debug(f"Writing packets to {file_name}.")
scapy.utils.wrpcap(file_name, packets)
diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
similarity index 95%
rename from dts/framework/testbed_model/scapy.py
rename to dts/framework/testbed_model/traffic_generator/scapy.py
index 9083e92b3d..c88cf28369 100644
--- a/dts/framework/testbed_model/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -24,16 +24,15 @@
from scapy.packet import Packet # type: ignore[import]
from framework.config import OS, ScapyTrafficGeneratorConfig
-from framework.logger import DTSLOG, getLogger
from framework.remote_session import PythonShell
from framework.settings import SETTINGS
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from .capturing_traffic_generator import (
CapturingTrafficGenerator,
_get_default_capture_name,
)
-from .hw.port import Port
-from .tg_node import TGNode
"""
========= BEGIN RPC FUNCTIONS =========
@@ -144,7 +143,7 @@ def quit(self) -> None:
self._BaseServer__shutdown_request = True
return None
- def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
+ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
"""Add a function to the server.
This is meant to be executed remotely.
@@ -189,13 +188,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
session: PythonShell
rpc_server_proxy: xmlrpc.client.ServerProxy
_config: ScapyTrafficGeneratorConfig
- _tg_node: TGNode
- _logger: DTSLOG
- def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
- self._config = config
- self._tg_node = tg_node
- self._logger = getLogger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+ def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ super().__init__(tg_node, config)
assert (
self._tg_node.config.os == OS.linux
@@ -229,7 +224,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
function_bytes = marshal.dumps(function.__code__)
self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
- def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
+ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
# load the source of the function
src = inspect.getsource(QuittableXMLRPCServer)
# Lines with only whitespace break the repl if in the middle of a function
@@ -271,7 +266,7 @@ def _send_packets_and_capture(
scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
return scapy_packets
- def close(self):
+ def close(self) -> None:
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
similarity index 81%
rename from dts/framework/testbed_model/traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 28c35d3ce4..0d9902ddb7 100644
--- a/dts/framework/testbed_model/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -12,11 +12,12 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.logger import DTSLOG
+from framework.config import TrafficGeneratorConfig
+from framework.logger import DTSLOG, getLogger
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
-
class TrafficGenerator(ABC):
"""The base traffic generator.
@@ -24,8 +25,15 @@ class TrafficGenerator(ABC):
Defines the few basic methods that each traffic generator must implement.
"""
+ _config: TrafficGeneratorConfig
+ _tg_node: Node
_logger: DTSLOG
+ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ self._config = config
+ self._tg_node = tg_node
+ self._logger = getLogger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+
def send_packet(self, packet: Packet, port: Port) -> None:
"""Send a packet and block until it is fully sent.
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index d098d364ff..a0f2173949 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -7,7 +7,6 @@
import json
import os
import subprocess
-import sys
from enum import Enum
from pathlib import Path
from subprocess import SubprocessError
@@ -16,31 +15,7 @@
from .exception import ConfigurationError
-
-class StrEnum(Enum):
- @staticmethod
- def _generate_next_value_(name: str, start: int, count: int, last_values: object) -> str:
- return name
-
- def __str__(self) -> str:
- return self.name
-
-
-REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
-
-
-def check_dts_python_version() -> None:
- if sys.version_info.major < 3 or (sys.version_info.major == 3 and sys.version_info.minor < 10):
- print(
- RED(
- (
- "WARNING: DTS execution node's python version is lower than"
- "python 3.10, is deprecated and will not work in future releases."
- )
- ),
- file=sys.stderr,
- )
- print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
def expand_range(range_str: str) -> list[int]:
@@ -61,7 +36,7 @@ def expand_range(range_str: str) -> list[int]:
return expanded_range
-def get_packet_summaries(packets: list[Packet]):
+def get_packet_summaries(packets: list[Packet]) -> str:
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -69,8 +44,13 @@ def get_packet_summaries(packets: list[Packet]):
return f"Packet contents: \n{packet_summaries}"
-def RED(text: str) -> str:
- return f"\u001B[31;1m{str(text)}\u001B[0m"
+class StrEnum(Enum):
+ @staticmethod
+ def _generate_next_value_(name: str, start: int, count: int, last_values: object) -> str:
+ return name
+
+ def __str__(self) -> str:
+ return self.name
class MesonArgs(object):
@@ -215,5 +195,5 @@ def _delete_tarball(self) -> None:
if self._tarball_path and os.path.exists(self._tarball_path):
os.remove(self._tarball_path)
- def __fspath__(self):
+ def __fspath__(self) -> str:
return str(self._tarball_path)
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..5d4714b0c3 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,17 @@
import logging
-from framework import dts
+from framework import settings
def main() -> None:
+ """Set DTS settings, then run DTS.
+
+ The DTS settings are taken from the command line arguments and the environment variables.
+ """
+ settings.SETTINGS = settings.get_settings()
+ from framework import dts
+
dts.run_all()
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
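The change to dts/main.py above defers importing framework.dts until after SETTINGS has been populated, so that any module-level code reading the settings sees the final values rather than defaults. A self-contained sketch of that ordering (the names below are stand-ins for the real modules, not the DTS API):

```python
# Stand-in for framework.settings.SETTINGS.
settings_store: dict = {}


def get_settings() -> dict:
    # In DTS this parses command-line arguments and environment variables.
    return {"output_dir": "output", "verbose": False}


def run_dts() -> str:
    # Stand-in for framework.dts.run_all(); the real module reads
    # SETTINGS when it is imported, so it must be imported late.
    assert settings_store, "settings must be populated before running"
    return f"running with {settings_store['output_dir']}"


def main() -> str:
    # Populate the settings first...
    settings_store.update(get_settings())
    # ...then (in DTS: `from framework import dts`) use the consumer.
    return run_dts()
```

This is also why Sphinx can now import the modules one-by-one: nothing at import time depends on arguments that were never parsed.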
* [PATCH v8 02/21] dts: add docstring checker
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 01/21] dts: code adjustments for doc generation Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 03/21] dts: add basic developer docs Juraj Linkeš
` (20 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Python docstrings are the in-code way to document the code. The
docstring checker of choice is pydocstyle which we're executing from
Pylama, but the current latest versions are not compatible due to [0],
so pin the pydocstyle version to the latest working version.
[0] https://github.com/klen/pylama/issues/232
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 12 ++++++------
dts/pyproject.toml | 6 +++++-
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index f7b3b6d602..a734fa71f0 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -489,20 +489,20 @@ files = [
[[package]]
name = "pydocstyle"
-version = "6.3.0"
+version = "6.1.1"
description = "Python docstring style checker"
optional = false
python-versions = ">=3.6"
files = [
- {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
- {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+ {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
+ {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
]
[package.dependencies]
-snowballstemmer = ">=2.2.0"
+snowballstemmer = "*"
[package.extras]
-toml = ["tomli (>=1.2.3)"]
+toml = ["toml"]
[[package]]
name = "pyflakes"
@@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
+content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 980ac3c7db..37a692d655 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -25,6 +25,7 @@ PyYAML = "^6.0"
types-PyYAML = "^6.0.8"
fabric = "^2.7.1"
scapy = "^2.5.0"
+pydocstyle = "6.1.1"
[tool.poetry.group.dev.dependencies]
mypy = "^0.961"
@@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
[tool.pylama]
-linters = "mccabe,pycodestyle,pyflakes"
+linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
format = "pylint"
max_line_length = 100
+[tool.pylama.linter.pydocstyle]
+convention = "google"
+
[tool.mypy]
python_version = "3.10"
enable_error_code = ["ignore-without-code"]
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
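With the `google` convention enabled in pyproject.toml above, pydocstyle expects structured sections such as ``Args:`` and ``Returns:``. A hypothetical docstring that passes the check might look like this (the function body is illustrative only, not the DTS implementation):

```python
"""Example module documented in the Google style.

The pinned pydocstyle, executed through Pylama, validates docstrings
like the ones in this module.
"""


def expand_range(range_str: str) -> list[int]:
    """Expand a range string into a list of integers.

    Args:
        range_str: A string such as ``"0-2"``; a bare number expands
            to a single-element list.

    Returns:
        The expanded list of integers.
    """
    start, _, end = range_str.partition("-")
    return list(range(int(start), int(end or start) + 1))
```

Missing sections, missing summary lines, or malformed indentation in such docstrings are what the new linter configuration reports.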
* [PATCH v8 03/21] dts: add basic developer docs
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 01/21] dts: code adjustments for doc generation Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 02/21] dts: add docstring checker Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 04/21] dts: exceptions docstring update Juraj Linkeš
` (19 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Expand the framework contribution guidelines and add how to document the
code with Python docstrings.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
1 file changed, 73 insertions(+)
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 32c18ee472..cd771a428c 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
The results contain basic statistics of passed/failed test cases and DPDK version.
+Contributing to DTS
+-------------------
+
+There are two areas of contribution: The DTS framework and DTS test suites.
+
+The framework contains the logic needed to run test cases, such as connecting to nodes,
+running DPDK apps and collecting results.
+
+The test cases call APIs from the framework to test their scenarios. Adding test cases may
+require adding code to the framework as well.
+
+
+Framework Coding Guidelines
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When adding code to the DTS framework, pay attention to the rest of the code
+and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
+warnings when some of the basics are not met.
+
+The code must be properly documented with docstrings. The style must conform to
+the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
+See an example of the style
+`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
+For cases which are not covered by the Google style, refer
+to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
+the two style guides, where we deviate or where some additional clarification is helpful:
+
+ * The __init__() methods of classes are documented separately from the docstring of the class
+ itself.
+ * The docstrings of implemented abstract methods should refer to the superclass's definition
+ if there's no deviation.
+ * Instance variables/attributes should be documented in the docstring of the class
+ in the ``Attributes:`` section.
+ * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
+ attributes which result in instance variables/attributes should also be recorded
+ in the ``Attributes:`` section.
+ * Class variables/attributes, on the other hand, should be documented with ``#:`` above
+ the type annotated line. The description may be omitted if the meaning is obvious.
+ * The Enum and TypedDict also process the attributes in particular ways and should be documented
+ with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
+ * When referencing a parameter of a function or a method in their docstring, don't use
+ any articles and put the parameter into single backticks. This mimics the style of
+ `Python's documentation <https://docs.python.org/3/index.html>`_.
+ * When specifying a value, use double backticks::
+
+ def foo(greet: bool) -> None:
+ """Demonstration of single and double backticks.
+
+ `greet` controls whether ``Hello World`` is printed.
+
+ Args:
+ greet: Whether to print the ``Hello World`` message.
+ """
+ if greet:
+ print("Hello World")
+
+ * The docstring maximum line length is the same as the code maximum line length.
+
+
How To Write a Test Suite
-------------------------
@@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
| These methods don't need to be implemented if there's no need for them in a test suite.
In that case, nothing will happen when they're executed.
+#. **Configuration, traffic and other logic**
+
+ The ``TestSuite`` class contains a variety of methods for anything that
+ a test suite setup, a teardown, or a test case may need to do.
+
+ The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
+ and use the interactive shell instances directly.
+
+ These are the two main ways to call the framework logic in test suites. If there's any
+ functionality or logic missing from the framework, it should be implemented so that
+ the test suites can use one of these two ways.
+
#. **Test case verification**
Test case verification should be done with the ``verify`` method, which records the result.
@@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
and used by the test suite via the ``sut_node`` field.
+.. _dts_dev_tools:
+
DTS Developer Tools
-------------------
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
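The conventions listed in the guidelines above (class variables documented with ``#:``, instance attributes under ``Attributes:``, single backticks for parameters and double backticks for values) can be combined in one class. A hypothetical example, not taken from the DTS framework:

```python
class HugepageConfig:
    """Example class following the DTS docstring conventions.

    Attributes:
        amount: The number of hugepages to allocate.
    """

    #: The hugepage size in kB used for the calculation.
    DEFAULT_SIZE_KB: int = 2048

    def __init__(self, amount: int) -> None:
        """Initialize the hugepage configuration.

        Args:
            amount: How many hugepages of ``DEFAULT_SIZE_KB`` kB to allocate.
        """
        self.amount = amount

    def total_kb(self) -> int:
        """Compute the total memory reserved for hugepages.

        Returns:
            `amount` multiplied by the hugepage size, in kB.
        """
        return self.amount * self.DEFAULT_SIZE_KB
```

Note how ``__init__()`` carries its own ``Args:`` section separate from the class docstring, as the guidelines require.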
* [PATCH v8 04/21] dts: exceptions docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (2 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 03/21] dts: add basic developer docs Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 05/21] dts: settings " Juraj Linkeš
` (18 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/__init__.py | 12 ++++-
dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
2 files changed, 83 insertions(+), 35 deletions(-)
diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
index d551ad4bf0..662e6ccad2 100644
--- a/dts/framework/__init__.py
+++ b/dts/framework/__init__.py
@@ -1,3 +1,13 @@
# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+
+"""Libraries and utilities for running DPDK Test Suite (DTS).
+
+The various modules in the DTS framework offer:
+
+* Connections to nodes, both interactive and non-interactive,
+* A straightforward way to add support for different operating systems of remote nodes,
+* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
+* Pre-test suite setup and post-test suite teardown.
+"""
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 151e4d3aa9..658eee2c38 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -3,8 +3,10 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-User-defined exceptions used across the framework.
+"""DTS exceptions.
+
+The exceptions all have different severities expressed as an integer.
+The highest severity of all raised exceptions is used as the exit code of DTS.
"""
from enum import IntEnum, unique
@@ -13,59 +15,79 @@
@unique
class ErrorSeverity(IntEnum):
- """
- The severity of errors that occur during DTS execution.
+ """The severity of errors that occur during DTS execution.
+
All exceptions are caught and the most severe error is used as return code.
"""
+ #:
NO_ERR = 0
+ #:
GENERIC_ERR = 1
+ #:
CONFIG_ERR = 2
+ #:
REMOTE_CMD_EXEC_ERR = 3
+ #:
SSH_ERR = 4
+ #:
DPDK_BUILD_ERR = 10
+ #:
TESTCASE_VERIFY_ERR = 20
+ #:
BLOCKING_TESTSUITE_ERR = 25
class DTSError(Exception):
- """
- The base exception from which all DTS exceptions are derived.
- Stores error severity.
+ """The base exception from which all DTS exceptions are subclassed.
+
+ Do not use this exception, only use subclassed exceptions.
"""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
class SSHTimeoutError(DTSError):
- """
- Command execution timeout.
- """
+ """The SSH execution of a command timed out."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_command: str
def __init__(self, command: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ command: The executed command.
+ """
self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self._command}"
+ """Add some context to the string representation."""
+ return f"{self._command} execution timed out."
class SSHConnectionError(DTSError):
- """
- SSH connection error.
- """
+ """An unsuccessful SSH connection."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
_errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ host: The hostname to which we're trying to connect.
+ errors: Any errors that occurred during the connection attempt.
+ """
self._host = host
self._errors = [] if errors is None else errors
def __str__(self) -> str:
+ """Include the errors in the string representation."""
message = f"Error trying to connect with {self._host}."
if self._errors:
message += f" Errors encountered while retrying: {', '.join(self._errors)}"
@@ -74,76 +96,92 @@ def __str__(self) -> str:
class SSHSessionDeadError(DTSError):
- """
- SSH session is not alive.
- It can no longer be used.
- """
+ """The SSH session is no longer alive."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
def __init__(self, host: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ host: The hostname of the disconnected node.
+ """
self._host = host
def __str__(self) -> str:
- return f"SSH session with {self._host} has died"
+ """Add some context to the string representation."""
+ return f"SSH session with {self._host} has died."
class ConfigurationError(DTSError):
- """
- Raised when an invalid configuration is encountered.
- """
+ """An invalid configuration."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
class RemoteCommandExecutionError(DTSError):
- """
- Raised when a command executed on a Node returns a non-zero exit status.
- """
+ """An unsuccessful execution of a remote command."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ #: The executed command.
command: str
_command_return_code: int
def __init__(self, command: str, command_return_code: int):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ command: The executed command.
+ command_return_code: The return code of the executed command.
+ """
self.command = command
self._command_return_code = command_return_code
def __str__(self) -> str:
+ """Include both the command and return code in the string representation."""
return f"Command {self.command} returned a non-zero exit code: {self._command_return_code}"
class RemoteDirectoryExistsError(DTSError):
- """
- Raised when a remote directory to be created already exists.
- """
+ """A remote directory to be created already exists."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
class DPDKBuildError(DTSError):
- """
- Raised when DPDK build fails for any reason.
- """
+ """A DPDK build failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
class TestCaseVerifyError(DTSError):
- """
- Used in test cases to verify the expected behavior.
- """
+ """A test case failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
class BlockingTestSuiteError(DTSError):
+ """A failure in a blocking test suite."""
+
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
_suite_name: str
def __init__(self, suite_name: str) -> None:
+ """Define the meaning of the first argument.
+
+ Args:
+ suite_name: The blocking test suite.
+ """
self._suite_name = suite_name
def __str__(self) -> str:
+ """Add some context to the string representation."""
return f"Blocking suite {self._suite_name} failed."
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
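[Editor's note: the severity-per-subclass pattern in the hunk above can be sketched standalone. The ErrorSeverity values below are illustrative placeholders, not the framework's actual numbering:]

```python
from enum import IntEnum, unique
from typing import ClassVar


@unique
class ErrorSeverity(IntEnum):
    """Illustrative severities; the framework defines more, with its own values."""

    NO_ERR = 0
    GENERIC_ERR = 1
    SSH_ERR = 2


class DTSError(Exception):
    """The base exception from which all DTS exceptions are subclassed."""

    #: Each subclass overrides the severity as a class variable.
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR


class SSHTimeoutError(DTSError):
    """The SSH execution of a command timed out."""

    severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR

    def __init__(self, command: str):
        self._command = command

    def __str__(self) -> str:
        return f"{self._command} execution timed out."
```

Because severity is a plain class variable, callers can rank any caught DTSError without isinstance checks.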
* [PATCH v8 05/21] dts: settings docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (3 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 04/21] dts: exceptions docstring update Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 06/21] dts: logger and utils " Juraj Linkeš
` (17 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/settings.py | 103 +++++++++++++++++++++++++++++++++++++-
1 file changed, 102 insertions(+), 1 deletion(-)
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 25b5dcff22..41f98e8519 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -3,6 +3,72 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+"""Environment variables and command line arguments parsing.
+
+This is a simple module utilizing the built-in argparse module to parse command line arguments,
+augment them with values from environment variables and make them available across the framework.
+
+The command line value takes precedence, followed by the environment variable value,
+followed by the default value defined in this module.
+
+The command line arguments along with the supported environment variables are:
+
+.. option:: --config-file
+.. envvar:: DTS_CFG_FILE
+
+ The path to the YAML test run configuration file.
+
+.. option:: --output-dir, --output
+.. envvar:: DTS_OUTPUT_DIR
+
+ The directory where DTS logs and results are saved.
+
+.. option:: --compile-timeout
+.. envvar:: DTS_COMPILE_TIMEOUT
+
+ The timeout for compiling DPDK.
+
+.. option:: -t, --timeout
+.. envvar:: DTS_TIMEOUT
+
+ The timeout for all DTS operations except for compiling DPDK.
+
+.. option:: -v, --verbose
+.. envvar:: DTS_VERBOSE
+
+ Set to any value to enable logging everything to the console.
+
+.. option:: -s, --skip-setup
+.. envvar:: DTS_SKIP_SETUP
+
+ Set to any value to skip building DPDK.
+
+.. option:: --tarball, --snapshot, --git-ref
+.. envvar:: DTS_DPDK_TARBALL
+
+ The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
+
+.. option:: --test-cases
+.. envvar:: DTS_TESTCASES
+
+ A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
+
+.. option:: --re-run, --re_run
+.. envvar:: DTS_RERUN
+
+ Re-run each test case this many times in case of a failure.
+
+The module provides one key module-level variable:
+
+Attributes:
+ SETTINGS: The module level variable storing framework-wide DTS settings.
+
+Typical usage example::
+
+ from framework.settings import SETTINGS
+ foo = SETTINGS.foo
+"""
+
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
@@ -16,6 +82,23 @@
def _env_arg(env_var: str) -> Any:
+ """A helper method augmenting the argparse Action with environment variables.
+
+ If the supplied environment variable is defined, then the default value
+ of the argument is modified. This satisfies the priority order of
+ command line argument > environment variable > default value.
+
+ Arguments with no values (flags) should be defined using the const keyword argument
+ (True or False). When the argument is specified, it will be set to const; if not specified,
+ the default will be stored (possibly modified by the corresponding environment variable).
+
+ Other arguments work the same as default argparse arguments, that is using
+ the default 'store' action.
+
+ Returns:
+ The modified argparse.Action.
+ """
+
class _EnvironmentArgument(argparse.Action):
def __init__(
self,
@@ -68,14 +151,28 @@ def __call__(
@dataclass(slots=True)
class Settings:
+ """Default framework-wide user settings.
+
+ The defaults may be modified at the start of the run.
+ """
+
+ #:
config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ #:
output_dir: str = "output"
+ #:
timeout: float = 15
+ #:
verbose: bool = False
+ #:
skip_setup: bool = False
+ #:
dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ #:
compile_timeout: float = 1200
+ #:
test_cases: list[str] = field(default_factory=list)
+ #:
re_run: int = 0
@@ -166,7 +263,7 @@ def _get_parser() -> argparse.ArgumentParser:
action=_env_arg("DTS_RERUN"),
default=SETTINGS.re_run,
type=int,
- help="[DTS_RERUN] Re-run each test case the specified amount of times "
+ help="[DTS_RERUN] Re-run each test case the specified number of times "
"if a test failure occurs",
)
@@ -174,6 +271,10 @@ def _get_parser() -> argparse.ArgumentParser:
def get_settings() -> Settings:
+ """Create new settings with inputs from the user.
+
+ The inputs are taken from the command line and from environment variables.
+ """
parsed_args = _get_parser().parse_args()
return Settings(
config_file_path=parsed_args.config_file,
--
2.34.1
* [PATCH v8 06/21] dts: logger and utils docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (4 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 05/21] dts: settings " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 07/21] dts: dts runner and main " Juraj Linkeš
` (16 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/logger.py | 72 ++++++++++++++++++++++-----------
dts/framework/utils.py | 88 +++++++++++++++++++++++++++++------------
2 files changed, 113 insertions(+), 47 deletions(-)
diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index bb2991e994..cfa6e8cd72 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -3,9 +3,9 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-DTS logger module with several log level. DTS framework and TestSuite logs
-are saved in different log files.
+"""DTS logger module.
+
+DTS framework and TestSuite logs are saved in different log files.
"""
import logging
@@ -18,19 +18,21 @@
stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
-class LoggerDictType(TypedDict):
- logger: "DTSLOG"
- name: str
- node: str
-
+class DTSLOG(logging.LoggerAdapter):
+ """DTS logger adapter class for framework and testsuites.
-# List for saving all using loggers
-Loggers: list[LoggerDictType] = []
+ The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
+ variable control the verbosity of output. If enabled, all messages will be emitted to the
+ console.
+ The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+ variable modify the directory where the logs will be stored.
-class DTSLOG(logging.LoggerAdapter):
- """
- DTS log class for framework and testsuite.
+ Attributes:
+ node: The additional identifier. Currently unused.
+ sh: The handler which emits logs to console.
+ fh: The handler which emits logs to a file.
+ verbose_fh: Just as fh, but logs with a different, more verbose, format.
"""
_logger: logging.Logger
@@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
verbose_fh: logging.FileHandler
def __init__(self, logger: logging.Logger, node: str = "suite"):
+ """Extend the constructor with additional handlers.
+
+ One handler logs to the console, the other one to a file, with either a regular or verbose
+ format.
+
+ Args:
+ logger: The logger from which to create the logger adapter.
+ node: An additional identifier. Currently unused.
+ """
self._logger = logger
# 1 means log everything, this will be used by file handlers if their level
# is not set
@@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
def logger_exit(self) -> None:
- """
- Remove stream handler and logfile handler.
- """
+ """Remove the stream handler and the logfile handler."""
for handler in (self.sh, self.fh, self.verbose_fh):
handler.flush()
self._logger.removeHandler(handler)
+class _LoggerDictType(TypedDict):
+ logger: DTSLOG
+ name: str
+ node: str
+
+
+# List for saving all loggers in use
+_Loggers: list[_LoggerDictType] = []
+
+
def getLogger(name: str, node: str = "suite") -> DTSLOG:
+ """Get DTS logger adapter identified by name and node.
+
+ An existing logger will be returned if one with the exact name and node already exists.
+ A new one will be created and stored otherwise.
+
+ Args:
+ name: The name of the logger.
+ node: An additional identifier for the logger.
+
+ Returns:
+ A logger uniquely identified by both name and node.
"""
- Get logger handler and if there's no handler for specified Node will create one.
- """
- global Loggers
+ global _Loggers
# return saved logger
- logger: LoggerDictType
- for logger in Loggers:
+ logger: _LoggerDictType
+ for logger in _Loggers:
if logger["name"] == name and logger["node"] == node:
return logger["logger"]
# return new logger
dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
- Loggers.append({"logger": dts_logger, "name": name, "node": node})
+ _Loggers.append({"logger": dts_logger, "name": name, "node": node})
return dts_logger
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index a0f2173949..cc5e458cc8 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -3,6 +3,16 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Various utility classes and functions.
+
+These are used in multiple modules across the framework. They're here because
+they provide some non-specific functionality, greatly simplify imports or just don't
+fit elsewhere.
+
+Attributes:
+ REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
+"""
+
import atexit
import json
import os
@@ -19,12 +29,20 @@
def expand_range(range_str: str) -> list[int]:
- """
- Process range string into a list of integers. There are two possible formats:
- n - a single integer
- n-m - a range of integers
+ """Process `range_str` into a list of integers.
+
+ There are two possible formats of `range_str`:
+
+ * ``n`` - a single integer,
+ * ``n-m`` - a range of integers.
- The returned range includes both n and m. Empty string returns an empty list.
+ The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
+
+ Args:
+ range_str: The range to expand.
+
+ Returns:
+ All the numbers from the range.
"""
expanded_range: list[int] = []
if range_str:
@@ -37,6 +55,14 @@ def expand_range(range_str: str) -> list[int]:
def get_packet_summaries(packets: list[Packet]) -> str:
+ """Format a string summary from `packets`.
+
+ Args:
+ packets: The packets to format.
+
+ Returns:
+ The summary of `packets`.
+ """
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -45,27 +71,36 @@ def get_packet_summaries(packets: list[Packet]) -> str:
class StrEnum(Enum):
+ """Enum with members stored as strings."""
+
@staticmethod
def _generate_next_value_(name: str, start: int, count: int, last_values: object) -> str:
return name
def __str__(self) -> str:
+ """The string representation is the name of the member."""
return self.name
class MesonArgs(object):
- """
- Aggregate the arguments needed to build DPDK:
- default_library: Default library type, Meson allows "shared", "static" and "both".
- Defaults to None, in which case the argument won't be used.
- Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
- Do not use -D with them, for example:
- meson_args = MesonArgs(enable_kmods=True).
- """
+ """Aggregate the arguments needed to build DPDK."""
_default_library: str
def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
+ """Initialize the meson arguments.
+
+ Args:
+ default_library: The default library type, Meson supports ``shared``, ``static`` and
+ ``both``. Defaults to :data:`None`, in which case the argument won't be used.
+ dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Example:
+ ::
+
+ meson_args = MesonArgs(enable_kmods=True)
+ """
self._default_library = f"--default-library={default_library}" if default_library else ""
self._dpdk_args = " ".join(
(
@@ -75,6 +110,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
)
def __str__(self) -> str:
+ """The string representation of the arguments."""
return " ".join(f"{self._default_library} {self._dpdk_args}".split())
@@ -96,24 +132,14 @@ class _TarCompressionFormat(StrEnum):
class DPDKGitTarball(object):
- """Create a compressed tarball of DPDK from the repository.
-
- The DPDK version is specified with git object git_ref.
- The tarball will be compressed with _TarCompressionFormat,
- which must be supported by the DTS execution environment.
- The resulting tarball will be put into output_dir.
+ """Compressed tarball of DPDK from the repository.
- The class supports the os.PathLike protocol,
+ The class supports the :class:`os.PathLike` protocol,
which is used to get the Path of the tarball::
from pathlib import Path
tarball = DPDKGitTarball("HEAD", "output")
tarball_path = Path(tarball)
-
- Arguments:
- git_ref: A git commit ID, tag ID or tree ID.
- output_dir: The directory where to put the resulting tarball.
- tar_compression_format: The compression format to use.
"""
_git_ref: str
@@ -128,6 +154,17 @@ def __init__(
output_dir: str,
tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
):
+ """Create the tarball during initialization.
+
+ The DPDK version is specified with `git_ref`. The tarball will be compressed with
+ `tar_compression_format`, which must be supported by the DTS execution environment.
+ The resulting tarball will be put into `output_dir`.
+
+ Args:
+ git_ref: A git commit ID, tag ID or tree ID.
+ output_dir: The directory in which to put the resulting tarball.
+ tar_compression_format: The compression format to use.
+ """
self._git_ref = git_ref
self._tar_compression_format = tar_compression_format
@@ -196,4 +233,5 @@ def _delete_tarball(self) -> None:
os.remove(self._tarball_path)
def __fspath__(self) -> str:
+ """The os.PathLike protocol implementation."""
return str(self._tarball_path)
--
2.34.1
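[Editor's note: the expand_range behavior documented in the utils.py hunk above can be sketched as follows, consistent with the docstring; the framework's own implementation may differ in details:]

```python
def expand_range(range_str: str) -> list[int]:
    """Expand ``n`` or ``n-m`` (inclusive on both ends) into a list of integers.

    An empty string yields an empty list.
    """
    expanded_range: list[int] = []
    if range_str:
        range_boundaries = range_str.split("-")
        # A single number splits into one boundary; "n-m" into two.
        # Using the first and last boundary covers both formats.
        expanded_range.extend(
            range(int(range_boundaries[0]), int(range_boundaries[-1]) + 1)
        )
    return expanded_range
```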
* [PATCH v8 07/21] dts: dts runner and main docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (5 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 06/21] dts: logger and utils " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 08/21] dts: test suite " Juraj Linkeš
` (15 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/dts.py | 131 ++++++++++++++++++++++++++++++++++++-------
dts/main.py | 10 ++--
2 files changed, 116 insertions(+), 25 deletions(-)
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 356368ef10..e16d4578a0 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,33 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+r"""Test suite runner module.
+
+A DTS run is split into stages:
+
+ #. Execution stage,
+ #. Build target stage,
+ #. Test suite stage,
+ #. Test case stage.
+
+The module is responsible for running tests on testbeds defined in the test run configuration.
+Each setup or teardown of each stage is recorded in a :class:`~.test_result.DTSResult` or
+one of its subclasses. The test case results are also recorded.
+
+If an error occurs, the current stage is aborted, the error is recorded and the run continues in
+the next iteration of the same stage. The return code is the highest `severity` of all
+:class:`~.exception.DTSError`\s.
+
+Example:
+ An error occurs in a build target setup. The current build target is aborted and the run
+ continues with the next build target. If the errored build target was the last one in the given
+ execution, the next execution begins.
+
+Attributes:
+ dts_logger: The logger instance used in this module.
+ result: The top level result used in the module.
+"""
+
import sys
from .config import (
@@ -23,9 +50,38 @@
def run_all() -> None:
- """
- The main process of DTS. Runs all build targets in all executions from the main
- config file.
+ """Run all build targets in all executions from the test run configuration.
+
+ Before running test suites, executions and build targets are first set up.
+ The executions and build targets defined in the test run configuration are iterated over.
+ The executions define which tests to run and where to run them and build targets define
+ the DPDK build setup.
+
+ The test suites are set up for each execution/build target tuple and each scheduled
+ test case within the test suite is set up, executed and torn down. After all test cases
+ have been executed, the test suite is torn down and the next build target will be tested.
+
+ All the nested steps look like this:
+
+ #. Execution setup
+
+ #. Build target setup
+
+ #. Test suite setup
+
+ #. Test case setup
+ #. Test case logic
+ #. Test case teardown
+
+ #. Test suite teardown
+
+ #. Build target teardown
+
+ #. Execution teardown
+
+ The test cases are filtered according to the specification in the test run configuration and
+ the :option:`--test-cases` command line argument or
+ the :envvar:`DTS_TESTCASES` environment variable.
"""
global dts_logger
global result
@@ -87,6 +143,8 @@ def run_all() -> None:
def _check_dts_python_version() -> None:
+ """Check the required Python version - v3.10."""
+
def RED(text: str) -> str:
return f"\u001B[31;1m{str(text)}\u001B[0m"
@@ -109,9 +167,16 @@ def _run_execution(
execution: ExecutionConfiguration,
result: DTSResult,
) -> None:
- """
- Run the given execution. This involves running the execution setup as well as
- running all build targets in the given execution.
+ """Run the given execution.
+
+ This involves running the execution setup as well as running all build targets
+ in the given execution. After that, execution teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: An execution's test run configuration.
+ result: The top level result object.
"""
dts_logger.info(f"Running execution with SUT '{execution.system_under_test_node.name}'.")
execution_result = result.add_execution(sut_node.config)
@@ -144,8 +209,18 @@ def _run_build_target(
execution: ExecutionConfiguration,
execution_result: ExecutionResult,
) -> None:
- """
- Run the given build target.
+ """Run the given build target.
+
+ This involves running the build target setup as well as running all test suites
+ in the given execution the build target is defined in.
+ After that, build target teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ build_target: A build target's test run configuration.
+ execution: The build target's execution's test run configuration.
+ execution_result: The execution level result object associated with the execution.
"""
dts_logger.info(f"Running build target '{build_target.name}'.")
build_target_result = execution_result.add_build_target(build_target)
@@ -177,10 +252,20 @@ def _run_all_suites(
execution: ExecutionConfiguration,
build_target_result: BuildTargetResult,
) -> None:
- """
- Use the given build_target to run execution's test suites
- with possibly only a subset of test cases.
- If no subset is specified, run all test cases.
+ """Run the execution's (possibly a subset) test suites using the current build target.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
+
+ If a blocking test suite (such as the smoke test suite) fails, the rest of the test suites
+ in the current build target won't be executed.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
"""
end_build_target = False
if not execution.skip_smoke_tests:
@@ -206,16 +291,22 @@ def _run_single_suite(
build_target_result: BuildTargetResult,
test_suite_config: TestSuiteConfig,
) -> None:
- """Runs a single test suite.
+ """Run all test suites in a single test suite module.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
Args:
- sut_node: Node to run tests on.
- execution: Execution the test case belongs to.
- build_target_result: Build target configuration test case is run on
- test_suite_config: Test suite configuration
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
+ test_suite_config: Test suite test run configuration specifying the test suite module
+ and possibly a subset of test cases of test suites in that module.
Raises:
- BlockingTestSuiteError: If a test suite that was marked as blocking fails.
+ BlockingTestSuiteError: If a blocking test suite fails.
"""
try:
full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
@@ -239,9 +330,7 @@ def _run_single_suite(
def _exit_dts() -> None:
- """
- Process all errors and exit with the proper exit code.
- """
+ """Process all errors and exit with the proper exit code."""
result.process()
if dts_logger:
diff --git a/dts/main.py b/dts/main.py
index 5d4714b0c3..b856ba86be 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -1,12 +1,10 @@
#!/usr/bin/env python3
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
-"""
-A test framework for testing DPDK.
-"""
+"""The DTS executable."""
import logging
@@ -17,6 +15,10 @@ def main() -> None:
"""Set DTS settings, then run DTS.
The DTS settings are taken from the command line arguments and the environment variables.
+ The settings object is stored in the module-level variable settings.SETTINGS which the entire
+ framework uses. After importing the module (or the variable), any changes to the variable are
+ not going to be reflected without a re-import. This means that the SETTINGS variable must
+ be modified before the settings module is imported anywhere else in the framework.
"""
settings.SETTINGS = settings.get_settings()
from framework import dts
--
2.34.1
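[Editor's note: the run_all docstring above states that the return code is the highest severity of all recorded DTSErrors. That reduction can be sketched as follows; the severity values here are an illustrative subset, not the framework's actual set:]

```python
from enum import IntEnum


class ErrorSeverity(IntEnum):
    """An illustrative subset of severity values; the framework defines more."""

    NO_ERR = 0
    GENERIC_ERR = 1
    DPDK_BUILD_ERR = 10


class DTSError(Exception):
    severity = ErrorSeverity.GENERIC_ERR


class DPDKBuildError(DTSError):
    severity = ErrorSeverity.DPDK_BUILD_ERR


def get_return_code(errors: list[Exception]) -> int:
    """Return the highest severity among recorded errors, NO_ERR (0) with no errors."""
    return_code = ErrorSeverity.NO_ERR
    for error in errors:
        # Non-DTS exceptions have no severity attribute; treat them as generic.
        severity = getattr(error, "severity", ErrorSeverity.GENERIC_ERR)
        if severity > return_code:
            return_code = severity
    return int(return_code)
```

Because every stage records its error instead of aborting the run, one pass over the collected errors at exit is enough to compute the process exit code.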
* [PATCH v8 08/21] dts: test suite docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (6 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 07/21] dts: dts runner and main " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 09/21] dts: test result " Juraj Linkeš
` (14 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_suite.py | 231 +++++++++++++++++++++++++++---------
1 file changed, 175 insertions(+), 56 deletions(-)
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index f9e66e814a..dfb391ffbd 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -2,8 +2,19 @@
# Copyright(c) 2010-2014 Intel Corporation
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Base class for creating DTS test cases.
+"""Features common to all test suites.
+
+The module defines the :class:`TestSuite` class which doesn't contain any test cases, and as such
+must be extended by subclasses which add test cases. The :class:`TestSuite` contains the basics
+needed by subclasses:
+
+ * Test suite and test case execution flow,
+ * Testbed (SUT, TG) configuration,
+ * Packet sending and verification,
+ * Test case verification.
+
+The module also defines a function, :func:`get_test_suites`,
+for gathering test suites from a Python module.
"""
import importlib
@@ -11,7 +22,7 @@
import re
from ipaddress import IPv4Interface, IPv6Interface, ip_interface
from types import MethodType
-from typing import Any, Union
+from typing import Any, ClassVar, Union
from scapy.layers.inet import IP # type: ignore[import]
from scapy.layers.l2 import Ether # type: ignore[import]
@@ -31,25 +42,44 @@
class TestSuite(object):
- """
- The base TestSuite class provides methods for handling basic flow of a test suite:
- * test case filtering and collection
- * test suite setup/cleanup
- * test setup/cleanup
- * test case execution
- * error handling and results storage
- Test cases are implemented by derived classes. Test cases are all methods
- starting with test_, further divided into performance test cases
- (starting with test_perf_) and functional test cases (all other test cases).
- By default, all test cases will be executed. A list of testcase str names
- may be specified in conf.yaml or on the command line
- to filter which test cases to run.
- The methods named [set_up|tear_down]_[suite|test_case] should be overridden
- in derived classes if the appropriate suite/test case fixtures are needed.
+ """The base class with methods for handling the basic flow of a test suite.
+
+ * Test case filtering and collection,
+ * Test suite setup/cleanup,
+ * Test setup/cleanup,
+ * Test case execution,
+ * Error handling and results storage.
+
+ Test cases are implemented by subclasses. Test cases are all methods starting with ``test_``,
+ further divided into performance test cases (starting with ``test_perf_``)
+ and functional test cases (all other test cases).
+
+ By default, all test cases will be executed. A list of testcase names may be specified
+ in the YAML test run configuration file and in the :option:`--test-cases` command line argument
+ or in the :envvar:`DTS_TESTCASES` environment variable to filter which test cases to run.
+ The union of both lists will be used. Any unknown test cases from the latter lists
+ will be silently ignored.
+
+ If the :option:`--re-run` command line argument or the :envvar:`DTS_RERUN` environment variable
+ is set, in case of a test case failure, the test case will be executed again until it passes
+ or it fails that many times in addition to the first failure.
+
+ The methods named ``[set_up|tear_down]_[suite|test_case]`` should be overridden in subclasses
+ if the appropriate test suite/test case fixtures are needed.
+
+ The test suite is aware of the testbed (the SUT and TG) it's running on. From this, it can
+ properly choose the IP addresses and other configuration that must be tailored to the testbed.
+
+ Attributes:
+ sut_node: The SUT node where the test suite is running.
+ tg_node: The TG node where the test suite is running.
"""
sut_node: SutNode
- is_blocking = False
+ tg_node: TGNode
+ #: Whether the test suite is blocking. A failure of a blocking test suite
+ #: will block the execution of all subsequent test suites in the current build target.
+ is_blocking: ClassVar[bool] = False
_logger: DTSLOG
_test_cases_to_run: list[str]
_func: bool
@@ -72,6 +102,20 @@ def __init__(
func: bool,
build_target_result: BuildTargetResult,
):
+ """Initialize the test suite testbed information and basic configuration.
+
+ Process what test cases to run, create the associated
+ :class:`~.test_result.TestSuiteResult`, find links between ports
+ and set up default IP addresses to be used when configuring them.
+
+ Args:
+ sut_node: The SUT node where the test suite will run.
+ tg_node: The TG node where the test suite will run.
+ test_cases: The list of test cases to execute.
+ If empty, all test cases will be executed.
+ func: Whether to run functional tests.
+ build_target_result: The build target result this test suite is run in.
+ """
self.sut_node = sut_node
self.tg_node = tg_node
self._logger = getLogger(self.__class__.__name__)
@@ -95,6 +139,7 @@ def __init__(
self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
def _process_links(self) -> None:
+ """Construct links between SUT and TG ports."""
for sut_port in self.sut_node.ports:
for tg_port in self.tg_node.ports:
if (sut_port.identifier, sut_port.peer) == (
@@ -104,27 +149,42 @@ def _process_links(self) -> None:
self._port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
def set_up_suite(self) -> None:
- """
- Set up test fixtures common to all test cases; this is done before
- any test case is run.
+ """Set up test fixtures common to all test cases.
+
+ This is done before any test case has been run.
"""
def tear_down_suite(self) -> None:
- """
- Tear down the previously created test fixtures common to all test cases.
+ """Tear down the previously created test fixtures common to all test cases.
+
+ This is done after all test cases have been run.
"""
def set_up_test_case(self) -> None:
- """
- Set up test fixtures before each test case.
+ """Set up test fixtures before each test case.
+
+ This is done before *each* test case.
"""
def tear_down_test_case(self) -> None:
- """
- Tear down the previously created test fixtures after each test case.
+ """Tear down the previously created test fixtures after each test case.
+
+ This is done after *each* test case.
"""
def configure_testbed_ipv4(self, restore: bool = False) -> None:
+ """Configure IPv4 addresses on all testbed ports.
+
+ The configured ports are:
+
+ * SUT ingress port,
+ * SUT egress port,
+ * TG ingress port,
+ * TG egress port.
+
+ Args:
+ restore: If :data:`True`, will remove the configuration instead.
+ """
delete = True if restore else False
enable = False if restore else True
self._configure_ipv4_forwarding(enable)
@@ -149,11 +209,17 @@ def _configure_ipv4_forwarding(self, enable: bool) -> None:
self.sut_node.configure_ipv4_forwarding(enable)
def send_packet_and_capture(self, packet: Packet, duration: float = 1) -> list[Packet]:
- """
- Send a packet through the appropriate interface and
- receive on the appropriate interface.
- Modify the packet with l3/l2 addresses corresponding
- to the testbed and desired traffic.
+ """Send and receive `packet` using the associated TG.
+
+ Send `packet` through the appropriate interface and receive on the appropriate interface.
+ Modify the packet with l3/l2 addresses corresponding to the testbed and desired traffic.
+
+ Args:
+ packet: The packet to send.
+ duration: Capture traffic for this amount of time after sending `packet`.
+
+ Returns:
+ A list of received packets.
"""
packet = self._adjust_addresses(packet)
return self.tg_node.send_packet_and_capture(
@@ -161,13 +227,26 @@ def send_packet_and_capture(self, packet: Packet, duration: float = 1) -> list[P
)
def get_expected_packet(self, packet: Packet) -> Packet:
+ """Inject the proper L2/L3 addresses into `packet`.
+
+ Args:
+ packet: The packet to modify.
+
+ Returns:
+ `packet` with injected L2/L3 addresses.
+ """
return self._adjust_addresses(packet, expected=True)
def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
- """
+ """L2 and L3 address additions in both directions.
+
Assumptions:
- Two links between SUT and TG, one link is TG -> SUT,
- the other SUT -> TG.
+ Two links between SUT and TG, one link is TG -> SUT, the other SUT -> TG.
+
+ Args:
+ packet: The packet to modify.
+ expected: If :data:`True`, the direction is SUT -> TG,
+ otherwise the direction is TG -> SUT.
"""
if expected:
# The packet enters the TG from SUT
@@ -193,6 +272,19 @@ def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
return Ether(packet.build())
def verify(self, condition: bool, failure_description: str) -> None:
+ """Verify `condition` and handle failures.
+
+ When `condition` is :data:`False`, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ condition: The condition to check.
+ failure_description: A short description of the failure
+ that will be stored in the raised exception.
+
+ Raises:
+ TestCaseVerifyError: `condition` is :data:`False`.
+ """
if not condition:
self._fail_test_case_verify(failure_description)
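The verify-then-raise pattern described above can be mimicked outside the framework. This is a minimal sketch (the real method additionally logs the last 10 SUT/TG commands on failure, which is omitted here):

```python
class TestCaseVerifyError(Exception):
    """Stand-in mirroring the framework's verification failure exception."""

def verify(condition: bool, failure_description: str) -> None:
    # On failure, the framework would also collect recent command history
    # from both nodes before raising; this sketch only raises.
    if not condition:
        raise TestCaseVerifyError(failure_description)

# Typical use inside a test case body:
received = ["pkt-a", "pkt-b"]
verify(len(received) > 0, "No packets were received.")
try:
    verify("pkt-c" in received, "Expected packet not among received packets.")
except TestCaseVerifyError as e:
    outcome = str(e)
```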
@@ -206,6 +298,19 @@ def _fail_test_case_verify(self, failure_description: str) -> None:
raise TestCaseVerifyError(failure_description)
def verify_packets(self, expected_packet: Packet, received_packets: list[Packet]) -> None:
+ """Verify that `expected_packet` has been received.
+
+ Go through `received_packets` and check that `expected_packet` is among them.
+ If not, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ expected_packet: The packet we're expecting to receive.
+ received_packets: The packets where we're looking for `expected_packet`.
+
+ Raises:
+ TestCaseVerifyError: `expected_packet` is not among `received_packets`.
+ """
for received_packet in received_packets:
if self._compare_packets(expected_packet, received_packet):
break
@@ -280,10 +385,14 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
return True
def run(self) -> None:
- """
- Setup, execute and teardown the whole suite.
- Suite execution consists of running all test cases scheduled to be executed.
- A test cast run consists of setup, execution and teardown of said test case.
+ """Set up, execute and tear down the whole suite.
+
+ Test suite execution consists of running all test cases scheduled to be executed.
+ A test case run consists of setup, execution and teardown of said test case.
+
+ Record the setup and the teardown and handle failures.
+
+ The list of scheduled test cases is constructed when creating the :class:`TestSuite` object.
"""
test_suite_name = self.__class__.__name__
@@ -315,9 +424,7 @@ def run(self) -> None:
raise BlockingTestSuiteError(test_suite_name)
def _execute_test_suite(self) -> None:
- """
- Execute all test cases scheduled to be executed in this suite.
- """
+ """Execute all test cases scheduled to be executed in this suite."""
if self._func:
for test_case_method in self._get_functional_test_cases():
test_case_name = test_case_method.__name__
@@ -334,14 +441,18 @@ def _execute_test_suite(self) -> None:
self._run_test_case(test_case_method, test_case_result)
def _get_functional_test_cases(self) -> list[MethodType]:
- """
- Get all functional test cases.
+ """Get all functional test cases defined in this TestSuite.
+
+ Returns:
+ The list of functional test cases of this TestSuite.
"""
return self._get_test_cases(r"test_(?!perf_)")
def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
- """
- Return a list of test cases matching test_case_regex.
+ """Return a list of test cases matching test_case_regex.
+
+ Returns:
+ The list of test cases matching test_case_regex of this TestSuite.
"""
self._logger.debug(f"Searching for test cases in {self.__class__.__name__}.")
filtered_test_cases = []
@@ -353,9 +464,7 @@ def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
return filtered_test_cases
def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool:
- """
- Check whether the test case should be executed.
- """
+ """Check whether the test case should be scheduled to be executed."""
match = bool(re.match(test_case_regex, test_case_name))
if self._test_cases_to_run:
return match and test_case_name in self._test_cases_to_run
@@ -365,9 +474,9 @@ def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool
def _run_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Setup, execute and teardown a test case in this suite.
- Exceptions are caught and recorded in logs and results.
+ """Setup, execute and teardown a test case in this suite.
+
+ Record the result of the setup and the teardown and handle failures.
"""
test_case_name = test_case_method.__name__
@@ -402,9 +511,7 @@ def _run_test_case(
def _execute_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Execute one test case and handle failures.
- """
+ """Execute one test case, record the result and handle failures."""
test_case_name = test_case_method.__name__
try:
self._logger.info(f"Starting test case execution: {test_case_name}")
@@ -425,6 +532,18 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
+ r"""Find all :class:`TestSuite`\s in a Python module.
+
+ Args:
+ testsuite_module_path: The path to the Python module.
+
+ Returns:
+ The list of :class:`TestSuite`\s found within the Python module.
+
+ Raises:
+ ConfigurationError: The test suite module was not found.
+ """
+
def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
--
2.34.1
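The fixture hooks and run order described in this patch can be illustrated with a stripped-down stand-in. `MiniSuite` and `ExampleSuite` below are illustrative only (the real `TestSuite.run()` also records results and handles failures):

```python
class MiniSuite:
    """A stand-in for TestSuite showing the fixture call order."""

    def __init__(self) -> None:
        self.calls: list[str] = []

    # The ``[set_up|tear_down]_[suite|test_case]`` hooks are no-ops by default,
    # to be overridden by subclasses that need fixtures.
    def set_up_suite(self) -> None: ...
    def tear_down_suite(self) -> None: ...
    def set_up_test_case(self) -> None: ...
    def tear_down_test_case(self) -> None: ...

    def run(self) -> None:
        # Suite setup once, then per-test-case setup/teardown around each case.
        self.set_up_suite()
        for name in (m for m in dir(self) if m.startswith("test_")):
            self.set_up_test_case()
            getattr(self, name)()
            self.tear_down_test_case()
        self.tear_down_suite()

class ExampleSuite(MiniSuite):
    def set_up_suite(self) -> None:
        self.calls.append("set_up_suite")

    def set_up_test_case(self) -> None:
        self.calls.append("set_up_test_case")

    def test_one(self) -> None:
        self.calls.append("test_one")

    def tear_down_test_case(self) -> None:
        self.calls.append("tear_down_test_case")

    def tear_down_suite(self) -> None:
        self.calls.append("tear_down_suite")

suite = ExampleSuite()
suite.run()
```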
* [PATCH v8 09/21] dts: test result docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (7 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 08/21] dts: test suite " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 10/21] dts: config " Juraj Linkeš
` (13 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_result.py | 297 ++++++++++++++++++++++++++++-------
1 file changed, 239 insertions(+), 58 deletions(-)
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 57090feb04..4467749a9d 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -2,8 +2,25 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-Generic result container and reporters
+r"""Record and process DTS results.
+
+The results are recorded in a hierarchical manner:
+
+ * :class:`DTSResult` contains
+ * :class:`ExecutionResult` contains
+ * :class:`BuildTargetResult` contains
+ * :class:`TestSuiteResult` contains
+ * :class:`TestCaseResult`
+
+Each result may contain multiple lower level results, e.g. there are multiple
+:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
+The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
+which also defines some common behaviors in its methods.
+
+Each result class has its own idiosyncrasies which it implements in overridden methods.
+
+The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+variable modify the directory where the files with results will be stored.
"""
import os.path
@@ -26,26 +43,34 @@
class Result(Enum):
- """
- An Enum defining the possible states that
- a setup, a teardown or a test case may end up in.
- """
+ """The possible states that a setup, a teardown or a test case may end up in."""
+ #:
PASS = auto()
+ #:
FAIL = auto()
+ #:
ERROR = auto()
+ #:
SKIP = auto()
def __bool__(self) -> bool:
+ """Only PASS is True."""
return self is self.PASS
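The `__bool__` override means only `PASS` is truthy, which lets callers write `if result:` instead of comparing states explicitly. A standalone sketch mirroring the enum from the patch:

```python
from enum import Enum, auto

class Result(Enum):
    """Mirrors the patch: the possible end states of a setup/teardown/test case."""

    PASS = auto()
    FAIL = auto()
    ERROR = auto()
    SKIP = auto()

    def __bool__(self) -> bool:
        # Only PASS is True; FAIL, ERROR and SKIP are all falsy.
        return self is Result.PASS

truthy_states = [r for r in Result if r]
```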
class FixtureResult(object):
- """
- A record that stored the result of a setup or a teardown.
- The default is FAIL because immediately after creating the object
- the setup of the corresponding stage will be executed, which also guarantees
- the execution of teardown.
+ """A record that stores the result of a setup or a teardown.
+
+ :attr:`~Result.FAIL` is a sensible default since it prevents false positives (which could happen
+ if the default was :attr:`~Result.PASS`).
+
+ Preventing false positives or other false results is preferable since a failure
+ is most likely to be investigated (the other false results may not be investigated at all).
+
+ Attributes:
+ result: The associated result.
+ error: The error in case of a failure.
"""
result: Result
@@ -56,21 +81,37 @@ def __init__(
result: Result = Result.FAIL,
error: Exception | None = None,
):
+ """Initialize the constructor with the fixture result and store a possible error.
+
+ Args:
+ result: The result to store.
+ error: The error which happened when a failure occurred.
+ """
self.result = result
self.error = error
def __bool__(self) -> bool:
+ """A wrapper around the stored :class:`Result`."""
return bool(self.result)
class Statistics(dict):
- """
- A helper class used to store the number of test cases by its result
- along a few other basic information.
- Using a dict provides a convenient way to format the data.
+ """How many test cases ended in which result state along some other basic information.
+
+ Subclassing :class:`dict` provides a convenient way to format the data.
+
+ The data are stored in the following keys:
+
+ * **PASS RATE** (:class:`int`) -- The FAIL/PASS ratio of all test cases.
+ * **DPDK VERSION** (:class:`str`) -- The tested DPDK version.
"""
def __init__(self, dpdk_version: str | None):
+ """Extend the constructor with keys in which the data are stored.
+
+ Args:
+ dpdk_version: The version of tested DPDK.
+ """
super(Statistics, self).__init__()
for result in Result:
self[result.name] = 0
@@ -78,8 +119,17 @@ def __init__(self, dpdk_version: str | None):
self["DPDK VERSION"] = dpdk_version
def __iadd__(self, other: Result) -> "Statistics":
- """
- Add a Result to the final count.
+ """Add a Result to the final count.
+
+ Example:
+ stats: Statistics = Statistics() # empty Statistics
+ stats += Result.PASS # add a Result to `stats`
+
+ Args:
+ other: The Result to add to this statistics object.
+
+ Returns:
+ The modified statistics object.
"""
self[other.name] += 1
self["PASS RATE"] = (
@@ -88,9 +138,7 @@ def __iadd__(self, other: Result) -> "Statistics":
return self
def __str__(self) -> str:
- """
- Provide a string representation of the data.
- """
+ """Each line contains the formatted key = value pair."""
stats_str = ""
for key, value in self.items():
stats_str += f"{key:<12} = {value}\n"
@@ -100,10 +148,16 @@ def __str__(self) -> str:
class BaseResult(object):
- """
- The Base class for all results. Stores the results of
- the setup and teardown portions of the corresponding stage
- and a list of results from each inner stage in _inner_results.
+ """Common data and behavior of DTS results.
+
+ Stores the results of the setup and teardown portions of the corresponding stage.
+ The hierarchical nature of DTS results is captured recursively in an internal list.
+ A stage is each level in this particular hierarchy (pre-execution or the top-most level,
+ execution, build target, test suite and test case.)
+
+ Attributes:
+ setup_result: The result of the setup of the particular stage.
+ teardown_result: The results of the teardown of the particular stage.
"""
setup_result: FixtureResult
@@ -111,15 +165,28 @@ class BaseResult(object):
_inner_results: MutableSequence["BaseResult"]
def __init__(self):
+ """Initialize the constructor."""
self.setup_result = FixtureResult()
self.teardown_result = FixtureResult()
self._inner_results = []
def update_setup(self, result: Result, error: Exception | None = None) -> None:
+ """Store the setup result.
+
+ Args:
+ result: The result of the setup.
+ error: The error that occurred in case of a failure.
+ """
self.setup_result.result = result
self.setup_result.error = error
def update_teardown(self, result: Result, error: Exception | None = None) -> None:
+ """Store the teardown result.
+
+ Args:
+ result: The result of the teardown.
+ error: The error that occurred in case of a failure.
+ """
self.teardown_result.result = result
self.teardown_result.error = error
@@ -137,27 +204,55 @@ def _get_inner_errors(self) -> list[Exception]:
]
def get_errors(self) -> list[Exception]:
+ """Compile errors from the whole result hierarchy.
+
+ Returns:
+ The errors from setup, teardown and all errors found in the whole result hierarchy.
+ """
return self._get_setup_teardown_errors() + self._get_inner_errors()
def add_stats(self, statistics: Statistics) -> None:
+ """Collate stats from the whole result hierarchy.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be collated.
+ """
for inner_result in self._inner_results:
inner_result.add_stats(statistics)
class TestCaseResult(BaseResult, FixtureResult):
- """
- The test case specific result.
- Stores the result of the actual test case.
- Also stores the test case name.
+ r"""The test case specific result.
+
+ Stores the result of the actual test case. This is done by adding an extra superclass
+ in :class:`FixtureResult`. The setup and teardown results are :class:`FixtureResult`\s and
+ the class is itself a record of the test case.
+
+ Attributes:
+ test_case_name: The test case name.
"""
test_case_name: str
def __init__(self, test_case_name: str):
+ """Extend the constructor with `test_case_name`.
+
+ Args:
+ test_case_name: The test case's name.
+ """
super(TestCaseResult, self).__init__()
self.test_case_name = test_case_name
def update(self, result: Result, error: Exception | None = None) -> None:
+ """Update the test case result.
+
+ This updates the result of the test case itself and doesn't affect
+ the results of the setup and teardown steps in any way.
+
+ Args:
+ result: The result of the test case.
+ error: The error that occurred in case of a failure.
+ """
self.result = result
self.error = error
@@ -167,36 +262,64 @@ def _get_inner_errors(self) -> list[Exception]:
return []
def add_stats(self, statistics: Statistics) -> None:
+ r"""Add the test case result to statistics.
+
+ The base method goes through the hierarchy recursively and this method is here to stop
+ the recursion, as the :class:`TestCaseResult`\s are the leaves of the hierarchy tree.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be added.
+ """
statistics += self.result
def __bool__(self) -> bool:
+ """The test case passed only if setup, teardown and the test case itself passed."""
return bool(self.setup_result) and bool(self.teardown_result) and bool(self.result)
class TestSuiteResult(BaseResult):
- """
- The test suite specific result.
- The _inner_results list stores results of test cases in a given test suite.
- Also stores the test suite name.
+ """The test suite specific result.
+
+ The internal list stores the results of all test cases in a given test suite.
+
+ Attributes:
+ suite_name: The test suite name.
"""
suite_name: str
def __init__(self, suite_name: str):
+ """Extend the constructor with `suite_name`.
+
+ Args:
+ suite_name: The test suite's name.
+ """
super(TestSuiteResult, self).__init__()
self.suite_name = suite_name
def add_test_case(self, test_case_name: str) -> TestCaseResult:
+ """Add and return the inner result (test case).
+
+ Returns:
+ The test case's result.
+ """
test_case_result = TestCaseResult(test_case_name)
self._inner_results.append(test_case_result)
return test_case_result
class BuildTargetResult(BaseResult):
- """
- The build target specific result.
- The _inner_results list stores results of test suites in a given build target.
- Also stores build target specifics, such as compiler used to build DPDK.
+ """The build target specific result.
+
+ The internal list stores the results of all test suites in a given build target.
+
+ Attributes:
+ arch: The DPDK build target architecture.
+ os: The DPDK build target operating system.
+ cpu: The DPDK build target CPU.
+ compiler: The DPDK build target compiler.
+ compiler_version: The DPDK build target compiler version.
+ dpdk_version: The built DPDK version.
"""
arch: Architecture
@@ -207,6 +330,11 @@ class BuildTargetResult(BaseResult):
dpdk_version: str | None
def __init__(self, build_target: BuildTargetConfiguration):
+ """Extend the constructor with the `build_target`'s build target config.
+
+ Args:
+ build_target: The build target's test run configuration.
+ """
super(BuildTargetResult, self).__init__()
self.arch = build_target.arch
self.os = build_target.os
@@ -216,20 +344,35 @@ def __init__(self, build_target: BuildTargetConfiguration):
self.dpdk_version = None
def add_build_target_info(self, versions: BuildTargetInfo) -> None:
+ """Add information about the build target gathered at runtime.
+
+ Args:
+ versions: The additional information.
+ """
self.compiler_version = versions.compiler_version
self.dpdk_version = versions.dpdk_version
def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
+ """Add and return the inner result (test suite).
+
+ Returns:
+ The test suite's result.
+ """
test_suite_result = TestSuiteResult(test_suite_name)
self._inner_results.append(test_suite_result)
return test_suite_result
class ExecutionResult(BaseResult):
- """
- The execution specific result.
- The _inner_results list stores results of build targets in a given execution.
- Also stores the SUT node configuration.
+ """The execution specific result.
+
+ The internal list stores the results of all build targets in a given execution.
+
+ Attributes:
+ sut_node: The SUT node used in the execution.
+ sut_os_name: The operating system of the SUT node.
+ sut_os_version: The operating system version of the SUT node.
+ sut_kernel_version: The operating system kernel version of the SUT node.
"""
sut_node: NodeConfiguration
@@ -238,34 +381,53 @@ class ExecutionResult(BaseResult):
sut_kernel_version: str
def __init__(self, sut_node: NodeConfiguration):
+ """Extend the constructor with the `sut_node`'s config.
+
+ Args:
+ sut_node: The SUT node's test run configuration used in the execution.
+ """
super(ExecutionResult, self).__init__()
self.sut_node = sut_node
def add_build_target(self, build_target: BuildTargetConfiguration) -> BuildTargetResult:
+ """Add and return the inner result (build target).
+
+ Args:
+ build_target: The build target's test run configuration.
+
+ Returns:
+ The build target's result.
+ """
build_target_result = BuildTargetResult(build_target)
self._inner_results.append(build_target_result)
return build_target_result
def add_sut_info(self, sut_info: NodeInfo) -> None:
+ """Add SUT information gathered at runtime.
+
+ Args:
+ sut_info: The additional SUT node information.
+ """
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
class DTSResult(BaseResult):
- """
- Stores environment information and test results from a DTS run, which are:
- * Execution level information, such as SUT and TG hardware.
- * Build target level information, such as compiler, target OS and cpu.
- * Test suite results.
- * All errors that are caught and recorded during DTS execution.
+ """Stores environment information and test results from a DTS run.
- The information is stored in nested objects.
+ * Execution level information, such as testbed and the test suite list,
+ * Build target level information, such as compiler, target OS and cpu,
+ * Test suite and test case results,
+ * All errors that are caught and recorded during DTS execution.
- The class is capable of computing the return code used to exit DTS with
- from the stored error.
+ The information is stored hierarchically. This is the first level of the hierarchy
+ and as such is where the data from the whole hierarchy is collated or processed.
- It also provides a brief statistical summary of passed/failed test cases.
+ The internal list stores the results of all executions.
+
+ Attributes:
+ dpdk_version: The DPDK version to record.
"""
dpdk_version: str | None
@@ -276,6 +438,11 @@ class DTSResult(BaseResult):
_stats_filename: str
def __init__(self, logger: DTSLOG):
+ """Extend the constructor with top-level specifics.
+
+ Args:
+ logger: The logger instance the whole result will use.
+ """
super(DTSResult, self).__init__()
self.dpdk_version = None
self._logger = logger
@@ -285,21 +452,33 @@ def __init__(self, logger: DTSLOG):
self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
+ """Add and return the inner result (execution).
+
+ Args:
+ sut_node: The SUT node's test run configuration.
+
+ Returns:
+ The execution's result.
+ """
execution_result = ExecutionResult(sut_node)
self._inner_results.append(execution_result)
return execution_result
def add_error(self, error: Exception) -> None:
+ """Record an error that occurred outside any execution.
+
+ Args:
+ error: The exception to record.
+ """
self._errors.append(error)
def process(self) -> None:
- """
- Process the data after a DTS run.
- The data is added to nested objects during runtime and this parent object
- is not updated at that time. This requires us to process the nested data
- after it's all been gathered.
+ """Process the data after a whole DTS run.
+
+ The data is added to inner objects during runtime and this object is not updated
+ at that time. This requires us to process the inner data after it's all been gathered.
- The processing gathers all errors and the result statistics of test cases.
+ The processing gathers all errors and the statistics of test case results.
"""
self._errors += self.get_errors()
if self._errors and self._logger:
@@ -313,8 +492,10 @@ def process(self) -> None:
stats_file.write(str(self._stats_result))
def get_return_code(self) -> int:
- """
- Go through all stored Exceptions and return the highest error code found.
+ """Go through all stored Exceptions and return the final DTS error code.
+
+ Returns:
+ The highest error code found.
"""
for error in self._errors:
error_return_code = ErrorSeverity.GENERIC_ERR
--
2.34.1
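The `add_execution`/`add_build_target`/`add_test_suite`/`add_test_case` methods described in this patch build the result tree level by level. A standalone sketch of that call pattern, with the configuration arguments omitted for brevity (the real constructors take node and build target configs):

```python
class BaseResult:
    """Stand-in for the framework's BaseResult: one node in the result tree."""

    def __init__(self) -> None:
        self._inner_results: list["BaseResult"] = []

    def _add_inner(self, result: "BaseResult") -> "BaseResult":
        self._inner_results.append(result)
        return result

    def count_test_cases(self) -> int:
        # Recurse down the hierarchy; leaves are test case results.
        if not self._inner_results:
            return 1
        return sum(inner.count_test_cases() for inner in self._inner_results)

class TestCaseResult(BaseResult): ...

class TestSuiteResult(BaseResult):
    def add_test_case(self) -> BaseResult:
        return self._add_inner(TestCaseResult())

class BuildTargetResult(BaseResult):
    def add_test_suite(self) -> BaseResult:
        return self._add_inner(TestSuiteResult())

class ExecutionResult(BaseResult):
    def add_build_target(self) -> BaseResult:
        return self._add_inner(BuildTargetResult())

class DTSResult(BaseResult):
    def add_execution(self) -> BaseResult:
        return self._add_inner(ExecutionResult())

# DTSResult -> ExecutionResult -> BuildTargetResult -> TestSuiteResult -> TestCaseResult
dts = DTSResult()
execution = dts.add_execution()
build_target = execution.add_build_target()
suite = build_target.add_test_suite()
suite.add_test_case()
suite.add_test_case()
```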
* [PATCH v8 10/21] dts: config docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (8 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 09/21] dts: test result " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 11/21] dts: remote session " Juraj Linkeš
` (12 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 369 ++++++++++++++++++++++++++-----
dts/framework/config/types.py | 132 +++++++++++
2 files changed, 444 insertions(+), 57 deletions(-)
create mode 100644 dts/framework/config/types.py
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index ef25a463c0..62eded7f04 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -3,8 +3,34 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Yaml config parsing methods
+"""Testbed configuration and test suite specification.
+
+This package offers classes that hold real-time information about the testbed, hold test run
+configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
+the YAML test run configuration file
+and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+
+The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
+this package. The allowed keys and types inside this dictionary are defined in
+the :doc:`types <framework.config.types>` module.
+
+The test run configuration has two main sections:
+
+ * The :class:`ExecutionConfiguration` which defines what tests are going to be run
+ and how DPDK will be built. It also references the testbed where these tests and DPDK
+ are going to be run,
+ * The nodes of the testbed are defined in the other section,
+ a :class:`list` of :class:`NodeConfiguration` objects.
+
+The real-time information about the testbed is supposed to be gathered at runtime.
+
+The classes defined in this package make heavy use of :mod:`dataclasses`.
+All of them use slots and are frozen:
+
+ * Slots enable some optimizations, by pre-allocating space for the defined
+ attributes in the underlying data structure,
+ * Frozen makes the object immutable. This enables further optimizations,
+ and makes it thread safe should we ever want to move in that direction.
"""
import json
@@ -12,11 +38,20 @@
import pathlib
from dataclasses import dataclass
from enum import auto, unique
-from typing import Any, TypedDict, Union
+from typing import Union
import warlock # type: ignore[import]
import yaml
+from framework.config.types import (
+ BuildTargetConfigDict,
+ ConfigurationDict,
+ ExecutionConfigDict,
+ NodeConfigDict,
+ PortConfigDict,
+ TestSuiteConfigDict,
+ TrafficGeneratorConfigDict,
+)
from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -24,55 +59,97 @@
@unique
class Architecture(StrEnum):
+ r"""The supported architectures of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
i686 = auto()
+ #:
x86_64 = auto()
+ #:
x86_32 = auto()
+ #:
arm64 = auto()
+ #:
ppc64le = auto()
@unique
class OS(StrEnum):
+ r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
linux = auto()
+ #:
freebsd = auto()
+ #:
windows = auto()
@unique
class CPUType(StrEnum):
+ r"""The supported CPUs of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
native = auto()
+ #:
armv8a = auto()
+ #:
dpaa2 = auto()
+ #:
thunderx = auto()
+ #:
xgene1 = auto()
@unique
class Compiler(StrEnum):
+ r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
gcc = auto()
+ #:
clang = auto()
+ #:
icc = auto()
+ #:
msvc = auto()
@unique
class TrafficGeneratorType(StrEnum):
+ """The supported traffic generators."""
+
+ #:
SCAPY = auto()
-# Slots enables some optimizations, by pre-allocating space for the defined
-# attributes in the underlying data structure.
-#
-# Frozen makes the object immutable. This enables further optimizations,
-# and makes it thread safe should we every want to move in that direction.
@dataclass(slots=True, frozen=True)
class HugepageConfiguration:
+ r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ amount: The number of hugepages.
+ force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
+ """
+
amount: int
force_first_numa: bool
@dataclass(slots=True, frozen=True)
class PortConfig:
+ r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ node: The :class:`~framework.testbed_model.node.Node` where this port exists.
+ pci: The PCI address of the port.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK.
+ os_driver: The operating system driver name when the operating system controls the port.
+ peer_node: The :class:`~framework.testbed_model.node.Node` of the port
+ connected to this port.
+ peer_pci: The PCI address of the port connected to this port.
+ """
+
node: str
pci: str
os_driver_for_dpdk: str
@@ -81,18 +158,44 @@ class PortConfig:
peer_pci: str
@staticmethod
- def from_dict(node: str, d: dict) -> "PortConfig":
+ def from_dict(node: str, d: PortConfigDict) -> "PortConfig":
+ """A convenience method that creates the object from fewer inputs.
+
+ Args:
+ node: The node where this port exists.
+ d: The configuration dictionary.
+
+ Returns:
+ The port configuration instance.
+ """
return PortConfig(node=node, **d)
@dataclass(slots=True, frozen=True)
class TrafficGeneratorConfig:
+ """The configuration of traffic generators.
+
+ The class will be expanded when more configuration is needed.
+
+ Attributes:
+ traffic_generator_type: The type of the traffic generator.
+ """
+
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
- # This looks useless now, but is designed to allow expansion to traffic
- # generators that require more configuration later.
+ def from_dict(d: TrafficGeneratorConfigDict) -> "ScapyTrafficGeneratorConfig":
+ """A convenience method that produces traffic generator config of the proper type.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The traffic generator configuration instance.
+
+ Raises:
+ ConfigurationError: An unknown traffic generator type was encountered.
+ """
match TrafficGeneratorType(d["type"]):
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGeneratorConfig(
@@ -104,11 +207,31 @@ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
@dataclass(slots=True, frozen=True)
class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+ """Scapy traffic generator specific configuration."""
+
pass
@dataclass(slots=True, frozen=True)
class NodeConfiguration:
+ r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ name: The name of the :class:`~framework.testbed_model.node.Node`.
+ hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
+ Can be an IP or a domain name.
+ user: The name of the user used to connect to
+ the :class:`~framework.testbed_model.node.Node`.
+ password: The password of the user. The use of passwords is heavily discouraged.
+ Please use keys instead.
+ arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
+ os: The operating system of the :class:`~framework.testbed_model.node.Node`.
+ lcores: A comma delimited list of logical cores to use when running DPDK.
+ use_first_core: If :data:`True`, the first logical core will be used; by default it's excluded.
+ hugepages: An optional hugepage configuration.
+ ports: The ports that can be used in testing.
+ """
+
name: str
hostname: str
user: str
@@ -121,55 +244,89 @@ class NodeConfiguration:
ports: list[PortConfig]
@staticmethod
- def from_dict(d: dict) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
- hugepage_config = d.get("hugepages")
- if hugepage_config:
- if "force_first_numa" not in hugepage_config:
- hugepage_config["force_first_numa"] = False
- hugepage_config = HugepageConfiguration(**hugepage_config)
-
- common_config = {
- "name": d["name"],
- "hostname": d["hostname"],
- "user": d["user"],
- "password": d.get("password"),
- "arch": Architecture(d["arch"]),
- "os": OS(d["os"]),
- "lcores": d.get("lcores", "1"),
- "use_first_core": d.get("use_first_core", False),
- "hugepages": hugepage_config,
- "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
- }
-
+ def from_dict(
+ d: NodeConfigDict,
+ ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
+ """A convenience method that processes the inputs before creating a specialized instance.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ Either an SUT or TG configuration instance.
+ """
+ hugepage_config = None
+ if "hugepages" in d:
+ hugepage_config_dict = d["hugepages"]
+ if "force_first_numa" not in hugepage_config_dict:
+ hugepage_config_dict["force_first_numa"] = False
+ hugepage_config = HugepageConfiguration(**hugepage_config_dict)
+
+ # The calls below contain duplicated code, which is needed because Mypy doesn't
+ # properly support dictionary unpacking with TypedDicts
if "traffic_generator" in d:
return TGNodeConfiguration(
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
- **common_config,
)
else:
return SutNodeConfiguration(
- memory_channels=d.get("memory_channels", 1), **common_config
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
+ memory_channels=d.get("memory_channels", 1),
)
@dataclass(slots=True, frozen=True)
class SutNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+
+ Attributes:
+ memory_channels: The number of memory channels to use when running DPDK.
+ """
+
memory_channels: int
@dataclass(slots=True, frozen=True)
class TGNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
+
+ Attributes:
+ traffic_generator: The configuration of the traffic generator present on the TG node.
+ """
+
traffic_generator: ScapyTrafficGeneratorConfig
@dataclass(slots=True, frozen=True)
class NodeInfo:
- """Class to hold important versions within the node.
-
- This class, unlike the NodeConfiguration class, cannot be generated at the start.
- This is because we need to initialize a connection with the node before we can
- collect the information needed in this class. Therefore, it cannot be a part of
- the configuration class above.
+ """Supplemental node information.
+
+ Attributes:
+ os_name: The name of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ os_version: The version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ kernel_version: The kernel version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
"""
os_name: str
@@ -179,6 +336,20 @@ class NodeInfo:
@dataclass(slots=True, frozen=True)
class BuildTargetConfiguration:
+ """DPDK build configuration.
+
+ The configuration used for building DPDK.
+
+ Attributes:
+ arch: The target architecture to build for.
+ os: The target OS to build for.
+ cpu: The target CPU to build for.
+ compiler: The compiler executable to use.
+ compiler_wrapper: This string will be put in front of the compiler when
+ executing the build. Useful for adding wrapper commands, such as ``ccache``.
+ name: The name of the compiler.
+ """
+
arch: Architecture
os: OS
cpu: CPUType
@@ -187,7 +358,18 @@ class BuildTargetConfiguration:
name: str
@staticmethod
- def from_dict(d: dict) -> "BuildTargetConfiguration":
+ def from_dict(d: BuildTargetConfigDict) -> "BuildTargetConfiguration":
+ r"""A convenience method that processes the inputs before creating an instance.
+
+ `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
+ `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The build target configuration instance.
+ """
return BuildTargetConfiguration(
arch=Architecture(d["arch"]),
os=OS(d["os"]),
@@ -200,23 +382,29 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
@dataclass(slots=True, frozen=True)
class BuildTargetInfo:
- """Class to hold important versions within the build target.
+ """Various versions and other information about a build target.
- This is very similar to the NodeInfo class, it just instead holds information
- for the build target.
+ Attributes:
+ dpdk_version: The DPDK version that was built.
+ compiler_version: The version of the compiler used to build DPDK.
"""
dpdk_version: str
compiler_version: str
-class TestSuiteConfigDict(TypedDict):
- suite: str
- cases: list[str]
-
-
@dataclass(slots=True, frozen=True)
class TestSuiteConfig:
+ """Test suite configuration.
+
+ Information about a single test suite to be executed.
+
+ Attributes:
+ test_suite: The name of the test suite module without the starting ``TestSuite_``.
+ test_cases: The names of test cases from this test suite to execute.
+ If empty, all test cases will be executed.
+ """
+
test_suite: str
test_cases: list[str]
@@ -224,6 +412,14 @@ class TestSuiteConfig:
def from_dict(
entry: str | TestSuiteConfigDict,
) -> "TestSuiteConfig":
+ """Create an instance from one of two possible input formats.
+
+ Args:
+ entry: Either a suite name or a dictionary containing the config.
+
+ Returns:
+ The test suite configuration instance.
+ """
if isinstance(entry, str):
return TestSuiteConfig(test_suite=entry, test_cases=[])
elif isinstance(entry, dict):
@@ -234,19 +430,49 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class ExecutionConfiguration:
+ """The configuration of an execution.
+
+ The configuration contains testbed information, what tests to execute
+ and with what DPDK build.
+
+ Attributes:
+ build_targets: A list of DPDK builds to test.
+ perf: Whether to run performance tests.
+ func: Whether to run functional tests.
+ skip_smoke_tests: Whether to skip smoke tests.
+ test_suites: The names of test suites and/or test cases to execute.
+ system_under_test_node: The SUT node to use in this execution.
+ traffic_generator_node: The TG node to use in this execution.
+ vdevs: The names of virtual devices to test.
+ """
+
build_targets: list[BuildTargetConfiguration]
perf: bool
func: bool
+ skip_smoke_tests: bool
test_suites: list[TestSuiteConfig]
system_under_test_node: SutNodeConfiguration
traffic_generator_node: TGNodeConfiguration
vdevs: list[str]
- skip_smoke_tests: bool
@staticmethod
def from_dict(
- d: dict, node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]]
+ d: ExecutionConfigDict,
+ node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]],
) -> "ExecutionConfiguration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The build target and the test suite config are transformed into their respective objects.
+ SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
+ are just stored.
+
+ Args:
+ d: The configuration dictionary.
+ node_map: A dictionary mapping node names to their config objects.
+
+ Returns:
+ The execution configuration instance.
+ """
build_targets: list[BuildTargetConfiguration] = list(
map(BuildTargetConfiguration.from_dict, d["build_targets"])
)
@@ -283,10 +509,31 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class Configuration:
+ """DTS testbed and test configuration.
+
+ The node configuration is not stored in this object. Rather, all used node configurations
+ are stored inside the execution configuration where the nodes are actually used.
+
+ Attributes:
+ executions: Execution configurations.
+ """
+
executions: list[ExecutionConfiguration]
@staticmethod
- def from_dict(d: dict) -> "Configuration":
+ def from_dict(d: ConfigurationDict) -> "Configuration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ Build target and test suite configs are transformed into their respective objects.
+ The node configs are processed into a map of node names to their objects, which is
+ then used to create each execution configuration.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The whole configuration instance.
+ """
nodes: list[Union[SutNodeConfiguration | TGNodeConfiguration]] = list(
map(NodeConfiguration.from_dict, d["nodes"])
)
@@ -303,9 +550,17 @@ def from_dict(d: dict) -> "Configuration":
def load_config() -> Configuration:
- """
- Loads the configuration file and the configuration file schema,
- validates the configuration file, and creates a configuration object.
+ """Load DTS test run configuration from a file.
+
+ Load the YAML test run configuration file
+ and :download:`the configuration file schema <conf_yaml_schema.json>`,
+ validate the test run configuration file, and create a test run configuration object.
+
+ The YAML test run configuration file is specified in the :option:`--config-file` command line
+ argument or the :envvar:`DTS_CFG_FILE` environment variable.
+
+ Returns:
+ The parsed test run configuration.
"""
with open(SETTINGS.config_file_path, "r") as f:
config_data = yaml.safe_load(f)
@@ -314,6 +569,6 @@ def load_config() -> Configuration:
with open(schema_path, "r") as f:
schema = json.load(f)
- config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
- config_obj: Configuration = Configuration.from_dict(dict(config))
+ config = warlock.model_factory(schema, name="_Config")(config_data)
+ config_obj: Configuration = Configuration.from_dict(dict(config)) # type: ignore[arg-type]
return config_obj
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
new file mode 100644
index 0000000000..1927910d88
--- /dev/null
+++ b/dts/framework/config/types.py
@@ -0,0 +1,132 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+"""Configuration dictionary contents specification.
+
+These type definitions serve as documentation of the configuration dictionary contents.
+
+The definitions use the built-in :class:`~typing.TypedDict` construct.
+"""
+
+from typing import TypedDict
+
+
+class PortConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ pci: str
+ #:
+ os_driver_for_dpdk: str
+ #:
+ os_driver: str
+ #:
+ peer_node: str
+ #:
+ peer_pci: str
+
+
+class TrafficGeneratorConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ type: str
+
+
+class HugepageConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ amount: int
+ #:
+ force_first_numa: bool
+
+
+class NodeConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ hugepages: HugepageConfigurationDict
+ #:
+ name: str
+ #:
+ hostname: str
+ #:
+ user: str
+ #:
+ password: str
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ lcores: str
+ #:
+ use_first_core: bool
+ #:
+ ports: list[PortConfigDict]
+ #:
+ memory_channels: int
+ #:
+ traffic_generator: TrafficGeneratorConfigDict
+
+
+class BuildTargetConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ cpu: str
+ #:
+ compiler: str
+ #:
+ compiler_wrapper: str
+
+
+class TestSuiteConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ suite: str
+ #:
+ cases: list[str]
+
+
+class ExecutionSUTConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ node_name: str
+ #:
+ vdevs: list[str]
+
+
+class ExecutionConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ build_targets: list[BuildTargetConfigDict]
+ #:
+ perf: bool
+ #:
+ func: bool
+ #:
+ skip_smoke_tests: bool
+ #:
+ test_suites: list[TestSuiteConfigDict]
+ #:
+ system_under_test_node: ExecutionSUTConfigDict
+ #:
+ traffic_generator_node: str
+
+
+class ConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ nodes: list[NodeConfigDict]
+ #:
+ executions: list[ExecutionConfigDict]
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v8 11/21] dts: remote session docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (9 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 10/21] dts: config " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 12/21] dts: interactive " Juraj Linkeš
` (11 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/__init__.py | 39 +++++-
.../remote_session/remote_session.py | 130 +++++++++++++-----
dts/framework/remote_session/ssh_session.py | 16 +--
3 files changed, 137 insertions(+), 48 deletions(-)
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 5e7ddb2b05..51a01d6b5e 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -2,12 +2,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-The package provides modules for managing remote connections to a remote host (node),
-differentiated by OS.
-The package provides a factory function, create_session, that returns the appropriate
-remote connection based on the passed configuration. The differences are in the
-underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
+"""Remote interactive and non-interactive sessions.
+
+This package provides modules for managing remote connections to a remote host (node).
+
+The non-interactive sessions send commands and return their output and exit code.
+
+The interactive sessions open an interactive shell which is continuously open,
+allowing it to send and receive data within that particular shell.
"""
# pylama:ignore=W0611
@@ -26,10 +28,35 @@
def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> RemoteSession:
+ """Factory for non-interactive remote sessions.
+
+ The function returns an SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The SSH remote session.
+ """
return SSHSession(node_config, name, logger)
def create_interactive_session(
node_config: NodeConfiguration, logger: DTSLOG
) -> InteractiveRemoteSession:
+ """Factory for interactive remote sessions.
+
+ The function returns an interactive SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The interactive SSH remote session.
+ """
return InteractiveRemoteSession(node_config, logger)
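The two factories keep callers decoupled from the concrete session classes; adding another transport later only changes a factory body. A minimal sketch of the pattern (these placeholder classes stand in for the real `RemoteSession`/`SSHSession`, which take a node config and logger):

```python
class RemoteSession:
    """Placeholder base class standing in for the real session ABC."""

    def __init__(self, hostname: str, name: str) -> None:
        self.hostname = hostname
        self.name = name


class SSHSession(RemoteSession):
    """Stand-in for the real SSH implementation."""


def create_remote_session(hostname: str, name: str) -> RemoteSession:
    # Only an SSH implementation exists today; a different transport
    # would be selected here if support for one were added.
    return SSHSession(hostname, name)
```

Callers only depend on the `RemoteSession` interface returned by the factory, never on `SSHSession` directly.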
diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
index 719f7d1ef7..2059f9a981 100644
--- a/dts/framework/remote_session/remote_session.py
+++ b/dts/framework/remote_session/remote_session.py
@@ -3,6 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Base remote session.
+
+This module contains the abstract base class for remote sessions and defines
+the structure of the result of a command execution.
+"""
+
+
import dataclasses
from abc import ABC, abstractmethod
from pathlib import PurePath
@@ -15,8 +22,14 @@
@dataclasses.dataclass(slots=True, frozen=True)
class CommandResult:
- """
- The result of remote execution of a command.
+ """The result of remote execution of a command.
+
+ Attributes:
+ name: The name of the session that executed the command.
+ command: The executed command.
+ stdout: The standard output the command produced.
+ stderr: The standard error output the command produced.
+ return_code: The return code the command exited with.
"""
name: str
@@ -26,6 +39,7 @@ class CommandResult:
return_code: int
def __str__(self) -> str:
+ """Format the command outputs."""
return (
f"stdout: '{self.stdout}'\n"
f"stderr: '{self.stderr}'\n"
@@ -34,13 +48,24 @@ def __str__(self) -> str:
class RemoteSession(ABC):
- """
- The base class for defining which methods must be implemented in order to connect
- to a remote host (node) and maintain a remote session. The derived classes are
- supposed to implement/use some underlying transport protocol (e.g. SSH) to
- implement the methods. On top of that, it provides some basic services common to
- all derived classes, such as keeping history and logging what's being executed
- on the remote node.
+ """Non-interactive remote session.
+
+ The abstract methods must be implemented in order to connect to a remote host (node)
+ and maintain a remote session.
+ The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
+ to implement the methods. On top of that, it provides some basic services common to all
+ subclasses, such as keeping history and logging what's being executed on the remote node.
+
+ Attributes:
+ name: The name of the session.
+ hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
+ or a domain name.
+ ip: The IP address of the node or a domain name, whichever was used in `hostname`.
+ port: The port of the node, if given in `hostname`.
+ username: The username used in the connection.
+ password: The password used in the connection. Most frequently empty,
+ as the use of passwords is discouraged.
+ history: The executed commands during this session.
"""
name: str
@@ -59,6 +84,16 @@ def __init__(
session_name: str,
logger: DTSLOG,
):
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ session_name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Raises:
+ SSHConnectionError: If the connection to the node was not successful.
+ """
self._node_config = node_config
self.name = session_name
@@ -79,8 +114,13 @@ def __init__(
@abstractmethod
def _connect(self) -> None:
- """
- Create connection to assigned node.
+ """Create a connection to the node.
+
+ The implementation must assign the established session to self.session.
+
+ The implementation must catch all exceptions and convert them to an SSHConnectionError.
+
+ The implementation may optionally implement retry attempts.
"""
def send_command(
@@ -90,11 +130,24 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- Send a command to the connected node using optional env vars
- and return CommandResult.
- If verify is True, check the return code of the executed command
- and raise a RemoteCommandExecutionError if the command failed.
+ """Send `command` to the connected node.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds for `command` execution to complete.
+ verify: If :data:`True`, will check the exit code of `command`.
+ env: A dictionary with environment variables to be used with `command` execution.
+
+ Raises:
+ SSHSessionDeadError: If the session isn't alive when sending `command`.
+ SSHTimeoutError: If `command` execution timed out.
+ RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
+
+ Returns:
+ The output of the command along with the return code.
"""
self._logger.info(f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else ""))
result = self._send_command(command, timeout, env)
@@ -111,29 +164,38 @@ def send_command(
@abstractmethod
def _send_command(self, command: str, timeout: float, env: dict | None) -> CommandResult:
- """
- Use the underlying protocol to execute the command using optional env vars
- and return CommandResult.
+ """Send a command to the connected node.
+
+ The implementation must execute the command remotely with `env` environment variables
+ and return the result.
+
+ The implementation must catch all exceptions and raise:
+
+ * SSHSessionDeadError if the session is not alive,
+ * SSHTimeoutError if the command execution times out.
"""
def close(self, force: bool = False) -> None:
- """
- Close the remote session and free all used resources.
+ """Close the remote session and free all used resources.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
"""
self._logger.logger_exit()
self._close(force)
@abstractmethod
def _close(self, force: bool = False) -> None:
- """
- Execute protocol specific steps needed to close the session properly.
+ """Protocol specific steps needed to close the session properly.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
+ This doesn't have to be implemented in the overridden method.
"""
@abstractmethod
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the remote session is still responding."""
@abstractmethod
def copy_from(
@@ -143,12 +205,12 @@ def copy_from(
) -> None:
"""Copy a file from the remote Node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote Node associated with this remote session
+ to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
- destination_file: a file or directory path on the local filesystem.
+ source_file: The file on the remote Node.
+ destination_file: A file or directory path on the local filesystem.
"""
@abstractmethod
@@ -159,10 +221,10 @@ def copy_to(
) -> None:
"""Copy a file from local filesystem to the remote Node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file` on the remote Node
+ associated with this remote session.
Args:
- source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ source_file: The file on the local filesystem.
+ destination_file: A file or directory path on the remote Node.
"""
diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
index a467033a13..782220092c 100644
--- a/dts/framework/remote_session/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -1,6 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""SSH remote session."""
+
import socket
import traceback
from pathlib import PurePath
@@ -26,13 +28,8 @@
class SSHSession(RemoteSession):
"""A persistent SSH connection to a remote Node.
- The connection is implemented with the Fabric Python library.
-
- Args:
- node_config: The configuration of the Node to connect to.
- session_name: The name of the session.
- logger: The logger used for logging.
- This should be passed from the parent OSSession.
+ The connection is implemented with
+ `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
Attributes:
session: The underlying Fabric SSH connection.
@@ -78,6 +75,7 @@ def _connect(self) -> None:
raise SSHConnectionError(self.hostname, errors)
def is_alive(self) -> bool:
+ """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
return self.session.is_connected
def _send_command(self, command: str, timeout: float, env: dict | None) -> CommandResult:
@@ -85,7 +83,7 @@ def _send_command(self, command: str, timeout: float, env: dict | None) -> Comma
Args:
command: The command to execute.
- timeout: Wait at most this many seconds for the execution to complete.
+ timeout: Wait at most this long in seconds for the command execution to complete.
env: Extra environment variables that will be used in command execution.
Raises:
@@ -110,6 +108,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
self.session.get(str(destination_file), str(source_file))
def copy_to(
@@ -117,6 +116,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
self.session.put(str(source_file), str(destination_file))
def _close(self, force: bool = False) -> None:
--
2.34.1
* [PATCH v8 12/21] dts: interactive remote session docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (10 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 11/21] dts: remote session " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-30 21:49 ` Jeremy Spewock
2023-11-23 15:13 ` [PATCH v8 13/21] dts: port and virtual device " Juraj Linkeš
` (10 subsequent siblings)
22 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../interactive_remote_session.py | 36 +++----
.../remote_session/interactive_shell.py | 99 +++++++++++--------
dts/framework/remote_session/python_shell.py | 26 ++++-
dts/framework/remote_session/testpmd_shell.py | 58 +++++++++--
4 files changed, 149 insertions(+), 70 deletions(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 098ded1bb0..1cc82e3377 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -22,27 +22,23 @@
class InteractiveRemoteSession:
"""SSH connection dedicated to interactive applications.
- This connection is created using paramiko and is a persistent connection to the
- host. This class defines methods for connecting to the node and configures this
- connection to send "keep alive" packets every 30 seconds. Because paramiko attempts
- to use SSH keys to establish a connection first, providing a password is optional.
- This session is utilized by InteractiveShells and cannot be interacted with
- directly.
-
- Arguments:
- node_config: Configuration class for the node you are connecting to.
- _logger: Desired logger for this session to use.
+ The connection is created using `paramiko <https://docs.paramiko.org/en/latest/>`_
+ and is a persistent connection to the host. This class defines the methods for connecting
+ to the node and configures the connection to send "keep alive" packets every 30 seconds.
+ Because paramiko attempts to use SSH keys to establish a connection first, providing
+ a password is optional. This session is utilized by InteractiveShells
+ and cannot be interacted with directly.
Attributes:
- hostname: Hostname that will be used to initialize a connection to the node.
- ip: A subsection of hostname that removes the port for the connection if there
+ hostname: The hostname that will be used to initialize a connection to the node.
+ ip: A subsection of `hostname` that removes the port for the connection if there
is one. If there is no port, this will be the same as hostname.
- port: Port to use for the ssh connection. This will be extracted from the
- hostname if there is a port included, otherwise it will default to 22.
+ port: Port to use for the ssh connection. This will be extracted from `hostname`
+ if there is a port included, otherwise it will default to ``22``.
username: User to connect to the node with.
password: Password of the user connecting to the host. This will default to an
empty string if a password is not provided.
- session: Underlying paramiko connection.
+ session: The underlying paramiko connection.
Raises:
SSHConnectionError: There is an error creating the SSH connection.
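The `hostname`/`ip`/`port` attributes described above imply splitting an optional `:port` suffix off the configured hostname. A sketch of that parsing (the helper name is invented; the real class does this inline during `__init__`):

```python
def split_hostname(hostname: str) -> tuple[str, int]:
    """Split an optional ':port' suffix off a hostname; default to SSH port 22."""
    if ":" in hostname:
        host, port_str = hostname.rsplit(":", 1)
        return host, int(port_str)
    return hostname, 22
```

Note this simple split would need bracket handling for bare IPv6 addresses; the sketch assumes hostnames or IPv4 addresses as in the examples above.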
@@ -58,9 +54,15 @@ class InteractiveRemoteSession:
_node_config: NodeConfiguration
_transport: Transport | None
- def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
+ def __init__(self, node_config: NodeConfiguration, logger: DTSLOG) -> None:
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+ """
self._node_config = node_config
- self._logger = _logger
+ self._logger = logger
self.hostname = node_config.hostname
self.username = node_config.user
self.password = node_config.password if node_config.password else ""
diff --git a/dts/framework/remote_session/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
index 4db19fb9b3..b158f963b6 100644
--- a/dts/framework/remote_session/interactive_shell.py
+++ b/dts/framework/remote_session/interactive_shell.py
@@ -3,18 +3,20 @@
"""Common functionality for interactive shell handling.
-This base class, InteractiveShell, is meant to be extended by other classes that
-contain functionality specific to that shell type. These derived classes will often
-modify things like the prompt to expect or the arguments to pass into the application,
-but still utilize the same method for sending a command and collecting output. How
-this output is handled however is often application specific. If an application needs
-elevated privileges to start it is expected that the method for gaining those
-privileges is provided when initializing the class.
+The base class, :class:`InteractiveShell`, is meant to be extended by subclasses that contain
+functionality specific to that shell type. These subclasses will often modify things like
+the prompt to expect or the arguments to pass into the application, but still utilize
+the same method for sending a command and collecting output. How this output is handled, however,
+is often application-specific. If an application needs elevated privileges to start, it is expected
+that the method for gaining those privileges is provided when initializing the class.
+
+The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+environment variable configure the timeout of getting the output from command execution.
"""
from abc import ABC
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from paramiko import Channel, SSHClient, channel # type: ignore[import]
@@ -30,28 +32,6 @@ class InteractiveShell(ABC):
and collecting input until reaching a certain prompt. All interactive applications
will use the same SSH connection, but each will create their own channel on that
session.
-
- Arguments:
- interactive_session: The SSH session dedicated to interactive shells.
- logger: Logger used for displaying information in the console.
- get_privileged_command: Method for modifying a command to allow it to use
- elevated privileges. If this is None, the application will not be started
- with elevated privileges.
- app_args: Command line arguments to be passed to the application on startup.
- timeout: Timeout used for the SSH channel that is dedicated to this interactive
- shell. This timeout is for collecting output, so if reading from the buffer
- and no output is gathered within the timeout, an exception is thrown.
-
- Attributes
- _default_prompt: Prompt to expect at the end of output when sending a command.
- This is often overridden by derived classes.
- _command_extra_chars: Extra characters to add to the end of every command
- before sending them. This is often overridden by derived classes and is
- most commonly an additional newline character.
- path: Path to the executable to start the interactive application.
- dpdk_app: Whether this application is a DPDK app. If it is, the build
- directory for DPDK on the node will be prepended to the path to the
- executable.
"""
_interactive_session: SSHClient
@@ -61,10 +41,22 @@ class InteractiveShell(ABC):
_logger: DTSLOG
_timeout: float
_app_args: str
- _default_prompt: str = ""
- _command_extra_chars: str = ""
- path: PurePath
- dpdk_app: bool = False
+
+ #: Prompt to expect at the end of output when sending a command.
+ #: This is often overridden by subclasses.
+ _default_prompt: ClassVar[str] = ""
+
+ #: Extra characters to add to the end of every command
+ #: before sending them. This is often overridden by subclasses and is
+ #: most commonly an additional newline character.
+ _command_extra_chars: ClassVar[str] = ""
+
+ #: Path to the executable to start the interactive application.
+ path: ClassVar[PurePath]
+
+ #: Whether this application is a DPDK app. If it is, the build directory
+ #: for DPDK on the node will be prepended to the path to the executable.
+ dpdk_app: ClassVar[bool] = False
def __init__(
self,
@@ -74,6 +66,19 @@ def __init__(
app_args: str = "",
timeout: float = SETTINGS.timeout,
) -> None:
+ """Create an SSH channel during initialization.
+
+ Args:
+ interactive_session: The SSH session dedicated to interactive shells.
+ logger: The logger instance this session will use.
+ get_privileged_command: A method for modifying a command to allow it to use
+ elevated privileges. If :data:`None`, the application will not be started
+ with elevated privileges.
+ app_args: The command line arguments to be passed to the application on startup.
+ timeout: The timeout used for the SSH channel that is dedicated to this interactive
+ shell. This timeout is for collecting output, so if reading from the buffer
+ and no output is gathered within the timeout, an exception is thrown.
+ """
self._interactive_session = interactive_session
self._ssh_channel = self._interactive_session.invoke_shell()
self._stdin = self._ssh_channel.makefile_stdin("w")
@@ -90,6 +95,10 @@ def _start_application(self, get_privileged_command: Callable[[str], str] | None
This method is often overridden by subclasses as their process for
starting may look different.
+
+ Args:
+ get_privileged_command: A function (but could be any callable) that produces
+ the version of the command with elevated privileges.
"""
start_command = f"{self.path} {self._app_args}"
if get_privileged_command is not None:
@@ -97,16 +106,24 @@ def _start_application(self, get_privileged_command: Callable[[str], str] | None
self.send_command(start_command)
def send_command(self, command: str, prompt: str | None = None) -> str:
- """Send a command and get all output before the expected ending string.
+ """Send `command` and get all output before the expected ending string.
Lines that expect input are not included in the stdout buffer, so they cannot
- be used for expect. For example, if you were prompted to log into something
- with a username and password, you cannot expect "username:" because it won't
- yet be in the stdout buffer. A workaround for this could be consuming an
- extra newline character to force the current prompt into the stdout buffer.
+ be used for expect.
+
+ Example:
+ If you were prompted to log into something with a username and password,
+ you cannot expect ``username:`` because it won't yet be in the stdout buffer.
+ A workaround for this could be consuming an extra newline character to force
+ the current `prompt` into the stdout buffer.
+
+ Args:
+ command: The command to send.
+ prompt: After sending the command, `send_command` will be expecting this string.
+ If :data:`None`, will use the class's default prompt.
Returns:
- All output in the buffer before expected string
+ All output in the buffer before the expected string.
"""
self._logger.info(f"Sending: '{command}'")
if prompt is None:
@@ -124,8 +141,10 @@ def send_command(self, command: str, prompt: str | None = None) -> str:
return out
def close(self) -> None:
+ """Properly free all resources."""
self._stdin.close()
self._ssh_channel.close()
def __del__(self) -> None:
+ """Make sure the session is properly closed before deleting the object."""
self.close()
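The `send_command` docstring above describes collecting output until the expected prompt appears in the stdout buffer. A minimal sketch of that read-until-prompt idea, using an in-memory buffer in place of the paramiko channel (names here are illustrative, not the DTS implementation):

```python
import io

# Read lines until one containing the prompt shows up; everything before
# the prompt line is the command's output. This mirrors the idea in
# send_command's docstring, not its actual implementation.
def read_until_prompt(stdout: io.StringIO, prompt: str) -> str:
    out = ""
    for line in stdout:
        if prompt in line:
            break
        out += line
    return out

fake_stdout = io.StringIO("hello\nworld\ntestpmd> \n")
collected = read_until_prompt(fake_stdout, "testpmd>")
```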
diff --git a/dts/framework/remote_session/python_shell.py b/dts/framework/remote_session/python_shell.py
index cc3ad48a68..ccfd3783e8 100644
--- a/dts/framework/remote_session/python_shell.py
+++ b/dts/framework/remote_session/python_shell.py
@@ -1,12 +1,32 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Python interactive shell.
+
+Typical usage example in a TestSuite::
+
+ from framework.remote_session import PythonShell
+ python_shell = self.tg_node.create_interactive_shell(
+ PythonShell, timeout=5, privileged=True
+ )
+ python_shell.send_command("print('Hello World')")
+ python_shell.close()
+"""
+
from pathlib import PurePath
+from typing import ClassVar
from .interactive_shell import InteractiveShell
class PythonShell(InteractiveShell):
- _default_prompt: str = ">>>"
- _command_extra_chars: str = "\n"
- path: PurePath = PurePath("python3")
+ """Python interactive shell."""
+
+ #: Python's prompt.
+ _default_prompt: ClassVar[str] = ">>>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
+
+ #: The Python executable.
+ path: ClassVar[PurePath] = PurePath("python3")
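The `ClassVar` pattern introduced in this patch lets subclasses override class-level defaults while signalling to type checkers that these are not instance fields. A minimal sketch of the pattern (class names here are illustrative):

```python
from typing import ClassVar

class Shell:
    # Class-level default, shared by all instances.
    _default_prompt: ClassVar[str] = ""

class PyShell(Shell):
    # Subclass overrides the class-level default.
    _default_prompt: ClassVar[str] = ">>>"
```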
diff --git a/dts/framework/remote_session/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
index 08ac311016..79481e845c 100644
--- a/dts/framework/remote_session/testpmd_shell.py
+++ b/dts/framework/remote_session/testpmd_shell.py
@@ -1,41 +1,79 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Testpmd interactive shell.
+
+Typical usage example in a TestSuite::
+
+ testpmd_shell = self.sut_node.create_interactive_shell(
+ TestPmdShell, privileged=True
+ )
+ devices = testpmd_shell.get_devices()
+ for device in devices:
+ print(device)
+ testpmd_shell.close()
+"""
+
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from .interactive_shell import InteractiveShell
class TestPmdDevice(object):
+ """The data of a device that testpmd can recognize.
+
+ Attributes:
+ pci_address: The PCI address of the device.
+ """
+
pci_address: str
def __init__(self, pci_address_line: str):
+ """Initialize the device from the testpmd output line string.
+
+ Args:
+ pci_address_line: A line of testpmd output that contains a device.
+ """
self.pci_address = pci_address_line.strip().split(": ")[1].strip()
def __str__(self) -> str:
+ """The PCI address captures what the device is."""
return self.pci_address
class TestPmdShell(InteractiveShell):
- path: PurePath = PurePath("app", "dpdk-testpmd")
- dpdk_app: bool = True
- _default_prompt: str = "testpmd>"
- _command_extra_chars: str = "\n" # We want to append an extra newline to every command
+ """Testpmd interactive shell.
+
+ Users of the testpmd shell should never use
+ the :meth:`~.interactive_shell.InteractiveShell.send_command` method directly, but rather
+ call specialized methods. If there isn't one that satisfies a need, it should be added.
+ """
+
+ #: The path to the testpmd executable.
+ path: ClassVar[PurePath] = PurePath("app", "dpdk-testpmd")
+
+ #: Flag this as a DPDK app so that it's clear this is not a system app and
+ #: needs to be looked for in a specific path.
+ dpdk_app: ClassVar[bool] = True
+
+ #: The testpmd's prompt.
+ _default_prompt: ClassVar[str] = "testpmd>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
def _start_application(self, get_privileged_command: Callable[[str], str] | None) -> None:
- """See "_start_application" in InteractiveShell."""
self._app_args += " -- -i"
super()._start_application(get_privileged_command)
def get_devices(self) -> list[TestPmdDevice]:
- """Get a list of device names that are known to testpmd
+ """Get a list of device names that are known to testpmd.
- Uses the device info listed in testpmd and then parses the output to
- return only the names of the devices.
+ Uses the device info listed in testpmd and then parses the output.
Returns:
- A list of strings representing device names (e.g. 0000:14:00.1)
+ A list of devices.
"""
dev_info: str = self.send_command("show device info all")
dev_list: list[TestPmdDevice] = []
--
2.34.1
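For reference, the parsing done in `TestPmdDevice.__init__` in the patch above (splitting a `show device info all` line on `": "` and keeping the value) can be exercised standalone. The sample line below is illustrative, not captured testpmd output:

```python
# Same split-and-strip used by TestPmdDevice.__init__ in the patch above.
def parse_pci_address(pci_address_line: str) -> str:
    return pci_address_line.strip().split(": ")[1].strip()

# Hypothetical input line resembling testpmd's device info output.
sample_line = "  Device name: 0000:14:00.1  "
```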
* Re: [PATCH v8 12/21] dts: interactive remote session docstring update
2023-11-23 15:13 ` [PATCH v8 12/21] dts: interactive " Juraj Linkeš
@ 2023-11-30 21:49 ` Jeremy Spewock
2023-12-04 9:50 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-11-30 21:49 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> .../interactive_remote_session.py | 36 +++----
> .../remote_session/interactive_shell.py | 99 +++++++++++--------
> dts/framework/remote_session/python_shell.py | 26 ++++-
> dts/framework/remote_session/testpmd_shell.py | 58 +++++++++--
> 4 files changed, 149 insertions(+), 70 deletions(-)
>
> <snip>
> diff --git a/dts/framework/remote_session/testpmd_shell.py
> b/dts/framework/remote_session/testpmd_shell.py
> index 08ac311016..79481e845c 100644
> --- a/dts/framework/remote_session/testpmd_shell.py
> +++ b/dts/framework/remote_session/testpmd_shell.py
> @@ -1,41 +1,79 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 University of New Hampshire
>
>
Should you add to the copyright here for adding comments?
* Re: [PATCH v8 12/21] dts: interactive remote session docstring update
2023-11-30 21:49 ` Jeremy Spewock
@ 2023-12-04 9:50 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 9:50 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Thu, Nov 30, 2023 at 10:50 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
>
>
> On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> Format according to the Google format and PEP257, with slight
>> deviations.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> .../interactive_remote_session.py | 36 +++----
>> .../remote_session/interactive_shell.py | 99 +++++++++++--------
>> dts/framework/remote_session/python_shell.py | 26 ++++-
>> dts/framework/remote_session/testpmd_shell.py | 58 +++++++++--
>> 4 files changed, 149 insertions(+), 70 deletions(-)
>>
>> diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
>> index 098ded1bb0..1cc82e3377 100644
>> --- a/dts/framework/remote_session/interactive_remote_session.py
>> +++ b/dts/framework/remote_session/interactive_remote_session.py
>> @@ -22,27 +22,23 @@
>> class InteractiveRemoteSession:
>> """SSH connection dedicated to interactive applications.
>>
>> - This connection is created using paramiko and is a persistent connection to the
>> - host. This class defines methods for connecting to the node and configures this
>> - connection to send "keep alive" packets every 30 seconds. Because paramiko attempts
>> - to use SSH keys to establish a connection first, providing a password is optional.
>> - This session is utilized by InteractiveShells and cannot be interacted with
>> - directly.
>> -
>> - Arguments:
>> - node_config: Configuration class for the node you are connecting to.
>> - _logger: Desired logger for this session to use.
>> + The connection is created using `paramiko <https://docs.paramiko.org/en/latest/>`_
>> + and is a persistent connection to the host. This class defines the methods for connecting
>> + to the node and configures the connection to send "keep alive" packets every 30 seconds.
>> + Because paramiko attempts to use SSH keys to establish a connection first, providing
>> + a password is optional. This session is utilized by InteractiveShells
>> + and cannot be interacted with directly.
>>
>> Attributes:
>> - hostname: Hostname that will be used to initialize a connection to the node.
>> - ip: A subsection of hostname that removes the port for the connection if there
>> + hostname: The hostname that will be used to initialize a connection to the node.
>> + ip: A subsection of `hostname` that removes the port for the connection if there
>> is one. If there is no port, this will be the same as hostname.
>> - port: Port to use for the ssh connection. This will be extracted from the
>> - hostname if there is a port included, otherwise it will default to 22.
>> + port: Port to use for the ssh connection. This will be extracted from `hostname`
>> + if there is a port included, otherwise it will default to ``22``.
>> username: User to connect to the node with.
>> password: Password of the user connecting to the host. This will default to an
>> empty string if a password is not provided.
>> - session: Underlying paramiko connection.
>> + session: The underlying paramiko connection.
>>
>> Raises:
>> SSHConnectionError: There is an error creating the SSH connection.
>> @@ -58,9 +54,15 @@ class InteractiveRemoteSession:
>> _node_config: NodeConfiguration
>> _transport: Transport | None
>>
>> - def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
>> + def __init__(self, node_config: NodeConfiguration, logger: DTSLOG) -> None:
>> + """Connect to the node during initialization.
>> +
>> + Args:
>> + node_config: The test run configuration of the node to connect to.
>> + logger: The logger instance this session will use.
>> + """
>> self._node_config = node_config
>> - self._logger = _logger
>> + self._logger = logger
>> self.hostname = node_config.hostname
>> self.username = node_config.user
>> self.password = node_config.password if node_config.password else ""
>> diff --git a/dts/framework/remote_session/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
>> index 4db19fb9b3..b158f963b6 100644
>> --- a/dts/framework/remote_session/interactive_shell.py
>> +++ b/dts/framework/remote_session/interactive_shell.py
>> @@ -3,18 +3,20 @@
>>
>> """Common functionality for interactive shell handling.
>>
>> -This base class, InteractiveShell, is meant to be extended by other classes that
>> -contain functionality specific to that shell type. These derived classes will often
>> -modify things like the prompt to expect or the arguments to pass into the application,
>> -but still utilize the same method for sending a command and collecting output. How
>> -this output is handled however is often application specific. If an application needs
>> -elevated privileges to start it is expected that the method for gaining those
>> -privileges is provided when initializing the class.
>> +The base class, :class:`InteractiveShell`, is meant to be extended by subclasses that contain
>> +functionality specific to that shell type. These subclasses will often modify things like
>> +the prompt to expect or the arguments to pass into the application, but still utilize
>> +the same method for sending a command and collecting output. How this output is handled however
>> +is often application specific. If an application needs elevated privileges to start, it is expected
>> +that the method for gaining those privileges is provided when initializing the class.
>> +
>> +The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
>> +environment variable configure the timeout of getting the output from command execution.
>> """
>>
>> from abc import ABC
>> from pathlib import PurePath
>> -from typing import Callable
>> +from typing import Callable, ClassVar
>>
>> from paramiko import Channel, SSHClient, channel # type: ignore[import]
>>
>> @@ -30,28 +32,6 @@ class InteractiveShell(ABC):
>> and collecting input until reaching a certain prompt. All interactive applications
>> will use the same SSH connection, but each will create their own channel on that
>> session.
>> -
>> - Arguments:
>> - interactive_session: The SSH session dedicated to interactive shells.
>> - logger: Logger used for displaying information in the console.
>> - get_privileged_command: Method for modifying a command to allow it to use
>> - elevated privileges. If this is None, the application will not be started
>> - with elevated privileges.
>> - app_args: Command line arguments to be passed to the application on startup.
>> - timeout: Timeout used for the SSH channel that is dedicated to this interactive
>> - shell. This timeout is for collecting output, so if reading from the buffer
>> - and no output is gathered within the timeout, an exception is thrown.
>> -
>> - Attributes
>> - _default_prompt: Prompt to expect at the end of output when sending a command.
>> - This is often overridden by derived classes.
>> - _command_extra_chars: Extra characters to add to the end of every command
>> - before sending them. This is often overridden by derived classes and is
>> - most commonly an additional newline character.
>> - path: Path to the executable to start the interactive application.
>> - dpdk_app: Whether this application is a DPDK app. If it is, the build
>> - directory for DPDK on the node will be prepended to the path to the
>> - executable.
>> """
>>
>> _interactive_session: SSHClient
>> @@ -61,10 +41,22 @@ class InteractiveShell(ABC):
>> _logger: DTSLOG
>> _timeout: float
>> _app_args: str
>> - _default_prompt: str = ""
>> - _command_extra_chars: str = ""
>> - path: PurePath
>> - dpdk_app: bool = False
>> +
>> + #: Prompt to expect at the end of output when sending a command.
>> + #: This is often overridden by subclasses.
>> + _default_prompt: ClassVar[str] = ""
>> +
>> + #: Extra characters to add to the end of every command
>> + #: before sending them. This is often overridden by subclasses and is
>> + #: most commonly an additional newline character.
>> + _command_extra_chars: ClassVar[str] = ""
>> +
>> + #: Path to the executable to start the interactive application.
>> + path: ClassVar[PurePath]
>> +
>> + #: Whether this application is a DPDK app. If it is, the build directory
>> + #: for DPDK on the node will be prepended to the path to the executable.
>> + dpdk_app: ClassVar[bool] = False
>>
>> def __init__(
>> self,
>> @@ -74,6 +66,19 @@ def __init__(
>> app_args: str = "",
>> timeout: float = SETTINGS.timeout,
>> ) -> None:
>> + """Create an SSH channel during initialization.
>> +
>> + Args:
>> + interactive_session: The SSH session dedicated to interactive shells.
>> + logger: The logger instance this session will use.
>> + get_privileged_command: A method for modifying a command to allow it to use
>> + elevated privileges. If :data:`None`, the application will not be started
>> + with elevated privileges.
>> + app_args: The command line arguments to be passed to the application on startup.
>> + timeout: The timeout used for the SSH channel that is dedicated to this interactive
>> + shell. This timeout is for collecting output, so if reading from the buffer
>> + and no output is gathered within the timeout, an exception is thrown.
>> + """
>> self._interactive_session = interactive_session
>> self._ssh_channel = self._interactive_session.invoke_shell()
>> self._stdin = self._ssh_channel.makefile_stdin("w")
>> @@ -90,6 +95,10 @@ def _start_application(self, get_privileged_command: Callable[[str], str] | None
>>
>> This method is often overridden by subclasses as their process for
>> starting may look different.
>> +
>> + Args:
>> + get_privileged_command: A function (but could be any callable) that produces
>> + the version of the command with elevated privileges.
>> """
>> start_command = f"{self.path} {self._app_args}"
>> if get_privileged_command is not None:
>> @@ -97,16 +106,24 @@ def _start_application(self, get_privileged_command: Callable[[str], str] | None
>> self.send_command(start_command)
>>
>> def send_command(self, command: str, prompt: str | None = None) -> str:
>> - """Send a command and get all output before the expected ending string.
>> + """Send `command` and get all output before the expected ending string.
>>
>> Lines that expect input are not included in the stdout buffer, so they cannot
>> - be used for expect. For example, if you were prompted to log into something
>> - with a username and password, you cannot expect "username:" because it won't
>> - yet be in the stdout buffer. A workaround for this could be consuming an
>> - extra newline character to force the current prompt into the stdout buffer.
>> + be used for expect.
>> +
>> + Example:
>> + If you were prompted to log into something with a username and password,
>> + you cannot expect ``username:`` because it won't yet be in the stdout buffer.
>> + A workaround for this could be consuming an extra newline character to force
>> + the current `prompt` into the stdout buffer.
>> +
>> + Args:
>> + command: The command to send.
>> + prompt: After sending the command, `send_command` will be expecting this string.
>> + If :data:`None`, will use the class's default prompt.
>>
>> Returns:
>> - All output in the buffer before expected string
>> + All output in the buffer before expected string.
>> """
>> self._logger.info(f"Sending: '{command}'")
>> if prompt is None:
>> @@ -124,8 +141,10 @@ def send_command(self, command: str, prompt: str | None = None) -> str:
>> return out
>>
>> def close(self) -> None:
>> + """Properly free all resources."""
>> self._stdin.close()
>> self._ssh_channel.close()
>>
>> def __del__(self) -> None:
>> + """Make sure the session is properly closed before deleting the object."""
>> self.close()
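The `send_command` method above is essentially an expect-style loop: write the command, then read output lines until the expected prompt shows up. A minimal, self-contained sketch of that loop (the function name and the fake stdout buffer are illustrative, not the DTS API):

```python
import io


def read_until_prompt(stdout, prompt: str) -> str:
    """Collect output lines until a line containing `prompt` appears."""
    out = ""
    for line in stdout:
        if prompt in line:
            break
        out += line
    return out


# Simulate a channel's stdout: command output followed by the prompt.
fake_stdout = io.StringIO("Hello World\ntestpmd> ")
assert read_until_prompt(fake_stdout, "testpmd>") == "Hello World\n"
```

This also illustrates the caveat in the docstring: the prompt line itself is consumed, and anything that never reaches the stdout buffer (such as an interactive password prompt) can't be waited on this way.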
>> diff --git a/dts/framework/remote_session/python_shell.py b/dts/framework/remote_session/python_shell.py
>> index cc3ad48a68..ccfd3783e8 100644
>> --- a/dts/framework/remote_session/python_shell.py
>> +++ b/dts/framework/remote_session/python_shell.py
>> @@ -1,12 +1,32 @@
>> # SPDX-License-Identifier: BSD-3-Clause
>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>>
>> +"""Python interactive shell.
>> +
>> +Typical usage example in a TestSuite::
>> +
>> + from framework.remote_session import PythonShell
>> + python_shell = self.tg_node.create_interactive_shell(
>> + PythonShell, timeout=5, privileged=True
>> + )
>> + python_shell.send_command("print('Hello World')")
>> + python_shell.close()
>> +"""
>> +
>> from pathlib import PurePath
>> +from typing import ClassVar
>>
>> from .interactive_shell import InteractiveShell
>>
>>
>> class PythonShell(InteractiveShell):
>> - _default_prompt: str = ">>>"
>> - _command_extra_chars: str = "\n"
>> - path: PurePath = PurePath("python3")
>> + """Python interactive shell."""
>> +
>> + #: Python's prompt.
>> + _default_prompt: ClassVar[str] = ">>>"
>> +
>> + #: This forces the prompt to appear after sending a command.
>> + _command_extra_chars: ClassVar[str] = "\n"
>> +
>> + #: The Python executable.
>> + path: ClassVar[PurePath] = PurePath("python3")
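The `ClassVar` annotations used above tell type checkers these are class-level defaults rather than per-instance attributes, so subclasses override them declaratively without an `__init__`. A minimal sketch of the pattern (class names are illustrative, not the DTS classes):

```python
from abc import ABC
from typing import ClassVar


class Shell(ABC):
    # Class-level defaults; subclasses override them declaratively.
    _default_prompt: ClassVar[str] = ""
    _command_extra_chars: ClassVar[str] = ""


class PyShell(Shell):
    _default_prompt: ClassVar[str] = ">>>"
    _command_extra_chars: ClassVar[str] = "\n"


# The override is visible on the subclass; the base default is untouched.
assert PyShell._default_prompt == ">>>"
assert Shell._default_prompt == ""
```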
>> diff --git a/dts/framework/remote_session/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
>> index 08ac311016..79481e845c 100644
>> --- a/dts/framework/remote_session/testpmd_shell.py
>> +++ b/dts/framework/remote_session/testpmd_shell.py
>> @@ -1,41 +1,79 @@
>> # SPDX-License-Identifier: BSD-3-Clause
>> # Copyright(c) 2023 University of New Hampshire
>>
>
> Should you add to the copyright here for adding comments?
>
I'll add it, as it sounds fine to me (it is a real contribution), but
I actually don't know.
>>
>> +"""Testpmd interactive shell.
>> +
>> +Typical usage example in a TestSuite::
>> +
>> + testpmd_shell = self.sut_node.create_interactive_shell(
>> + TestPmdShell, privileged=True
>> + )
>> + devices = testpmd_shell.get_devices()
>> + for device in devices:
>> + print(device)
>> + testpmd_shell.close()
>> +"""
>> +
>> from pathlib import PurePath
>> -from typing import Callable
>> +from typing import Callable, ClassVar
>>
>> from .interactive_shell import InteractiveShell
>>
>>
>> class TestPmdDevice(object):
>> + """The data of a device that testpmd can recognize.
>> +
>> + Attributes:
>> + pci_address: The PCI address of the device.
>> + """
>> +
>> pci_address: str
>>
>> def __init__(self, pci_address_line: str):
>> + """Initialize the device from the testpmd output line string.
>> +
>> + Args:
>> + pci_address_line: A line of testpmd output that contains a device.
>> + """
>> self.pci_address = pci_address_line.strip().split(": ")[1].strip()
>>
>> def __str__(self) -> str:
>> + """The PCI address captures what the device is."""
>> return self.pci_address
>>
>>
>> class TestPmdShell(InteractiveShell):
>> - path: PurePath = PurePath("app", "dpdk-testpmd")
>> - dpdk_app: bool = True
>> - _default_prompt: str = "testpmd>"
>> - _command_extra_chars: str = "\n" # We want to append an extra newline to every command
>> + """Testpmd interactive shell.
>> +
>> + The testpmd shell users should never use
>> + the :meth:`~.interactive_shell.InteractiveShell.send_command` method directly, but rather
>> + call specialized methods. If there isn't one that satisfies a need, it should be added.
>> + """
>> +
>> + #: The path to the testpmd executable.
>> + path: ClassVar[PurePath] = PurePath("app", "dpdk-testpmd")
>> +
>> + #: Flag this as a DPDK app so that it's clear this is not a system app and
>> + #: needs to be looked for in a specific path.
>> + dpdk_app: ClassVar[bool] = True
>> +
>> + #: The testpmd's prompt.
>> + _default_prompt: ClassVar[str] = "testpmd>"
>> +
>> + #: This forces the prompt to appear after sending a command.
>> + _command_extra_chars: ClassVar[str] = "\n"
>>
>> def _start_application(self, get_privileged_command: Callable[[str], str] | None) -> None:
>> - """See "_start_application" in InteractiveShell."""
>> self._app_args += " -- -i"
>> super()._start_application(get_privileged_command)
>>
>> def get_devices(self) -> list[TestPmdDevice]:
>> - """Get a list of device names that are known to testpmd
>> + """Get a list of device names that are known to testpmd.
>>
>> - Uses the device info listed in testpmd and then parses the output to
>> - return only the names of the devices.
>> + Uses the device info listed in testpmd and then parses the output.
>>
>> Returns:
>> - A list of strings representing device names (e.g. 0000:14:00.1)
>> + A list of devices.
>> """
>> dev_info: str = self.send_command("show device info all")
>> dev_list: list[TestPmdDevice] = []
>> --
>> 2.34.1
>>
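The one-liner in `TestPmdDevice.__init__` above does all the parsing of a device-info line. A self-contained sketch of that parsing (the sample line is illustrative, not captured testpmd output):

```python
class TestPmdDevice:
    """Sketch of the line parsing done by TestPmdDevice.__init__ above."""

    def __init__(self, pci_address_line: str):
        # "Device name: 0000:14:00.1" -> "0000:14:00.1"
        self.pci_address = pci_address_line.strip().split(": ")[1].strip()

    def __str__(self) -> str:
        return self.pci_address


dev = TestPmdDevice("  Device name: 0000:14:00.1  ")
assert str(dev) == "0000:14:00.1"
```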
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v8 13/21] dts: port and virtual device docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (11 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 12/21] dts: interactive " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 14/21] dts: cpu " Juraj Linkeš
` (9 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/__init__.py | 17 ++++--
dts/framework/testbed_model/port.py | 53 +++++++++++++++----
dts/framework/testbed_model/virtual_device.py | 17 +++++-
3 files changed, 72 insertions(+), 15 deletions(-)
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 8ced05653b..6086512ca2 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -2,9 +2,20 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-This package contains the classes used to model the physical traffic generator,
-system under test and any other components that need to be interacted with.
+"""Testbed modelling.
+
+This package defines the testbed elements DTS works with:
+
+ * A system under test node: :class:`~.sut_node.SutNode`,
+ * A traffic generator node: :class:`~.tg_node.TGNode`,
+ * The ports of network interface cards (NICs) present on nodes: :class:`~.port.Port`,
+ * The logical cores of CPUs present on nodes: :class:`~.cpu.LogicalCore`,
+ * The virtual devices that can be created on nodes: :class:`~.virtual_device.VirtualDevice`,
+ * The operating systems running on nodes: :class:`~.linux_session.LinuxSession`
+ and :class:`~.posix_session.PosixSession`.
+
+DTS needs to be able to connect to nodes and understand some of the hardware present on these nodes
+to properly build and test DPDK.
"""
# pylama:ignore=W0611
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 680c29bfe3..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -2,6 +2,13 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""NIC port model.
+
+Basic port information, such as location (the ports are identified by their PCI address on a node),
+drivers and address.
+"""
+
+
from dataclasses import dataclass
from framework.config import PortConfig
@@ -9,24 +16,35 @@
@dataclass(slots=True, frozen=True)
class PortIdentifier:
+ """The port identifier.
+
+ Attributes:
+ node: The node where the port resides.
+ pci: The PCI address of the port on `node`.
+ """
+
node: str
pci: str
@dataclass(slots=True)
class Port:
- """
- identifier: The PCI address of the port on a node.
-
- os_driver: The driver used by this port when the OS is controlling it.
- Example: i40e
- os_driver_for_dpdk: The driver the device must be bound to for DPDK to use it,
- Example: vfio-pci.
+ """Physical port on a node.
- Note: os_driver and os_driver_for_dpdk may be the same thing.
- Example: mlx5_core
+ The ports are identified by the node they're on and their PCI addresses. The port on the other
+ side of the connection is also captured here.
+ Each port is serviced by a driver, which may be different for the operating system (`os_driver`)
+ and for DPDK (`os_driver_for_dpdk`). For some devices, they are the same, e.g.: ``mlx5_core``.
- peer: The identifier of a port this port is connected with.
+ Attributes:
+ identifier: The PCI address of the port on a node.
+ os_driver: The operating system driver name when the operating system controls the port,
+ e.g.: ``i40e``.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK, e.g.: ``vfio-pci``.
+ peer: The identifier of a port this port is connected with.
+ The `peer` is on a different node.
+ mac_address: The MAC address of the port.
+ logical_name: The logical name of the port. Must be discovered.
"""
identifier: PortIdentifier
@@ -37,6 +55,12 @@ class Port:
logical_name: str = ""
def __init__(self, node_name: str, config: PortConfig):
+ """Initialize the port from `node_name` and `config`.
+
+ Args:
+ node_name: The name of the port's node.
+ config: The test run configuration of the port.
+ """
self.identifier = PortIdentifier(
node=node_name,
pci=config.pci,
@@ -47,14 +71,23 @@ def __init__(self, node_name: str, config: PortConfig):
@property
def node(self) -> str:
+ """The node where the port resides."""
return self.identifier.node
@property
def pci(self) -> str:
+ """The PCI address of the port."""
return self.identifier.pci
@dataclass(slots=True, frozen=True)
class PortLink:
+ """The physical, cabled connection between the ports.
+
+ Attributes:
+ sut_port: The port on the SUT node connected to `tg_port`.
+ tg_port: The port on the TG node connected to `sut_port`.
+ """
+
sut_port: Port
tg_port: Port
diff --git a/dts/framework/testbed_model/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
index eb664d9f17..e9b5e9c3be 100644
--- a/dts/framework/testbed_model/virtual_device.py
+++ b/dts/framework/testbed_model/virtual_device.py
@@ -1,16 +1,29 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Virtual devices model.
+
+Alongside support for physical hardware, DPDK can create various virtual devices.
+"""
+
class VirtualDevice(object):
- """
- Base class for virtual devices used by DPDK.
+ """Base class for virtual devices used by DPDK.
+
+ Attributes:
+ name: The name of the virtual device.
"""
name: str
def __init__(self, name: str):
+ """Initialize the virtual device.
+
+ Args:
+ name: The name of the virtual device.
+ """
self.name = name
def __str__(self) -> str:
+ """This corresponds to the name used for DPDK devices."""
return self.name
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v8 14/21] dts: cpu docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (12 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 13/21] dts: port and virtual device " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 15/21] dts: os session " Juraj Linkeš
` (8 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/cpu.py | 196 +++++++++++++++++++++--------
1 file changed, 144 insertions(+), 52 deletions(-)
diff --git a/dts/framework/testbed_model/cpu.py b/dts/framework/testbed_model/cpu.py
index 1b392689f5..9e33b2825d 100644
--- a/dts/framework/testbed_model/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -1,6 +1,22 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""CPU core representation and filtering.
+
+This module provides a unified representation of logical CPU cores along
+with filtering capabilities.
+
+When symmetric multiprocessing (SMP or multithreading) is enabled on a server,
+the physical CPU cores are split into logical CPU cores with different IDs.
+
+:class:`LogicalCoreCountFilter` filters by the number of logical cores. It's possible to specify
+the socket from which to filter the number of logical cores. It's also possible to not use all
+logical CPU cores from each physical core (e.g. only the first logical core of each physical core).
+
+:class:`LogicalCoreListFilter` filters by logical core IDs. This mostly checks that
+the logical cores are actually present on the server.
+"""
+
import dataclasses
from abc import ABC, abstractmethod
from collections.abc import Iterable, ValuesView
@@ -11,9 +27,17 @@
@dataclass(slots=True, frozen=True)
class LogicalCore(object):
- """
- Representation of a CPU core. A physical core is represented in OS
- by multiple logical cores (lcores) if CPU multithreading is enabled.
+ """Representation of a logical CPU core.
+
+ A physical core is represented in OS by multiple logical cores (lcores)
+ if CPU multithreading is enabled. When multithreading is disabled, their IDs are the same.
+
+ Attributes:
+ lcore: The logical core ID of a CPU core. It's the same as `core` with
+ disabled multithreading.
+ core: The physical core ID of a CPU core.
+ socket: The physical socket ID where the CPU resides.
+ node: The NUMA node ID where the CPU resides.
"""
lcore: int
@@ -22,27 +46,36 @@ class LogicalCore(object):
node: int
def __int__(self) -> int:
+ """The CPU is best represented by the logical core, as that's what we configure in EAL."""
return self.lcore
class LogicalCoreList(object):
- """
- Convert these options into a list of logical core ids.
- lcore_list=[LogicalCore1, LogicalCore2] - a list of LogicalCores
- lcore_list=[0,1,2,3] - a list of int indices
- lcore_list=['0','1','2-3'] - a list of str indices; ranges are supported
- lcore_list='0,1,2-3' - a comma delimited str of indices; ranges are supported
-
- The class creates a unified format used across the framework and allows
- the user to use either a str representation (using str(instance) or directly
- in f-strings) or a list representation (by accessing instance.lcore_list).
- Empty lcore_list is allowed.
+ r"""A unified way to store :class:`LogicalCore`\s.
+
+ Create a unified format used across the framework and allow the user to use
+ either a :class:`str` representation (using ``str(instance)`` or directly in f-strings)
+ or a :class:`list` representation (by accessing the `lcore_list` property,
+ which stores logical core IDs).
"""
_lcore_list: list[int]
_lcore_str: str
def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
+ """Process `lcore_list`, then sort.
+
+ There are four supported logical core list formats::
+
+ lcore_list=[LogicalCore1, LogicalCore2] # a list of LogicalCores
+ lcore_list=[0,1,2,3] # a list of int indices
+ lcore_list=['0','1','2-3'] # a list of str indices; ranges are supported
+ lcore_list='0,1,2-3' # a comma delimited str of indices; ranges are supported
+
+ Args:
+ lcore_list: Various ways to represent multiple logical cores.
+ Empty `lcore_list` is allowed.
+ """
self._lcore_list = []
if isinstance(lcore_list, str):
lcore_list = lcore_list.split(",")
@@ -58,6 +91,7 @@ def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
@property
def lcore_list(self) -> list[int]:
+ """The logical core IDs."""
return self._lcore_list
def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
@@ -83,28 +117,30 @@ def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
return formatted_core_list
def __str__(self) -> str:
+ """The consecutive ranges of logical core IDs."""
return self._lcore_str
@dataclasses.dataclass(slots=True, frozen=True)
class LogicalCoreCount(object):
- """
- Define the number of logical cores to use.
- If sockets is not None, socket_count is ignored.
- """
+ """Define the number of logical cores per physical core per socket."""
+ #: Use this many logical cores per each physical core.
lcores_per_core: int = 1
+ #: Use this many physical cores per each socket.
cores_per_socket: int = 2
+ #: Use this many sockets.
socket_count: int = 1
+ #: Use exactly these sockets. This takes precedence over `socket_count`,
+ #: so when `sockets` is not :data:`None`, `socket_count` is ignored.
sockets: list[int] | None = None
class LogicalCoreFilter(ABC):
- """
- Filter according to the input filter specifier. Each filter needs to be
- implemented in a derived class.
- This class only implements operations common to all filters, such as sorting
- the list to be filtered beforehand.
+ """Common filtering class.
+
+ Each filter needs to be implemented in a subclass. This base class sorts the list of cores
+ and defines the filtering method, which must be implemented by subclasses.
"""
_filter_specifier: LogicalCoreCount | LogicalCoreList
@@ -116,6 +152,17 @@ def __init__(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
):
+ """Filter according to the input filter specifier.
+
+ The input `lcore_list` is copied and sorted by physical core before filtering.
+ The list is copied so that the original is left intact.
+
+ Args:
+ lcore_list: The logical CPU cores to filter.
+ filter_specifier: Filter cores from `lcore_list` according to this filter.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+ """
self._filter_specifier = filter_specifier
# sorting by core is needed in case hyperthreading is enabled
@@ -124,31 +171,45 @@ def __init__(
@abstractmethod
def filter(self) -> list[LogicalCore]:
- """
- Use self._filter_specifier to filter self._lcores_to_filter
- and return the list of filtered LogicalCores.
- self._lcores_to_filter is a sorted copy of the original list,
- so it may be modified.
+ r"""Filter the cores.
+
+ Use `self._filter_specifier` to filter `self._lcores_to_filter` and return
+ the filtered :class:`LogicalCore`\s.
+ `self._lcores_to_filter` is a sorted copy of the original list, so it may be modified.
+
+ Returns:
+ The filtered cores.
"""
class LogicalCoreCountFilter(LogicalCoreFilter):
- """
+ """Filter cores by specified counts.
+
Filter the input list of LogicalCores according to specified rules:
- Use cores from the specified number of sockets or from the specified socket ids.
- If sockets is specified, it takes precedence over socket_count.
- From each of those sockets, use only cores_per_socket of cores.
- And for each core, use lcores_per_core of logical cores. Hypertheading
- must be enabled for this to take effect.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+
+ * The input `filter_specifier` is :class:`LogicalCoreCount`,
+ * Use cores from the specified number of sockets or from the specified socket ids,
+ * If `sockets` is specified, it takes precedence over `socket_count`,
+ * From each of those sockets, use only `cores_per_socket` of cores,
+ * And for each core, use `lcores_per_core` of logical cores. Hyperthreading
+ must be enabled for this to take effect.
"""
_filter_specifier: LogicalCoreCount
def filter(self) -> list[LogicalCore]:
+ """Filter the cores according to :class:`LogicalCoreCount`.
+
+ Start by filtering the allowed sockets. The cores matching the allowed sockets are returned.
+ The cores of each socket are stored in separate lists.
+
+ Then filter the allowed physical cores from those lists of cores per socket. When filtering
+ physical cores, store the desired number of logical cores per physical core which then
+ together constitute the final filtered list.
+
+ Returns:
+ The filtered cores.
+ """
sockets_to_filter = self._filter_sockets(self._lcores_to_filter)
filtered_lcores = []
for socket_to_filter in sockets_to_filter:
@@ -158,24 +219,37 @@ def filter(self) -> list[LogicalCore]:
def _filter_sockets(
self, lcores_to_filter: Iterable[LogicalCore]
) -> ValuesView[list[LogicalCore]]:
- """
- Remove all lcores that don't match the specified socket(s).
- If self._filter_specifier.sockets is not None, keep lcores from those sockets,
- otherwise keep lcores from the first
- self._filter_specifier.socket_count sockets.
+ """Filter a list of cores per each allowed socket.
+
+ The sockets may be specified in two ways, either a number or a specific list of sockets.
+ In case of a specific list, we just need to return the cores from those sockets.
+ If filtering by a number of sockets, we need to go through all cores and note which sockets
+ appear and only filter from the first n that appear.
+
+ Args:
+ lcores_to_filter: The cores to filter. These must be sorted by the physical core.
+
+ Returns:
+ A list of lists of logical CPU cores. Each list contains cores from one socket.
"""
allowed_sockets: set[int] = set()
socket_count = self._filter_specifier.socket_count
if self._filter_specifier.sockets:
+ # when sockets in filter is specified, the sockets are already set
socket_count = len(self._filter_specifier.sockets)
allowed_sockets = set(self._filter_specifier.sockets)
+ # filter socket_count sockets from all sockets by checking the socket of each CPU
filtered_lcores: dict[int, list[LogicalCore]] = {}
for lcore in lcores_to_filter:
if not self._filter_specifier.sockets:
+ # this is when sockets is not set, so we do the actual filtering
+ # when it is set, allowed_sockets is already defined and can't be changed
if len(allowed_sockets) < socket_count:
+ # allowed_sockets is a set, so adding an existing socket won't re-add it
allowed_sockets.add(lcore.socket)
if lcore.socket in allowed_sockets:
+ # separate lcores into sockets; this makes it easier in further processing
if lcore.socket in filtered_lcores:
filtered_lcores[lcore.socket].append(lcore)
else:
@@ -192,12 +266,13 @@ def _filter_sockets(
def _filter_cores_from_socket(
self, lcores_to_filter: Iterable[LogicalCore]
) -> list[LogicalCore]:
- """
- Keep only the first self._filter_specifier.cores_per_socket cores.
- In multithreaded environments, keep only
- the first self._filter_specifier.lcores_per_core lcores of those cores.
- """
+ """Filter a list of cores from the given socket.
+
+ Go through the cores and note how many logical cores per physical core have been filtered.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
# no need to use ordered dict, from Python3.7 the dict
# insertion order is preserved (LIFO).
lcore_count_per_core_map: dict[int, int] = {}
@@ -238,15 +313,21 @@ def _filter_cores_from_socket(
class LogicalCoreListFilter(LogicalCoreFilter):
- """
- Filter the input list of Logical Cores according to the input list of
- lcore indices.
- An empty LogicalCoreList won't filter anything.
+ """Filter the logical CPU cores by logical CPU core IDs.
+
+This is a simple filter that looks at logical CPU IDs and only filters those that match.
+
+ The input filter is :class:`LogicalCoreList`. An empty LogicalCoreList won't filter anything.
"""
_filter_specifier: LogicalCoreList
def filter(self) -> list[LogicalCore]:
+ """Filter based on logical CPU core ID.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
if not len(self._filter_specifier.lcore_list):
return self._lcores_to_filter
@@ -269,6 +350,17 @@ def lcore_filter(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool,
) -> LogicalCoreFilter:
+ """Factory for providing the filter that corresponds to `filter_specifier`.
+
+ Args:
+ core_list: The logical CPU cores to filter.
+ filter_specifier: The filter to use.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+
+ Returns:
+ The filter that corresponds to `filter_specifier`.
+ """
if isinstance(filter_specifier, LogicalCoreList):
return LogicalCoreListFilter(core_list, filter_specifier, ascending)
elif isinstance(filter_specifier, LogicalCoreCount):
--
2.34.1
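The four `lcore_list` formats documented in the patch above all normalize to a sorted list of IDs, and `__str__` collapses that list back into consecutive ranges. A hedged, standalone sketch of that round trip (not the DTS implementation, just the same idea):

```python
def parse_lcore_str(spec: str) -> list[int]:
    """Expand a comma-delimited spec with ranges, e.g. "0,1,2-3" -> [0, 1, 2, 3]."""
    ids: list[int] = []
    for part in spec.split(","):
        if "-" in part:
            low, high = map(int, part.split("-"))
            ids.extend(range(low, high + 1))
        else:
            ids.append(int(part))
    return sorted(ids)


def format_lcores(ids: list[int]) -> str:
    """Collapse sorted IDs back into consecutive ranges, e.g. [0, 1, 2, 3] -> "0-3"."""
    ids = sorted(ids)
    parts: list[str] = []
    start = prev = ids[0]
    for i in ids[1:]:
        if i != prev + 1:
            parts.append(str(start) if start == prev else f"{start}-{prev}")
            start = i
        prev = i
    parts.append(str(start) if start == prev else f"{start}-{prev}")
    return ",".join(parts)


assert parse_lcore_str("0,1,2-3") == [0, 1, 2, 3]
assert format_lcores([0, 1, 2, 3, 8]) == "0-3,8"
```

Note the formatting side normalizes: `"0,1,2-3"` parses to the same IDs that format back as `"0-3"`, which is the kind of unified representation the docstring describes.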
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v8 15/21] dts: os session docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (13 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 14/21] dts: cpu " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-12-01 17:33 ` Jeremy Spewock
2023-11-23 15:13 ` [PATCH v8 16/21] dts: posix and linux sessions " Juraj Linkeš
` (7 subsequent siblings)
22 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/os_session.py | 272 ++++++++++++++++------
1 file changed, 205 insertions(+), 67 deletions(-)
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 76e595a518..cfdbd1c4bd 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -2,6 +2,26 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""OS-aware remote session.
+
+DPDK supports multiple operating systems and can run on any of them. This module defines the
+common API that OS-unaware layers use and translates the API into
+OS-aware calls/utility usage.
+
+Note:
+ Running commands with administrative privileges requires OS awareness. This is the only layer
+ that's aware of OS differences, so this is where non-privileged command get converted
+ to privileged commands.
+
+Example:
+ A user wishes to remove a directory on a remote :class:`~.sut_node.SutNode`.
+ The :class:`~.sut_node.SutNode` object isn't aware what OS the node is running - it delegates
+ the OS translation logic to :attr:`~.node.Node.main_session`. The SUT node calls
+ :meth:`~OSSession.remove_remote_dir` with a generic, OS-unaware path and
+ the :attr:`~.node.Node.main_session` translates that to ``rm -rf`` if the node's OS is Linux
+ and other commands for other OSs. It also translates the path to match the underlying OS.
+"""
+
from abc import ABC, abstractmethod
from collections.abc import Iterable
from ipaddress import IPv4Interface, IPv6Interface
@@ -28,10 +48,16 @@
class OSSession(ABC):
- """
- The OS classes create a DTS node remote session and implement OS specific
+ """OS-unaware to OS-aware translation API definition.
+
+ The OSSession classes create a remote session to a DTS node and implement OS specific
behavior. There are a few control methods implemented by the base class; the rest need
- to be implemented by derived classes.
+ to be implemented by subclasses.
+
+ Attributes:
+ name: The name of the session.
+ remote_session: The remote session maintaining the connection to the node.
+ interactive_session: The interactive remote session maintaining the connection to the node.
"""
_config: NodeConfiguration
@@ -46,6 +72,15 @@ def __init__(
name: str,
logger: DTSLOG,
):
+ """Initialize the OS-aware session.
+
+ Connect to the node right away and also create an interactive remote session.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
self._config = node_config
self.name = name
self._logger = logger
@@ -53,15 +88,15 @@ def __init__(
self.interactive_session = create_interactive_session(node_config, logger)
def close(self, force: bool = False) -> None:
- """
- Close the remote session.
+ """Close the underlying remote session.
+
+ Args:
+ force: Force the closure of the connection.
"""
self.remote_session.close(force)
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the underlying remote session is still responding."""
return self.remote_session.is_alive()
def send_command(
@@ -72,10 +107,23 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- An all-purpose API in case the command to be executed is already
- OS-agnostic, such as when the path to the executed command has been
- constructed beforehand.
+ """An all-purpose API for OS-agnostic commands.
+
+ This can be used for an execution of a portable command that's executed the same way
+ on all operating systems, such as Python.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds for `command` execution to complete.
+ privileged: Whether to run the command with administrative privileges.
+ verify: If :data:`True`, will check the exit code of the command.
+ env: A dictionary with environment variables to be used with the command execution.
+
+ Raises:
+ RemoteCommandExecutionError: If verify is :data:`True` and the command failed.
"""
if privileged:
command = self._get_privileged_command(command)
@@ -89,8 +137,20 @@ def create_interactive_shell(
privileged: bool,
app_args: str,
) -> InteractiveShellType:
- """
- See "create_interactive_shell" in SutNode
+ """Factory for interactive session handlers.
+
+ Instantiate `shell_cls` according to the remote OS specifics.
+
+ Args:
+ shell_cls: The class of the shell.
+ timeout: Timeout for reading output from the SSH channel. If you are
+ reading from the buffer and don't receive any data within the timeout,
+ an error will be raised.
+ privileged: Whether to run the shell with administrative privileges.
+ app_args: The arguments to be passed to the application.
+
+ Returns:
+ An instance of the desired interactive application shell.
"""
return shell_cls(
self.interactive_session.session,
@@ -114,27 +174,42 @@ def _get_privileged_command(command: str) -> str:
@abstractmethod
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
- """
- Try to find DPDK remote dir in remote_dir.
+ """Try to find DPDK directory in `remote_dir`.
+
+ The directory is the one which is created after the extraction of the tarball. The files
+ are usually extracted into a directory starting with ``dpdk-``.
+
+ Returns:
+ The absolute path of the DPDK remote directory, empty path if not found.
"""
@abstractmethod
def get_remote_tmp_dir(self) -> PurePath:
- """
- Get the path of the temporary directory of the remote OS.
+ """Get the path of the temporary directory of the remote OS.
+
+ Returns:
+ The absolute path of the temporary directory.
"""
@abstractmethod
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for the target architecture. Get
- information from the node if needed.
+ """Create extra environment variables needed for the target architecture.
+
+ Different architectures may require different configuration, such as setting 32-bit CFLAGS.
+
+ Returns:
+ A dictionary mapping environment variable names to their values.
"""
@abstractmethod
def join_remote_path(self, *args: str | PurePath) -> PurePath:
- """
- Join path parts using the path separator that fits the remote OS.
+ """Join path parts using the path separator that fits the remote OS.
+
+ Args:
+ args: Any number of paths to join.
+
+ Returns:
+ The resulting joined path.
"""
@abstractmethod
@@ -143,13 +218,13 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from the remote Node to the local filesystem.
+ """Copy a file from the remote node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote node associated with this remote
+ session to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
+ source_file: the file on the remote node.
destination_file: a file or directory path on the local filesystem.
"""
@@ -159,14 +234,14 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from local filesystem to the remote Node.
+ """Copy a file from local filesystem to the remote node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file`
+ on the remote node associated with this remote session.
Args:
source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ destination_file: a file or directory path on the remote node.
"""
@abstractmethod
@@ -176,8 +251,12 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
- """
- Remove remote directory, by default remove recursively and forcefully.
+ """Remove remote directory, by default remove recursively and forcefully.
+
+ Args:
+ remote_dir_path: The path of the directory to remove.
+ recursive: If :data:`True`, also remove all contents inside the directory.
+ force: If :data:`True`, ignore all warnings and try to remove at all costs.
"""
@abstractmethod
@@ -186,9 +265,12 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
- """
- Extract remote tarball in place. If expected_dir is a non-empty string, check
- whether the dir exists after extracting the archive.
+ """Extract remote tarball in its remote directory.
+
+ Args:
+ remote_tarball_path: The path of the tarball on the remote node.
+ expected_dir: If non-empty, check whether `expected_dir` exists after extracting
+ the archive.
"""
@abstractmethod
@@ -201,69 +283,119 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
- """
- Build DPDK in the input dir with specified environment variables and meson
- arguments.
+ """Build DPDK on the remote node.
+
+ An extracted DPDK tarball must be present on the node. The build consists of two steps::
+
+ meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
+ ninja -C remote_dpdk_build_dir
+
+ The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
+ environment variable configure the timeout of DPDK build.
+
+ Args:
+ env_vars: Use these environment variables then building DPDK.
+ meson_args: Use these meson arguments when building DPDK.
+ remote_dpdk_dir: The directory on the remote node where DPDK will be built.
+ remote_dpdk_build_dir: The target build directory on the remote node.
+ rebuild: If :data:`True`, do a subsequent build with ``meson configure`` instead
+ of ``meson setup``.
+ timeout: Wait at most this long in seconds for the build execution to complete.
"""
@abstractmethod
def get_dpdk_version(self, version_path: str | PurePath) -> str:
- """
- Inspect DPDK version on the remote node from version_path.
+ """Inspect the DPDK version on the remote node.
+
+ Args:
+ version_path: The path to the VERSION file containing the DPDK version.
+
+ Returns:
+ The DPDK version.
"""
@abstractmethod
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
- """
- Compose a list of LogicalCores present on the remote node.
- If use_first_core is False, the first physical core won't be used.
+ r"""Get the list of :class:`~.cpu.LogicalCore`\s on the remote node.
+
+ Args:
+ use_first_core: If :data:`False`, the first physical core won't be used.
+
+ Returns:
+ The logical cores present on the node.
"""
@abstractmethod
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
- """
- Kill and cleanup all DPDK apps identified by dpdk_prefix_list. If
- dpdk_prefix_list is empty, attempt to find running DPDK apps to kill and clean.
+ """Kill and cleanup all DPDK apps.
+
+ Args:
+ dpdk_prefix_list: Kill all apps identified by `dpdk_prefix_list`.
+ If `dpdk_prefix_list` is empty, attempt to find running DPDK apps to kill and clean.
"""
@abstractmethod
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
- """
- Get the DPDK file prefix that will be used when running DPDK apps.
+ """Make OS-specific modification to the DPDK file prefix.
+
+ Args:
+ dpdk_prefix: The OS-unaware file prefix.
+
+ Returns:
+ The OS-specific file prefix.
"""
@abstractmethod
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
- """
- Get the node's Hugepage Size, configure the specified amount of hugepages
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Configure hugepages on the node.
+
+ Get the node's Hugepage Size, configure the specified count of hugepages
if needed and mount the hugepages if needed.
- If force_first_numa is True, configure hugepages just on the first socket.
+
+ Args:
+ hugepage_count: Configure this many hugepages.
+ force_first_numa: If :data:`True`, configure hugepages just on the first numa node.
"""
@abstractmethod
def get_compiler_version(self, compiler_name: str) -> str:
- """
- Get installed version of compiler used for DPDK
+ """Get installed version of compiler used for DPDK.
+
+ Args:
+ compiler_name: The name of the compiler executable.
+
+ Returns:
+ The compiler's version.
"""
@abstractmethod
def get_node_info(self) -> NodeInfo:
- """
- Collect information about the node
+ """Collect additional information about the node.
+
+ Returns:
+ Node information.
"""
@abstractmethod
def update_ports(self, ports: list[Port]) -> None:
- """
- Get additional information about ports:
- Logical name (e.g. enp7s0) if applicable
- Mac address
+ """Get additional information about ports from the operating system and update them.
+
+ The additional information is:
+
+ * Logical name (e.g. ``enp7s0``) if applicable,
+ * Mac address.
+
+ Args:
+ ports: The ports to update.
"""
@abstractmethod
def configure_port_state(self, port: Port, enable: bool) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port` in the operating system.
+
+ Args:
+ port: The port to configure.
+ enable: If :data:`True`, enable the port, otherwise shut it down.
"""
@abstractmethod
@@ -273,12 +405,18 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
- """
- Configure (add or delete) an IP address of the input port.
+ """Configure an IP address on `port` in the operating system.
+
+ Args:
+ address: The address to configure.
+ port: The port to configure.
+ delete: If :data:`True`, remove the IP address, otherwise configure it.
"""
@abstractmethod
def configure_ipv4_forwarding(self, enable: bool) -> None:
- """
- Enable IPv4 forwarding in the underlying OS.
+ """Enable IPv4 forwarding in the operating system.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
"""
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
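The OS-translation layering described in the module docstring above can be sketched with a simplified, hypothetical example. The command runner below is a stand-in for the real remote session, and the class bodies are reduced to the one method under discussion:

```python
# Hypothetical sketch of the OS-unaware -> OS-aware translation:
# callers pass a generic path, and the Linux subclass translates the
# request into the `rm -rf` invocation mentioned in the docstring.
from abc import ABC, abstractmethod


class OSSession(ABC):
    def __init__(self, run):
        # `run` stands in for send_command on a real remote session.
        self._run = run

    @abstractmethod
    def remove_remote_dir(self, remote_dir_path: str) -> None:
        """Remove a remote directory in an OS-specific way."""


class LinuxSession(OSSession):
    def remove_remote_dir(self, remote_dir_path: str) -> None:
        # Linux translation of the OS-unaware request.
        self._run(f"rm -rf {remote_dir_path}")


executed = []
session = LinuxSession(executed.append)
session.remove_remote_dir("/tmp/dpdk")
print(executed)  # ['rm -rf /tmp/dpdk']
```

A node object holding a `main_session` of this type never needs to know which subclass it got; a Windows or BSD session would implement the same abstract method with its own command.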
* Re: [PATCH v8 15/21] dts: os session docstring update
2023-11-23 15:13 ` [PATCH v8 15/21] dts: os session " Juraj Linkeš
@ 2023-12-01 17:33 ` Jeremy Spewock
2023-12-04 9:53 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-12-01 17:33 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/testbed_model/os_session.py | 272 ++++++++++++++++------
> 1 file changed, 205 insertions(+), 67 deletions(-)
>
> diff --git a/dts/framework/testbed_model/os_session.py
> b/dts/framework/testbed_model/os_session.py
> index 76e595a518..cfdbd1c4bd 100644
> --- a/dts/framework/testbed_model/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> <snip>
> @@ -201,69 +283,119 @@ def build_dpdk(
> rebuild: bool = False,
> timeout: float = SETTINGS.compile_timeout,
> ) -> None:
> - """
> - Build DPDK in the input dir with specified environment variables
> and meson
> - arguments.
> + """Build DPDK on the remote node.
> +
> + An extracted DPDK tarball must be present on the node. The build
> consists of two steps::
> +
> + meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
> + ninja -C remote_dpdk_build_dir
> +
> + The :option:`--compile-timeout` command line argument and the
> :envvar:`DTS_COMPILE_TIMEOUT`
> + environment variable configure the timeout of DPDK build.
> +
> + Args:
> + env_vars: Use these environment variables then building DPDK.
>
I think this is meant to be "when building DPDK" instead.
> + meson_args: Use these meson arguments when building DPDK.
> + remote_dpdk_dir: The directory on the remote node where DPDK
> will be built.
> + remote_dpdk_build_dir: The target build directory on the
> remote node.
> + rebuild: If :data:`True`, do a subsequent build with ``meson
> configure`` instead
> + of ``meson setup``.
> + timeout: Wait at most this long in seconds for the build
> execution to complete.
> """
>
> <snip>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v8 15/21] dts: os session docstring update
2023-12-01 17:33 ` Jeremy Spewock
@ 2023-12-04 9:53 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 9:53 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
>> diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
>> index 76e595a518..cfdbd1c4bd 100644
>> --- a/dts/framework/testbed_model/os_session.py
>> +++ b/dts/framework/testbed_model/os_session.py
<snip>
>> @@ -201,69 +283,119 @@ def build_dpdk(
>> rebuild: bool = False,
>> timeout: float = SETTINGS.compile_timeout,
>> ) -> None:
>> - """
>> - Build DPDK in the input dir with specified environment variables and meson
>> - arguments.
>> + """Build DPDK on the remote node.
>> +
>> + An extracted DPDK tarball must be present on the node. The build consists of two steps::
>> +
>> + meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
>> + ninja -C remote_dpdk_build_dir
>> +
>> + The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
>> + environment variable configure the timeout of DPDK build.
>> +
>> + Args:
>> + env_vars: Use these environment variables then building DPDK.
>
>
> I think this is meant to be "when building DPDK" instead.
>
Yes, good catch.
^ permalink raw reply [flat|nested] 393+ messages in thread
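The two-step build described in the quoted `build_dpdk` docstring (meson setup followed by ninja) can be sketched as plain command construction. The function name and directory values are placeholders for illustration:

```python
def dpdk_build_commands(meson_args: str, remote_dpdk_dir: str, remote_dpdk_build_dir: str) -> list[str]:
    """Return the two commands of the DPDK build: meson setup, then ninja."""
    return [
        f"meson setup {meson_args} {remote_dpdk_dir} {remote_dpdk_build_dir}",
        f"ninja -C {remote_dpdk_build_dir}",
    ]

for command in dpdk_build_commands("--default-library=static", "dpdk", "dpdk/build"):
    print(command)
```

In DTS both commands are executed remotely under the :option:`--compile-timeout` limit; the sketch only shows how the command strings are assembled.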
* [PATCH v8 16/21] dts: posix and linux sessions docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (14 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 15/21] dts: os session " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 17/21] dts: node " Juraj Linkeš
` (6 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/linux_session.py | 64 +++++++++++-----
dts/framework/testbed_model/posix_session.py | 81 +++++++++++++++++---
2 files changed, 114 insertions(+), 31 deletions(-)
diff --git a/dts/framework/testbed_model/linux_session.py b/dts/framework/testbed_model/linux_session.py
index 055765ba2d..0ab59cef85 100644
--- a/dts/framework/testbed_model/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -2,6 +2,13 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""Linux OS translator.
+
+Translate OS-unaware calls into Linux calls/utilities. Most Linux distributions are largely
+POSIX compliant, so this module implements only the parts that aren't covered
+by the POSIX translation layer in posix_session.py.
+"""
+
import json
from ipaddress import IPv4Interface, IPv6Interface
from typing import TypedDict, Union
@@ -17,43 +24,52 @@
class LshwConfigurationOutput(TypedDict):
+ """The relevant parts of ``lshw``'s ``configuration`` section."""
+
+ #:
link: str
class LshwOutput(TypedDict):
- """
- A model of the relevant information from json lshw output, e.g.:
- {
- ...
- "businfo" : "pci@0000:08:00.0",
- "logicalname" : "enp8s0",
- "version" : "00",
- "serial" : "52:54:00:59:e1:ac",
- ...
- "configuration" : {
- ...
- "link" : "yes",
- ...
- },
- ...
+ """A model of the relevant information from ``lshw``'s json output.
+
+ Example:
+ ::
+
+ {
+ ...
+ "businfo" : "pci@0000:08:00.0",
+ "logicalname" : "enp8s0",
+ "version" : "00",
+ "serial" : "52:54:00:59:e1:ac",
+ ...
+ "configuration" : {
+ ...
+ "link" : "yes",
+ ...
+ },
+ ...
"""
+ #:
businfo: str
+ #:
logicalname: NotRequired[str]
+ #:
serial: NotRequired[str]
+ #:
configuration: LshwConfigurationOutput
class LinuxSession(PosixSession):
- """
- The implementation of non-Posix compliant parts of Linux remote sessions.
- """
+ """The implementation of non-Posix compliant parts of Linux."""
@staticmethod
def _get_privileged_command(command: str) -> str:
return f"sudo -- sh -c '{command}'"
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_cpus`."""
cpu_info = self.send_command("lscpu -p=CPU,CORE,SOCKET,NODE|grep -v \\#").stdout
lcores = []
for cpu_line in cpu_info.splitlines():
@@ -65,18 +81,20 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
return lcores
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return dpdk_prefix
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.setup_hugepages`."""
self._logger.info("Getting Hugepage information.")
hugepage_size = self._get_hugepage_size()
hugepages_total = self._get_hugepages_total()
self._numa_nodes = self._get_numa_nodes()
- if force_first_numa or hugepages_total != hugepage_amount:
+ if force_first_numa or hugepages_total != hugepage_count:
# when forcing numa, we need to clear existing hugepages regardless
# of size, so they can be moved to the first numa node
- self._configure_huge_pages(hugepage_amount, hugepage_size, force_first_numa)
+ self._configure_huge_pages(hugepage_count, hugepage_size, force_first_numa)
else:
self._logger.info("Hugepages already configured.")
self._mount_huge_pages()
@@ -132,6 +150,7 @@ def _configure_huge_pages(self, amount: int, size: int, force_first_numa: bool)
self.send_command(f"echo {amount} | tee {hugepage_config_path}", privileged=True)
def update_ports(self, ports: list[Port]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.update_ports`."""
self._logger.debug("Gathering port info.")
for port in ports:
assert port.node == self.name, "Attempted to gather port info on the wrong node"
@@ -161,6 +180,7 @@ def _update_port_attr(self, port: Port, attr_value: str | None, attr_name: str)
)
def configure_port_state(self, port: Port, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_state`."""
state = "up" if enable else "down"
self.send_command(f"ip link set dev {port.logical_name} {state}", privileged=True)
@@ -170,6 +190,7 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_ip_address`."""
command = "del" if delete else "add"
self.send_command(
f"ip address {command} {address} dev {port.logical_name}",
@@ -178,5 +199,6 @@ def configure_port_ip_address(
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_ipv4_forwarding`."""
state = 1 if enable else 0
self.send_command(f"sysctl -w net.ipv4.ip_forward={state}", privileged=True)
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5657cc0bc9..d279bb8b53 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -2,6 +2,15 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""POSIX compliant OS translator.
+
+Translates OS-unaware calls into POSIX compliant calls/utilities. POSIX is a set of standards
+for portability between Unix operating systems which not all Linux distributions
+(or the tools most frequently bundled with said distributions) adhere to. Most of Linux
+distributions are mostly compliant though.
+This intermediate module implements the common parts of mostly POSIX compliant distributions.
+"""
+
import re
from collections.abc import Iterable
from pathlib import PurePath, PurePosixPath
@@ -15,13 +24,21 @@
class PosixSession(OSSession):
- """
- An intermediary class implementing the Posix compliant parts of
- Linux and other OS remote sessions.
- """
+ """An intermediary class implementing the POSIX standard."""
@staticmethod
def combine_short_options(**opts: bool) -> str:
+ """Combine shell options into one argument.
+
+ These are options such as ``-x``, ``-v``, ``-f`` which are combined into ``-xvf``.
+
+ Args:
+ opts: The keys are option names (usually one letter) and the bool values indicate
+ whether to include the option in the resulting argument.
+
+ Returns:
+ The options combined into one argument.
+ """
ret_opts = ""
for opt, include in opts.items():
if include:
@@ -33,17 +50,19 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.guess_dpdk_remote_dir`."""
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
def get_remote_tmp_dir(self) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_tmp_dir`."""
return PurePosixPath("/tmp")
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for i686 arch build. Get information
- from the node if needed.
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_build_env_vars`.
+
+ Supported architecture: ``i686``.
"""
env_vars = {}
if arch == Architecture.i686:
@@ -63,6 +82,7 @@ def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
return env_vars
def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
return PurePosixPath(*args)
def copy_from(
@@ -70,6 +90,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_from`."""
self.remote_session.copy_from(source_file, destination_file)
def copy_to(
@@ -77,6 +98,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_to`."""
self.remote_session.copy_to(source_file, destination_file)
def remove_remote_dir(
@@ -85,6 +107,7 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
opts = PosixSession.combine_short_options(r=recursive, f=force)
self.send_command(f"rm{opts} {remote_dir_path}")
@@ -93,6 +116,7 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
self.send_command(
f"tar xfm {remote_tarball_path} -C {PurePosixPath(remote_tarball_path).parent}",
60,
@@ -109,6 +133,7 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.build_dpdk`."""
try:
if rebuild:
# reconfigure, then build
@@ -138,10 +163,12 @@ def build_dpdk(
raise DPDKBuildError(f"DPDK build failed when doing '{e.command}'.")
def get_dpdk_version(self, build_dir: str | PurePath) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_version`."""
out = self.send_command(f"cat {self.join_remote_path(build_dir, 'VERSION')}", verify=True)
return out.stdout
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.kill_cleanup_dpdk_apps`."""
self._logger.info("Cleaning up DPDK apps.")
dpdk_runtime_dirs = self._get_dpdk_runtime_dirs(dpdk_prefix_list)
if dpdk_runtime_dirs:
@@ -153,6 +180,14 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
self._remove_dpdk_runtime_dirs(dpdk_runtime_dirs)
def _get_dpdk_runtime_dirs(self, dpdk_prefix_list: Iterable[str]) -> list[PurePosixPath]:
+ """Find runtime directories DPDK apps are currently using.
+
+ Args:
+ dpdk_prefix_list: The prefixes DPDK apps were started with.
+
+ Returns:
+ The paths of DPDK apps' runtime dirs.
+ """
prefix = PurePosixPath("/var", "run", "dpdk")
if not dpdk_prefix_list:
remote_prefixes = self._list_remote_dirs(prefix)
@@ -164,9 +199,13 @@ def _get_dpdk_runtime_dirs(self, dpdk_prefix_list: Iterable[str]) -> list[PurePo
return [PurePosixPath(prefix, dpdk_prefix) for dpdk_prefix in dpdk_prefix_list]
def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
- """
- Return a list of directories of the remote_dir.
- If remote_path doesn't exist, return None.
+ """Contents of remote_path.
+
+ Args:
+ remote_path: List the contents of this path.
+
+ Returns:
+ The directories in `remote_path`. If `remote_path` doesn't exist, return :data:`None`.
"""
out = self.send_command(f"ls -l {remote_path} | awk '/^d/ {{print $NF}}'").stdout
if "No such file or directory" in out:
@@ -175,6 +214,17 @@ def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
return out.splitlines()
def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[int]:
+ """Find PIDs of running DPDK apps.
+
+ Look at each "config" file found in dpdk_runtime_dirs and find the PIDs of processes
+ that opened those files.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+
+ Returns:
+ The PIDs of running DPDK apps.
+ """
pids = []
pid_regex = r"p(\d+)"
for dpdk_runtime_dir in dpdk_runtime_dirs:
@@ -193,6 +243,14 @@ def _remote_files_exists(self, remote_path: PurePath) -> bool:
return not result.return_code
def _check_dpdk_hugepages(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> None:
+ """Check there aren't any leftover hugepages.
+
+ If any hugepages are found, emit a warning. The hugepages are investigated in the
+ "hugepage_info" file of dpdk_runtime_dirs.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+ """
for dpdk_runtime_dir in dpdk_runtime_dirs:
hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
if self._remote_files_exists(hugepage_info):
@@ -208,9 +266,11 @@ def _remove_dpdk_runtime_dirs(self, dpdk_runtime_dirs: Iterable[str | PurePath])
self.remove_remote_dir(dpdk_runtime_dir)
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return ""
def get_compiler_version(self, compiler_name: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_compiler_version`."""
match compiler_name:
case "gcc":
return self.send_command(
@@ -228,6 +288,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
raise ValueError(f"Unknown compiler {compiler_name}")
def get_node_info(self) -> NodeInfo:
+ """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
os_release_info = self.send_command(
"awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
SETTINGS.timeout,
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
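The `combine_short_options` helper documented in the patch above can be reproduced as a standalone sketch. The loop body is shown in the diff; the leading-space handling is inferred from the `f"rm{opts} ..."` usage in `remove_remote_dir`, so treat that detail as an assumption:

```python
def combine_short_options(**opts: bool) -> str:
    """Combine boolean-flagged short options, e.g. r=True, f=True -> " -rf"."""
    ret_opts = ""
    for opt, include in opts.items():
        if include:
            ret_opts += opt
    # Prepend " -" only when at least one option was included (assumed from usage).
    return f" -{ret_opts}" if ret_opts else ""

print(f"rm{combine_short_options(r=True, f=True)} /tmp/dir")  # -> rm -rf /tmp/dir
```

The leading space lets callers splice the result directly after a command name without conditional formatting.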
* [PATCH v8 17/21] dts: node docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (15 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 16/21] dts: posix and linux sessions " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-11-23 15:13 ` [PATCH v8 18/21] dts: sut and tg nodes " Juraj Linkeš
` (5 subsequent siblings)
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/node.py | 191 +++++++++++++++++++---------
1 file changed, 131 insertions(+), 60 deletions(-)
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index b313b5ad54..6eecbdfd6a 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+A node is any host/server DTS connects to.
+
+The base class, :class:`Node`, provides functionality common to all nodes and is supposed
+to be extended by subclasses with functionalities specific to each node type.
+The :func:`~Node.skip_setup` decorator can be used without subclassing.
"""
from abc import ABC
@@ -35,10 +40,22 @@
class Node(ABC):
- """
- Basic class for node management. This class implements methods that
- manage a node, such as information gathering (of CPU/PCI/NIC) and
- environment setup.
+ """The base class for node management.
+
+ It shouldn't be instantiated, but rather subclassed.
+ It implements common methods to manage any node:
+
+ * Connection to the node,
+ * Hugepages setup.
+
+ Attributes:
+ main_session: The primary OS-aware remote session used to communicate with the node.
+ config: The node configuration.
+ name: The name of the node.
+ lcores: The list of logical cores that DTS can use on the node.
+ It's derived from logical cores present on the node and the test run configuration.
+ ports: The ports of this node specified in the test run configuration.
+ virtual_devices: The virtual devices used on the node.
"""
main_session: OSSession
@@ -52,6 +69,17 @@ class Node(ABC):
virtual_devices: list[VirtualDevice]
def __init__(self, node_config: NodeConfiguration):
+ """Connect to the node and gather info during initialization.
+
+ Extra gathered information:
+
+ * The list of available logical CPUs. This is then filtered by
+ the ``lcores`` configuration in the YAML test run configuration file,
+ * Information about ports from the YAML test run configuration file.
+
+ Args:
+ node_config: The node's test run configuration.
+ """
self.config = node_config
self.name = node_config.name
self._logger = getLogger(self.name)
@@ -60,7 +88,7 @@ def __init__(self, node_config: NodeConfiguration):
self._logger.info(f"Connected to node: {self.name}")
self._get_remote_cpus()
- # filter the node lcores according to user config
+ # filter the node lcores according to the test run configuration
self.lcores = LogicalCoreListFilter(
self.lcores, LogicalCoreList(self.config.lcores)
).filter()
@@ -76,9 +104,14 @@ def _init_ports(self) -> None:
self.configure_port_state(port)
def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- Perform the execution setup that will be done for each execution
- this node is part of.
+ """Execution setup steps.
+
+ Configure hugepages and call :meth:`_set_up_execution` where
+ the rest of the configuration steps (if any) are implemented.
+
+ Args:
+ execution_config: The execution test run configuration according to which
+ the setup steps will be taken.
"""
self._setup_hugepages()
self._set_up_execution(execution_config)
@@ -87,54 +120,70 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
self.virtual_devices.append(VirtualDevice(vdev))
def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution setup steps.
"""
def tear_down_execution(self) -> None:
- """
- Perform the execution teardown that will be done after each execution
- this node is part of concludes.
+ """Execution teardown steps.
+
+ There are currently no execution teardown steps common to all DTS node types.
"""
self.virtual_devices = []
self._tear_down_execution()
def _tear_down_execution(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution teardown steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution teardown steps.
"""
def set_up_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- Perform the build target setup that will be done for each build target
- tested on this node.
+ """Build target setup steps.
+
+ There are currently no build target setup steps common to all DTS node types.
+
+ Args:
+ build_target_config: The build target test run configuration according to which
+ the setup steps will be taken.
"""
self._set_up_build_target(build_target_config)
def _set_up_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional build target setup steps.
"""
def tear_down_build_target(self) -> None:
- """
- Perform the build target teardown that will be done after each build target
- tested on this node.
+ """Build target teardown steps.
+
+ There are currently no build target teardown steps common to all DTS node types.
"""
self._tear_down_build_target()
def _tear_down_build_target(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target teardown steps for subclasses.
+
+ Subclasses should override this if they need to add additional build target teardown steps.
"""
def create_session(self, name: str) -> OSSession:
- """
- Create and return a new OSSession tailored to the remote OS.
+ """Create and return a new OS-aware remote session.
+
+ The returned session won't be used by the node creating it. The session must be used by
+ the caller. The session will be maintained for the entire lifecycle of the node object,
+ at the end of which the session will be cleaned up automatically.
+
+ Note:
+ Any number of these supplementary sessions may be created.
+
+ Args:
+ name: The name of the session.
+
+ Returns:
+ A new OS-aware remote session.
"""
session_name = f"{self.name} {name}"
connection = create_session(
@@ -152,19 +201,19 @@ def create_interactive_shell(
privileged: bool = False,
app_args: str = "",
) -> InteractiveShellType:
- """Create a handler for an interactive session.
+ """Factory for interactive session handlers.
- Instantiate shell_cls according to the remote OS specifics.
+ Instantiate `shell_cls` according to the remote OS specifics.
Args:
shell_cls: The class of the shell.
- timeout: Timeout for reading output from the SSH channel. If you are
- reading from the buffer and don't receive any data within the timeout
- it will throw an error.
+ timeout: Timeout for reading output from the SSH channel. If you are reading from
+ the buffer and don't receive any data within the timeout it will throw an error.
privileged: Whether to run the shell with administrative privileges.
app_args: The arguments to be passed to the application.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not shell_cls.dpdk_app:
shell_cls.path = self.main_session.join_remote_path(shell_cls.path)
@@ -181,14 +230,22 @@ def filter_lcores(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
) -> list[LogicalCore]:
- """
- Filter the LogicalCores found on the Node according to
- a LogicalCoreCount or a LogicalCoreList.
+ """Filter the node's logical cores that DTS can use.
+
+ Logical cores that DTS can use are the ones that are present on the node, but filtered
+ according to the test run configuration. The `filter_specifier` will filter cores from
+ those logical cores.
+
+ Args:
+ filter_specifier: Two different filters can be used, one that specifies the number
+ of logical cores per core, cores per socket and the number of sockets,
+ and another one that specifies a logical core list.
+ ascending: If :data:`True`, use cores with the lowest numerical id first and continue
+ in ascending order. If :data:`False`, start with the highest id and continue
+ in descending order. This ordering affects which sockets to consider first as well.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+ Returns:
+ The filtered logical cores.
"""
self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
return lcore_filter(
@@ -198,17 +255,14 @@ def filter_lcores(
).filter()
def _get_remote_cpus(self) -> None:
- """
- Scan CPUs in the remote OS and store a list of LogicalCores.
- """
+ """Scan CPUs in the remote OS and store a list of LogicalCores."""
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
def _setup_hugepages(self) -> None:
- """
- Setup hugepages on the Node. Different architectures can supply different
- amounts of memory for hugepages and numa-based hugepage allocation may need
- to be considered.
+ """Setup hugepages on the node.
+
+ Configure the hugepages only if they're specified in the node's test run configuration.
"""
if self.config.hugepages:
self.main_session.setup_hugepages(
@@ -216,8 +270,11 @@ def _setup_hugepages(self) -> None:
)
def configure_port_state(self, port: Port, enable: bool = True) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port`.
+
+ Args:
+ port: The port to enable/disable.
+ enable: :data:`True` to enable, :data:`False` to disable.
"""
self.main_session.configure_port_state(port, enable)
@@ -227,15 +284,17 @@ def configure_port_ip_address(
port: Port,
delete: bool = False,
) -> None:
- """
- Configure the IP address of a port on this node.
+ """Add an IP address to `port` on this node.
+
+ Args:
+ address: The IP address with mask in CIDR format. Can be either IPv4 or IPv6.
+ port: The port to which to add the address.
+ delete: If :data:`True`, will delete the address from the port instead of adding it.
"""
self.main_session.configure_port_ip_address(address, port, delete)
def close(self) -> None:
- """
- Close all connections and free other resources.
- """
+ """Close all connections and free other resources."""
if self.main_session:
self.main_session.close()
for session in self._other_sessions:
@@ -244,6 +303,11 @@ def close(self) -> None:
@staticmethod
def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+ """Skip the decorated function.
+
+ The :option:`--skip-setup` command line argument and the :envvar:`DTS_SKIP_SETUP`
+ environment variable enable the decorator.
+ """
if SETTINGS.skip_setup:
return lambda *args: None
else:
@@ -251,6 +315,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
def create_session(node_config: NodeConfiguration, name: str, logger: DTSLOG) -> OSSession:
+ """Factory for OS-aware sessions.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
match node_config.os:
case OS.linux:
return LinuxSession(node_config, name, logger)
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
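The `skip_setup` decorator documented in the node patch above replaces the decorated function with a no-op when setup is skipped. A minimal standalone sketch; the real decorator reads `SETTINGS.skip_setup` (driven by :option:`--skip-setup` / :envvar:`DTS_SKIP_SETUP`), which is replaced here by a plain module-level boolean:

```python
from typing import Any, Callable

SKIP_SETUP = True  # stand-in for SETTINGS.skip_setup in the patch

def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
    """Return a no-op when setup should be skipped, otherwise the function itself."""
    if SKIP_SETUP:
        return lambda *args: None
    return func

@skip_setup
def set_up_node(node: str) -> str:
    return f"setting up {node}"

print(set_up_node("sut1"))  # prints "None" because SKIP_SETUP is True
```

Because the decision is made at decoration time, the skipped function body is never entered during the run.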
* [PATCH v8 18/21] dts: sut and tg nodes docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (16 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 17/21] dts: node " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-12-01 18:06 ` Jeremy Spewock
2023-11-23 15:13 ` [PATCH v8 19/21] dts: base traffic generators " Juraj Linkeš
` (4 subsequent siblings)
22 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/sut_node.py | 230 ++++++++++++++++--------
dts/framework/testbed_model/tg_node.py | 42 +++--
2 files changed, 176 insertions(+), 96 deletions(-)
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 5ce9446dba..c4acea38d1 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -3,6 +3,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""System under test (DPDK + hardware) node.
+
+A system under test (SUT) is the combination of DPDK
+and the hardware we're testing with DPDK (NICs, crypto and other devices).
+An SUT node is where this SUT runs.
+"""
+
+
import os
import tarfile
import time
@@ -26,6 +34,11 @@
class EalParameters(object):
+ """The environment abstraction layer parameters.
+
+ Converting the instance to a string produces the EAL parameter string for the DPDK command line.
+ """
+
def __init__(
self,
lcore_list: LogicalCoreList,
@@ -35,21 +48,23 @@ def __init__(
vdevs: list[VirtualDevice],
other_eal_param: str,
):
- """
- Generate eal parameters character string;
- :param lcore_list: the list of logical cores to use.
- :param memory_channels: the number of memory channels to use.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
+ """Initialize the parameters according to inputs.
+
+ Process the parameters into the format used on the command line.
+
+ Args:
+ lcore_list: The list of logical cores to use.
+ memory_channels: The number of memory channels to use.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``
"""
self._lcore_list = f"-l {lcore_list}"
self._memory_channels = f"-n {memory_channels}"
@@ -61,6 +76,7 @@ def __init__(
self._other_eal_param = other_eal_param
def __str__(self) -> str:
+ """Create the EAL string."""
return (
f"{self._lcore_list} "
f"{self._memory_channels} "
@@ -72,11 +88,21 @@ def __str__(self) -> str:
class SutNode(Node):
- """
- A class for managing connections to the System under Test, providing
- methods that retrieve the necessary information about the node (such as
- CPU, memory and NIC details) and configuration capabilities.
- Another key capability is building DPDK according to given build target.
+ """The system under test node.
+
+ The SUT node extends :class:`Node` with DPDK specific features:
+
+ * DPDK build,
+ * Gathering of DPDK build info,
+ * The running of DPDK apps, interactively or one-time execution,
+ * DPDK apps cleanup.
+
+ The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
+ environment variable configure the path to the DPDK tarball
+ or the git commit ID, tag ID or tree ID to test.
+
+ Attributes:
+ config: The SUT node configuration
"""
config: SutNodeConfiguration
@@ -94,6 +120,11 @@ class SutNode(Node):
_path_to_devbind_script: PurePath | None
def __init__(self, node_config: SutNodeConfiguration):
+ """Extend the constructor with SUT node specifics.
+
+ Args:
+ node_config: The SUT node's test run configuration.
+ """
super(SutNode, self).__init__(node_config)
self._dpdk_prefix_list = []
self._build_target_config = None
@@ -113,6 +144,12 @@ def __init__(self, node_config: SutNodeConfiguration):
@property
def _remote_dpdk_dir(self) -> PurePath:
+ """The remote DPDK dir.
+
+ This internal property should be set after extracting the DPDK tarball. If it's not set,
+ that implies the DPDK setup step has been skipped, in which case we can guess where
+ a previous build was located.
+ """
if self.__remote_dpdk_dir is None:
self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
return self.__remote_dpdk_dir
@@ -123,6 +160,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
@property
def remote_dpdk_build_dir(self) -> PurePath:
+ """The remote DPDK build directory.
+
+ This is the directory where DPDK was built.
+ We assume it was built in a subdirectory of the extracted tarball.
+ """
if self._build_target_config:
return self.main_session.join_remote_path(
self._remote_dpdk_dir, self._build_target_config.name
@@ -132,18 +174,21 @@ def remote_dpdk_build_dir(self) -> PurePath:
@property
def dpdk_version(self) -> str:
+ """Last built DPDK version."""
if self._dpdk_version is None:
self._dpdk_version = self.main_session.get_dpdk_version(self._remote_dpdk_dir)
return self._dpdk_version
@property
def node_info(self) -> NodeInfo:
+ """Additional node information."""
if self._node_info is None:
self._node_info = self.main_session.get_node_info()
return self._node_info
@property
def compiler_version(self) -> str:
+ """The node's compiler version."""
if self._compiler_version is None:
if self._build_target_config is not None:
self._compiler_version = self.main_session.get_compiler_version(
@@ -158,6 +203,7 @@ def compiler_version(self) -> str:
@property
def path_to_devbind_script(self) -> PurePath:
+ """The path to the dpdk-devbind.py script on the node."""
if self._path_to_devbind_script is None:
self._path_to_devbind_script = self.main_session.join_remote_path(
self._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
@@ -165,6 +211,11 @@ def path_to_devbind_script(self) -> PurePath:
return self._path_to_devbind_script
def get_build_target_info(self) -> BuildTargetInfo:
+ """Get additional build target information.
+
+ Returns:
+ The build target information,
+ """
return BuildTargetInfo(
dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
)
@@ -173,8 +224,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
return self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
def _set_up_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- Setup DPDK on the SUT node.
+ """Setup DPDK on the SUT node.
+
+ Additional build target setup steps on top of those in :class:`Node`.
"""
# we want to ensure that dpdk_version and compiler_version is reset for new
# build targets
@@ -186,16 +238,14 @@ def _set_up_build_target(self, build_target_config: BuildTargetConfiguration) ->
self.bind_ports_to_driver()
def _tear_down_build_target(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Bind ports to the operating system drivers.
+
+ Additional build target teardown steps on top of those in :class:`Node`.
"""
self.bind_ports_to_driver(for_dpdk=False)
def _configure_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- Populate common environment variables and set build target config.
- """
+ """Populate common environment variables and set build target config."""
self._env_vars = {}
self._build_target_config = build_target_config
self._env_vars.update(self.main_session.get_dpdk_build_env_vars(build_target_config.arch))
@@ -207,9 +257,7 @@ def _configure_build_target(self, build_target_config: BuildTargetConfiguration)
@Node.skip_setup
def _copy_dpdk_tarball(self) -> None:
- """
- Copy to and extract DPDK tarball on the SUT node.
- """
+ """Copy to and extract DPDK tarball on the SUT node."""
self._logger.info("Copying DPDK tarball to SUT.")
self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
@@ -238,8 +286,9 @@ def _copy_dpdk_tarball(self) -> None:
@Node.skip_setup
def _build_dpdk(self) -> None:
- """
- Build DPDK. Uses the already configured target. Assumes that the tarball has
+ """Build DPDK.
+
+ Uses the already configured target. Assumes that the tarball has
already been copied to and extracted on the SUT node.
"""
self.main_session.build_dpdk(
@@ -250,15 +299,19 @@ def _build_dpdk(self) -> None:
)
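The build step above delegates to the session's `build_dpdk` using the previously configured environment variables and target. As a rough, hypothetical sketch of the kind of command sequence involved (the actual invocation lives in the OS session code, not in this patch; the meson/ninja pair and helper name below are assumptions for illustration):

```python
def dpdk_build_commands(remote_dpdk_dir: str, build_dir: str, env_vars: dict[str, str]) -> list[str]:
    """Compose a plausible build command sequence for an extracted DPDK tree.

    Environment variables are exported before configuring; the meson/ninja
    pair is an assumption about the underlying build, not the real DTS code.
    """
    exports = [f"export {key}={value}" for key, value in env_vars.items()]
    return exports + [
        f"meson setup {build_dir} {remote_dpdk_dir}",
        f"ninja -C {build_dir}",
    ]

cmds = dpdk_build_commands("/tmp/dpdk", "/tmp/dpdk/build", {"CC": "gcc"})
print(cmds[-1])  # ninja -C /tmp/dpdk/build
```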
def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePath:
- """
- Build one or all DPDK apps. Requires DPDK to be already built on the SUT node.
- When app_name is 'all', build all example apps.
- When app_name is any other string, tries to build that example app.
- Return the directory path of the built app. If building all apps, return
- the path to the examples directory (where all apps reside).
- The meson_dpdk_args are keyword arguments
- found in meson_option.txt in root DPDK directory. Do not use -D with them,
- for example: enable_kmods=True.
+ """Build one or all DPDK apps.
+
+ Requires DPDK to be already built on the SUT node.
+
+ Args:
+ app_name: The name of the DPDK app to build.
+ When `app_name` is ``all``, build all example apps.
+ meson_dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Returns:
+ The directory path of the built app. If building all apps, return
+ the path to the examples directory (where all apps reside).
"""
self.main_session.build_dpdk(
self._env_vars,
@@ -277,9 +330,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
)
def kill_cleanup_dpdk_apps(self) -> None:
- """
- Kill all dpdk applications on the SUT. Cleanup hugepages.
- """
+ """Kill all dpdk applications on the SUT, then clean up hugepages."""
if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
# we can use the session if it exists and responds
self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
@@ -298,33 +349,34 @@ def create_eal_parameters(
vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
- """
- Generate eal parameters character string;
- :param lcore_filter_specifier: a number of lcores/cores/sockets to use
- or a list of lcore ids to use.
- The default will select one lcore for each of two cores
- on one socket, in ascending order of core ids.
- :param ascending_cores: True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the
- highest id and continue in descending order. This ordering
- affects which sockets to consider first as well.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param append_prefix_timestamp: if True, will append a timestamp to
- DPDK file prefix.
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
- :return: eal param string, eg:
- '-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420';
- """
+ """Compose the EAL parameters.
+
+ Process the list of cores and the DPDK prefix and pass that along with
+ the rest of the arguments.
+ Args:
+ lcore_filter_specifier: A number of lcores/cores/sockets to use
+ or a list of lcore ids to use.
+ The default will select one lcore for each of two cores
+ on one socket, in ascending order of core ids.
+ ascending_cores: Sort cores in ascending order (lowest to highest IDs).
+ If :data:`False`, sort in descending order.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ append_prefix_timestamp: If :data:`True`, will append a timestamp to DPDK file prefix.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``.
+
+ Returns:
+ An EAL param string, such as
+ ``-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420``.
+ """
lcore_list = LogicalCoreList(self.filter_lcores(lcore_filter_specifier, ascending_cores))
if append_prefix_timestamp:
@@ -348,14 +400,29 @@ def create_eal_parameters(
def run_dpdk_app(
self, app_path: PurePath, eal_args: "EalParameters", timeout: float = 30
) -> CommandResult:
- """
- Run DPDK application on the remote node.
+ """Run DPDK application on the remote node.
+
+ The application is not run interactively - the command that starts the application
+ is executed and then the call waits for it to finish execution.
+
+ Args:
+ app_path: The remote path to the DPDK application.
+ eal_args: EAL parameters to run the DPDK application with.
+ timeout: Wait at most this long in seconds for the app execution to complete.
+
+ Returns:
+ The result of the DPDK app execution.
"""
return self.main_session.send_command(
f"{app_path} {eal_args}", timeout, privileged=True, verify=True
)
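The non-interactive execution described in `run_dpdk_app` can be sketched with the standard library as a local stand-in for the remote session's `send_command` (the app path and EAL arguments below are placeholders, with `echo` standing in for a DPDK binary):

```python
import subprocess

def run_app_sketch(app_path: str, eal_args: str, timeout: float = 30):
    """Start the app, then block until it exits or the timeout elapses.

    A simplified local stand-in: real DTS sends the same command string over
    a remote session with privileged=True and verify=True instead.
    """
    return subprocess.run(
        f"{app_path} {eal_args}".split(),  # naive split, fine for this sketch
        capture_output=True,
        text=True,
        timeout=timeout,
    )

result = run_app_sketch("echo", "-l 0-3 -n 4")  # 'echo' stands in for a DPDK app
print(result.stdout.strip())  # -l 0-3 -n 4
```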
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Enable/disable IPv4 forwarding on the node.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
+ """
self.main_session.configure_ipv4_forwarding(enable)
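A hedged sketch of what `configure_ipv4_forwarding` might send on a Linux session (the actual command lives in the OS session implementation, which this hunk doesn't show; the `sysctl` form and helper name are assumptions):

```python
def ipv4_forwarding_command(enable: bool) -> str:
    """Compose a plausible sysctl invocation; 1 enables forwarding, 0 disables it."""
    state = 1 if enable else 0
    return f"sysctl -w net.ipv4.ip_forward={state}"

print(ipv4_forwarding_command(True))   # sysctl -w net.ipv4.ip_forward=1
print(ipv4_forwarding_command(False))  # sysctl -w net.ipv4.ip_forward=0
```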
def create_interactive_shell(
@@ -365,9 +432,13 @@ def create_interactive_shell(
privileged: bool = False,
eal_parameters: EalParameters | str | None = None,
) -> InteractiveShellType:
- """Factory method for creating a handler for an interactive session.
+ """Extend the factory for interactive session handlers.
+
+ The extensions are SUT node specific:
- Instantiate shell_cls according to the remote OS specifics.
+ * The default for `eal_parameters`,
+ * The interactive shell path `shell_cls.path` is prepended with path to the remote
+ DPDK build directory for DPDK apps.
Args:
shell_cls: The class of the shell.
@@ -377,9 +448,10 @@ def create_interactive_shell(
privileged: Whether to run the shell with administrative privileges.
eal_parameters: List of EAL parameters to use to launch the app. If this
isn't provided or an empty string is passed, it will default to calling
- create_eal_parameters().
+ :meth:`create_eal_parameters`.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not eal_parameters:
eal_parameters = self.create_eal_parameters()
@@ -396,8 +468,8 @@ def bind_ports_to_driver(self, for_dpdk: bool = True) -> None:
"""Bind all ports on the SUT to a driver.
Args:
- for_dpdk: Boolean that, when True, binds ports to os_driver_for_dpdk
- or, when False, binds to os_driver. Defaults to True.
+ for_dpdk: If :data:`True`, binds ports to os_driver_for_dpdk.
+ If :data:`False`, binds to os_driver.
"""
for port in self.ports:
driver = port.os_driver_for_dpdk if for_dpdk else port.os_driver
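The EAL string composition documented in this file (the `EalParameters` class and `create_eal_parameters`) can be sketched as a minimal stand-in; the real class takes more arguments (`no_pci`, `vdevs`, `other_eal_param`), so the class below is a simplified illustration:

```python
class EalParametersSketch:
    """Minimal model of how EAL options compose into a command-line string."""

    def __init__(self, lcore_list: str, memory_channels: int, prefix: str):
        self._lcore_list = f"-l {lcore_list}"
        self._memory_channels = f"-n {memory_channels}"
        self._prefix = f"--file-prefix={prefix}" if prefix else ""

    def __str__(self) -> str:
        # Join only the non-empty parts with single spaces.
        parts = (self._lcore_list, self._memory_channels, self._prefix)
        return " ".join(part for part in parts if part)

print(EalParametersSketch("0-3", 4, "dpdk_vf"))  # -l 0-3 -n 4 --file-prefix=dpdk_vf
```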
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 8a8f0019f3..f269d4c585 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -5,13 +5,8 @@
"""Traffic generator node.
-This is the node where the traffic generator resides.
-The distinction between a node and a traffic generator is as follows:
-A node is a host that DTS connects to. It could be a baremetal server,
-a VM or a container.
-A traffic generator is software running on the node.
-A traffic generator node is a node running a traffic generator.
-A node can be a traffic generator node as well as system under test node.
+A traffic generator (TG) generates traffic that's sent towards the SUT node.
+A TG node is where the TG runs.
"""
from scapy.packet import Packet # type: ignore[import]
@@ -24,13 +19,16 @@
class TGNode(Node):
- """Manage connections to a node with a traffic generator.
+ """The traffic generator node.
- Apart from basic node management capabilities, the Traffic Generator node has
- specialized methods for handling the traffic generator running on it.
+ The TG node extends :class:`Node` with TG specific features:
- Arguments:
- node_config: The user configuration of the traffic generator node.
+ * Traffic generator initialization,
+ * The sending of traffic and receiving packets,
+ * The sending of traffic without receiving packets.
+
+ Not all traffic generators are capable of capturing traffic, which is why there
+ must be a way to send traffic without that.
Attributes:
traffic_generator: The traffic generator running on the node.
@@ -39,6 +37,13 @@ class TGNode(Node):
traffic_generator: CapturingTrafficGenerator
def __init__(self, node_config: TGNodeConfiguration):
+ """Extend the constructor with TG node specifics.
+
+ Initialize the traffic generator on the TG node.
+
+ Args:
+ node_config: The TG node's test run configuration.
+ """
super(TGNode, self).__init__(node_config)
self.traffic_generator = create_traffic_generator(self, node_config.traffic_generator)
self._logger.info(f"Created node: {self.name}")
@@ -50,17 +55,17 @@ def send_packet_and_capture(
receive_port: Port,
duration: float = 1,
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet`, return received traffic.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given duration. Also record the captured traffic
in a pcap file.
Args:
packet: The packet to send.
send_port: The egress port on the TG node.
receive_port: The ingress port in the TG node.
- duration: Capture traffic for this amount of time after sending the packet.
+ duration: Capture traffic for this amount of time after sending `packet`.
Returns:
A list of received packets. May be empty if no packets are captured.
@@ -70,6 +75,9 @@ def send_packet_and_capture(
)
def close(self) -> None:
- """Free all resources used by the node"""
+ """Free all resources used by the node.
+
+ This extends the superclass method with TG cleanup.
+ """
self.traffic_generator.close()
super(TGNode, self).close()
--
2.34.1
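The patch converts docstrings to the Google format mentioned in the commit message; as a minimal sketch of that convention (Sphinx's napoleon extension parses the `Args:`/`Returns:` sections; the function itself is a made-up example, not DTS code):

```python
def join_remote_path(*parts: str) -> str:
    """Join path fragments with forward slashes.

    Args:
        parts: The path fragments to join; empty fragments are skipped.

    Returns:
        The joined remote path.
    """
    return "/".join(part for part in parts if part)

print(join_remote_path("usertools", "dpdk-devbind.py"))  # usertools/dpdk-devbind.py
```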
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v8 18/21] dts: sut and tg nodes docstring update
2023-11-23 15:13 ` [PATCH v8 18/21] dts: sut and tg nodes " Juraj Linkeš
@ 2023-12-01 18:06 ` Jeremy Spewock
2023-12-04 10:02 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-12-01 18:06 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> dts/framework/testbed_model/sut_node.py | 230 ++++++++++++++++--------
> dts/framework/testbed_model/tg_node.py | 42 +++--
> 2 files changed, 176 insertions(+), 96 deletions(-)
>
> diff --git a/dts/framework/testbed_model/sut_node.py
> b/dts/framework/testbed_model/sut_node.py
> index 5ce9446dba..c4acea38d1 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -3,6 +3,14 @@
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
> # Copyright(c) 2023 University of New Hampshire
>
> +"""System under test (DPDK + hardware) node.
> +
> +A system under test (SUT) is the combination of DPDK
> +and the hardware we're testing with DPDK (NICs, crypto and other devices).
> +An SUT node is where this SUT runs.
> +"""
>
I think this should just be "A SUT node"
> +
> +
> import os
> import tarfile
> import time
> @@ -26,6 +34,11 @@
>
>
> class EalParameters(object):
> + """The environment abstraction layer parameters.
> +
> + The string representation can be created by converting the instance
> to a string.
> + """
> +
> def __init__(
> self,
> lcore_list: LogicalCoreList,
> @@ -35,21 +48,23 @@ def __init__(
> vdevs: list[VirtualDevice],
> other_eal_param: str,
> ):
> - """
> - Generate eal parameters character string;
> - :param lcore_list: the list of logical cores to use.
> - :param memory_channels: the number of memory channels to use.
> - :param prefix: set file prefix string, eg:
> - prefix='vf'
> - :param no_pci: switch of disable PCI bus eg:
> - no_pci=True
> - :param vdevs: virtual device list, eg:
> - vdevs=[
> - VirtualDevice('net_ring0'),
> - VirtualDevice('net_ring1')
> - ]
> - :param other_eal_param: user defined DPDK eal parameters, eg:
> - other_eal_param='--single-file-segments'
> + """Initialize the parameters according to inputs.
> +
> + Process the parameters into the format used on the command line.
> +
> + Args:
> + lcore_list: The list of logical cores to use.
> + memory_channels: The number of memory channels to use.
> + prefix: Set the file prefix string with which to start DPDK,
> e.g.: ``prefix='vf'``.
> + no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
> + vdevs: Virtual devices, e.g.::
> +
> + vdevs=[
> + VirtualDevice('net_ring0'),
> + VirtualDevice('net_ring1')
> + ]
> + other_eal_param: user defined DPDK EAL parameters, e.g.:
> + ``other_eal_param='--single-file-segments'``
> """
> self._lcore_list = f"-l {lcore_list}"
> self._memory_channels = f"-n {memory_channels}"
> @@ -61,6 +76,7 @@ def __init__(
> self._other_eal_param = other_eal_param
>
> def __str__(self) -> str:
> + """Create the EAL string."""
> return (
> f"{self._lcore_list} "
> f"{self._memory_channels} "
> @@ -72,11 +88,21 @@ def __str__(self) -> str:
>
>
> class SutNode(Node):
> - """
> - A class for managing connections to the System under Test, providing
> - methods that retrieve the necessary information about the node (such
> as
> - CPU, memory and NIC details) and configuration capabilities.
> - Another key capability is building DPDK according to given build
> target.
> + """The system under test node.
> +
> + The SUT node extends :class:`Node` with DPDK specific features:
> +
> + * DPDK build,
> + * Gathering of DPDK build info,
> + * The running of DPDK apps, interactively or one-time execution,
> + * DPDK apps cleanup.
> +
> + The :option:`--tarball` command line argument and the
> :envvar:`DTS_DPDK_TARBALL`
> + environment variable configure the path to the DPDK tarball
> + or the git commit ID, tag ID or tree ID to test.
> +
> + Attributes:
> + config: The SUT node configuration
> """
>
> config: SutNodeConfiguration
> @@ -94,6 +120,11 @@ class SutNode(Node):
> _path_to_devbind_script: PurePath | None
>
> def __init__(self, node_config: SutNodeConfiguration):
> + """Extend the constructor with SUT node specifics.
> +
> + Args:
> + node_config: The SUT node's test run configuration.
> + """
> super(SutNode, self).__init__(node_config)
> self._dpdk_prefix_list = []
> self._build_target_config = None
> @@ -113,6 +144,12 @@ def __init__(self, node_config: SutNodeConfiguration):
>
> @property
> def _remote_dpdk_dir(self) -> PurePath:
> + """The remote DPDK dir.
> +
> + This internal property should be set after extracting the DPDK
> tarball. If it's not set,
> + that implies the DPDK setup step has been skipped, in which case
> we can guess where
> + a previous build was located.
> + """
> if self.__remote_dpdk_dir is None:
> self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
> return self.__remote_dpdk_dir
> @@ -123,6 +160,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
>
> @property
> def remote_dpdk_build_dir(self) -> PurePath:
> + """The remote DPDK build directory.
> +
> + This is the directory where DPDK was built.
> + We assume it was built in a subdirectory of the extracted tarball.
> + """
> if self._build_target_config:
> return self.main_session.join_remote_path(
> self._remote_dpdk_dir, self._build_target_config.name
> @@ -132,18 +174,21 @@ def remote_dpdk_build_dir(self) -> PurePath:
>
> @property
> def dpdk_version(self) -> str:
> + """Last built DPDK version."""
> if self._dpdk_version is None:
> self._dpdk_version =
> self.main_session.get_dpdk_version(self._remote_dpdk_dir)
> return self._dpdk_version
>
> @property
> def node_info(self) -> NodeInfo:
> + """Additional node information."""
> if self._node_info is None:
> self._node_info = self.main_session.get_node_info()
> return self._node_info
>
> @property
> def compiler_version(self) -> str:
> + """The node's compiler version."""
> if self._compiler_version is None:
> if self._build_target_config is not None:
> self._compiler_version =
> self.main_session.get_compiler_version(
> @@ -158,6 +203,7 @@ def compiler_version(self) -> str:
>
> @property
> def path_to_devbind_script(self) -> PurePath:
> + """The path to the dpdk-devbind.py script on the node."""
> if self._path_to_devbind_script is None:
> self._path_to_devbind_script =
> self.main_session.join_remote_path(
> self._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
> @@ -165,6 +211,11 @@ def path_to_devbind_script(self) -> PurePath:
> return self._path_to_devbind_script
>
> def get_build_target_info(self) -> BuildTargetInfo:
> + """Get additional build target information.
> +
> + Returns:
> + The build target information,
> + """
> return BuildTargetInfo(
> dpdk_version=self.dpdk_version,
> compiler_version=self.compiler_version
> )
> @@ -173,8 +224,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
> return
> self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
>
> def _set_up_build_target(self, build_target_config:
> BuildTargetConfiguration) -> None:
> - """
> - Setup DPDK on the SUT node.
> + """Setup DPDK on the SUT node.
> +
> + Additional build target setup steps on top of those in
> :class:`Node`.
> """
> # we want to ensure that dpdk_version and compiler_version is
> reset for new
> # build targets
> @@ -186,16 +238,14 @@ def _set_up_build_target(self, build_target_config:
> BuildTargetConfiguration) ->
> self.bind_ports_to_driver()
>
> def _tear_down_build_target(self) -> None:
> - """
> - This method exists to be optionally overwritten by derived
> classes and
> - is not decorated so that the derived class doesn't have to use
> the decorator.
> + """Bind ports to the operating system drivers.
> +
> + Additional build target teardown steps on top of those in
> :class:`Node`.
> """
> self.bind_ports_to_driver(for_dpdk=False)
>
> def _configure_build_target(self, build_target_config:
> BuildTargetConfiguration) -> None:
> - """
> - Populate common environment variables and set build target config.
> - """
> + """Populate common environment variables and set build target
> config."""
> self._env_vars = {}
> self._build_target_config = build_target_config
>
> self._env_vars.update(self.main_session.get_dpdk_build_env_vars(build_target_config.arch))
> @@ -207,9 +257,7 @@ def _configure_build_target(self, build_target_config:
> BuildTargetConfiguration)
>
> @Node.skip_setup
> def _copy_dpdk_tarball(self) -> None:
> - """
> - Copy to and extract DPDK tarball on the SUT node.
> - """
> + """Copy to and extract DPDK tarball on the SUT node."""
> self._logger.info("Copying DPDK tarball to SUT.")
> self.main_session.copy_to(SETTINGS.dpdk_tarball_path,
> self._remote_tmp_dir)
>
> @@ -238,8 +286,9 @@ def _copy_dpdk_tarball(self) -> None:
>
> @Node.skip_setup
> def _build_dpdk(self) -> None:
> - """
> - Build DPDK. Uses the already configured target. Assumes that the
> tarball has
> + """Build DPDK.
> +
> + Uses the already configured target. Assumes that the tarball has
> already been copied to and extracted on the SUT node.
> """
> self.main_session.build_dpdk(
> @@ -250,15 +299,19 @@ def _build_dpdk(self) -> None:
> )
>
> def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str |
> bool) -> PurePath:
> - """
> - Build one or all DPDK apps. Requires DPDK to be already built on
> the SUT node.
> - When app_name is 'all', build all example apps.
> - When app_name is any other string, tries to build that example
> app.
> - Return the directory path of the built app. If building all apps,
> return
> - the path to the examples directory (where all apps reside).
> - The meson_dpdk_args are keyword arguments
> - found in meson_option.txt in root DPDK directory. Do not use -D
> with them,
> - for example: enable_kmods=True.
> + """Build one or all DPDK apps.
> +
> + Requires DPDK to be already built on the SUT node.
> +
> + Args:
> + app_name: The name of the DPDK app to build.
> + When `app_name` is ``all``, build all example apps.
> + meson_dpdk_args: The arguments found in ``meson_options.txt``
> in root DPDK directory.
> + Do not use ``-D`` with them.
> +
> + Returns:
> + The directory path of the built app. If building all apps,
> return
> + the path to the examples directory (where all apps reside).
> """
> self.main_session.build_dpdk(
> self._env_vars,
> @@ -277,9 +330,7 @@ def build_dpdk_app(self, app_name: str,
> **meson_dpdk_args: str | bool) -> PurePa
> )
>
> def kill_cleanup_dpdk_apps(self) -> None:
> - """
> - Kill all dpdk applications on the SUT. Cleanup hugepages.
> - """
> + """Kill all dpdk applications on the SUT, then clean up
> hugepages."""
> if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
> # we can use the session if it exists and responds
>
> self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
> @@ -298,33 +349,34 @@ def create_eal_parameters(
> vdevs: list[VirtualDevice] | None = None,
> other_eal_param: str = "",
> ) -> "EalParameters":
> - """
> - Generate eal parameters character string;
> - :param lcore_filter_specifier: a number of lcores/cores/sockets
> to use
> - or a list of lcore ids to use.
> - The default will select one lcore for each of two
> cores
> - on one socket, in ascending order of core ids.
> - :param ascending_cores: True, use cores with the lowest numerical
> id first
> - and continue in ascending order. If False, start
> with the
> - highest id and continue in descending order. This
> ordering
> - affects which sockets to consider first as well.
> - :param prefix: set file prefix string, eg:
> - prefix='vf'
> - :param append_prefix_timestamp: if True, will append a timestamp
> to
> - DPDK file prefix.
> - :param no_pci: switch of disable PCI bus eg:
> - no_pci=True
> - :param vdevs: virtual device list, eg:
> - vdevs=[
> - VirtualDevice('net_ring0'),
> - VirtualDevice('net_ring1')
> - ]
> - :param other_eal_param: user defined DPDK eal parameters, eg:
> - other_eal_param='--single-file-segments'
> - :return: eal param string, eg:
> - '-c 0xf -a 0000:88:00.0
> --file-prefix=dpdk_1112_20190809143420';
> - """
> + """Compose the EAL parameters.
> +
> + Process the list of cores and the DPDK prefix and pass that along
> with
> + the rest of the arguments.
>
> + Args:
> + lcore_filter_specifier: A number of lcores/cores/sockets to
> use
> + or a list of lcore ids to use.
> + The default will select one lcore for each of two cores
> + on one socket, in ascending order of core ids.
> + ascending_cores: Sort cores in ascending order (lowest to
> highest IDs).
> + If :data:`False`, sort in descending order.
> + prefix: Set the file prefix string with which to start DPDK,
> e.g.: ``prefix='vf'``.
> + append_prefix_timestamp: If :data:`True`, will append a
> timestamp to DPDK file prefix.
> + no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
> + vdevs: Virtual devices, e.g.::
> +
> + vdevs=[
> + VirtualDevice('net_ring0'),
> + VirtualDevice('net_ring1')
> + ]
> + other_eal_param: user defined DPDK EAL parameters, e.g.:
> + ``other_eal_param='--single-file-segments'``.
> +
> + Returns:
> + An EAL param string, such as
> + ``-c 0xf -a 0000:88:00.0
> --file-prefix=dpdk_1112_20190809143420``.
> + """
> lcore_list =
> LogicalCoreList(self.filter_lcores(lcore_filter_specifier, ascending_cores))
>
> if append_prefix_timestamp:
> @@ -348,14 +400,29 @@ def create_eal_parameters(
> def run_dpdk_app(
> self, app_path: PurePath, eal_args: "EalParameters", timeout:
> float = 30
> ) -> CommandResult:
> - """
> - Run DPDK application on the remote node.
> + """Run DPDK application on the remote node.
> +
> + The application is not run interactively - the command that
> starts the application
> + is executed and then the call waits for it to finish execution.
> +
> + Args:
> + app_path: The remote path to the DPDK application.
> + eal_args: EAL parameters to run the DPDK application with.
> + timeout: Wait at most this long in seconds for `command`
> execution to complete.
> +
> + Returns:
> + The result of the DPDK app execution.
> """
> return self.main_session.send_command(
> f"{app_path} {eal_args}", timeout, privileged=True,
> verify=True
> )
>
> def configure_ipv4_forwarding(self, enable: bool) -> None:
> + """Enable/disable IPv4 forwarding on the node.
> +
> + Args:
> + enable: If :data:`True`, enable the forwarding, otherwise
> disable it.
> + """
> self.main_session.configure_ipv4_forwarding(enable)
>
> def create_interactive_shell(
> @@ -365,9 +432,13 @@ def create_interactive_shell(
> privileged: bool = False,
> eal_parameters: EalParameters | str | None = None,
> ) -> InteractiveShellType:
> - """Factory method for creating a handler for an interactive
> session.
> + """Extend the factory for interactive session handlers.
> +
> + The extensions are SUT node specific:
>
> - Instantiate shell_cls according to the remote OS specifics.
> + * The default for `eal_parameters`,
> + * The interactive shell path `shell_cls.path` is prepended
> with path to the remote
> + DPDK build directory for DPDK apps.
>
> Args:
> shell_cls: The class of the shell.
> @@ -377,9 +448,10 @@ def create_interactive_shell(
> privileged: Whether to run the shell with administrative
> privileges.
> eal_parameters: List of EAL parameters to use to launch the
> app. If this
> isn't provided or an empty string is passed, it will
> default to calling
> - create_eal_parameters().
> + :meth:`create_eal_parameters`.
> +
> Returns:
> - Instance of the desired interactive application.
> + An instance of the desired interactive application shell.
> """
> if not eal_parameters:
> eal_parameters = self.create_eal_parameters()
> @@ -396,8 +468,8 @@ def bind_ports_to_driver(self, for_dpdk: bool = True)
> -> None:
> """Bind all ports on the SUT to a driver.
>
> Args:
> - for_dpdk: Boolean that, when True, binds ports to
> os_driver_for_dpdk
> - or, when False, binds to os_driver. Defaults to True.
> + for_dpdk: If :data:`True`, binds ports to os_driver_for_dpdk.
> + If :data:`False`, binds to os_driver.
> """
> for port in self.ports:
> driver = port.os_driver_for_dpdk if for_dpdk else
> port.os_driver
> diff --git a/dts/framework/testbed_model/tg_node.py
> b/dts/framework/testbed_model/tg_node.py
> index 8a8f0019f3..f269d4c585 100644
> --- a/dts/framework/testbed_model/tg_node.py
> +++ b/dts/framework/testbed_model/tg_node.py
> @@ -5,13 +5,8 @@
>
> """Traffic generator node.
>
> -This is the node where the traffic generator resides.
> -The distinction between a node and a traffic generator is as follows:
> -A node is a host that DTS connects to. It could be a baremetal server,
> -a VM or a container.
> -A traffic generator is software running on the node.
> -A traffic generator node is a node running a traffic generator.
> -A node can be a traffic generator node as well as system under test node.
> +A traffic generator (TG) generates traffic that's sent towards the SUT
> node.
> +A TG node is where the TG runs.
> """
>
> from scapy.packet import Packet # type: ignore[import]
> @@ -24,13 +19,16 @@
>
>
> class TGNode(Node):
> - """Manage connections to a node with a traffic generator.
> + """The traffic generator node.
>
> - Apart from basic node management capabilities, the Traffic Generator
> node has
> - specialized methods for handling the traffic generator running on it.
> + The TG node extends :class:`Node` with TG specific features:
>
> - Arguments:
> - node_config: The user configuration of the traffic generator node.
> + * Traffic generator initialization,
> + * The sending of traffic and receiving packets,
> + * The sending of traffic without receiving packets.
> +
> + Not all traffic generators are capable of capturing traffic, which is
> why there
> + must be a way to send traffic without that.
>
> Attributes:
> traffic_generator: The traffic generator running on the node.
> @@ -39,6 +37,13 @@ class TGNode(Node):
> traffic_generator: CapturingTrafficGenerator
>
> def __init__(self, node_config: TGNodeConfiguration):
> + """Extend the constructor with TG node specifics.
> +
> + Initialize the traffic generator on the TG node.
> +
> + Args:
> + node_config: The TG node's test run configuration.
> + """
> super(TGNode, self).__init__(node_config)
> self.traffic_generator = create_traffic_generator(self,
> node_config.traffic_generator)
> self._logger.info(f"Created node: {self.name}")
> @@ -50,17 +55,17 @@ def send_packet_and_capture(
> receive_port: Port,
> duration: float = 1,
> ) -> list[Packet]:
> - """Send a packet, return received traffic.
> + """Send `packet`, return received traffic.
>
> - Send a packet on the send_port and then return all traffic
> captured
> - on the receive_port for the given duration. Also record the
> captured traffic
> + Send `packet` on `send_port` and then return all traffic captured
> + on `receive_port` for the given duration. Also record the
> captured traffic
> in a pcap file.
>
> Args:
> packet: The packet to send.
> send_port: The egress port on the TG node.
> receive_port: The ingress port in the TG node.
> - duration: Capture traffic for this amount of time after
> sending the packet.
> + duration: Capture traffic for this amount of time after
> sending `packet`.
>
> Returns:
> A list of received packets. May be empty if no packets are
> captured.
> @@ -70,6 +75,9 @@ def send_packet_and_capture(
> )
>
> def close(self) -> None:
> - """Free all resources used by the node"""
> + """Free all resources used by the node.
> +
> + This extends the superclass method with TG cleanup.
> + """
> self.traffic_generator.close()
> super(TGNode, self).close()
> --
> 2.34.1
>
>
^ permalink raw reply [flat|nested] 393+ messages in thread
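The pattern documented in the quoted patch — a subclass that extends the base constructor with its own setup and releases its own resources in `close()` before chaining to the superclass — can be sketched in isolation. This is a simplified, hypothetical illustration of the shape, not the actual DTS code (the real `TGNode` takes a config object and creates the generator through a factory):

```python
class Node:
    """Stand-in for the DTS base node."""

    def __init__(self, name: str):
        self.name = name
        self.closed = False

    def close(self) -> None:
        # Free resources owned by the base node.
        self.closed = True


class TrafficGenerator:
    """Stand-in for the traffic generator running on the TG node."""

    def __init__(self) -> None:
        self.closed = False

    def close(self) -> None:
        self.closed = True


class TGNode(Node):
    def __init__(self, name: str):
        # Extend the constructor with TG-specific setup.
        super().__init__(name)
        self.traffic_generator = TrafficGenerator()

    def close(self) -> None:
        # Free TG resources first, then chain to the superclass cleanup.
        self.traffic_generator.close()
        super().close()
```

The ordering in `close()` mirrors the patch: subclass-owned resources are released before `super().close()` tears down what the base class owns.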
* Re: [PATCH v8 18/21] dts: sut and tg nodes docstring update
2023-12-01 18:06 ` Jeremy Spewock
@ 2023-12-04 10:02 ` Juraj Linkeš
2023-12-04 11:02 ` Bruce Richardson
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:02 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Fri, Dec 1, 2023 at 7:06 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
>
>
> On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> Format according to the Google format and PEP257, with slight
>> deviations.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> dts/framework/testbed_model/sut_node.py | 230 ++++++++++++++++--------
>> dts/framework/testbed_model/tg_node.py | 42 +++--
>> 2 files changed, 176 insertions(+), 96 deletions(-)
>>
>> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
>> index 5ce9446dba..c4acea38d1 100644
>> --- a/dts/framework/testbed_model/sut_node.py
>> +++ b/dts/framework/testbed_model/sut_node.py
>> @@ -3,6 +3,14 @@
>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>> # Copyright(c) 2023 University of New Hampshire
>>
>> +"""System under test (DPDK + hardware) node.
>> +
>> +A system under test (SUT) is the combination of DPDK
>> +and the hardware we're testing with DPDK (NICs, crypto and other devices).
>> +An SUT node is where this SUT runs.
>> +"""
>
>
> I think this should just be "A SUT node"
>
I always spell it out which is why I used "an" (an es, ju:, ti: node).
From what I understand, the article is based on how the word is
pronounced. If it's an initialism (it's spelled), we should use "an"
and if it's an abbreviation (pronounced as the whole word), we should
use "a". It always made sense to me as an initialism - I think that's
the common usage.
* Re: [PATCH v8 18/21] dts: sut and tg nodes docstring update
2023-12-04 10:02 ` Juraj Linkeš
@ 2023-12-04 11:02 ` Bruce Richardson
0 siblings, 0 replies; 393+ messages in thread
From: Bruce Richardson @ 2023-12-04 11:02 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Jeremy Spewock, thomas, Honnappa.Nagarahalli, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro, dev
On Mon, Dec 04, 2023 at 11:02:21AM +0100, Juraj Linkeš wrote:
> On Fri, Dec 1, 2023 at 7:06 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
> >
> >
> >
> > On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
> >>
> >> <snip>
> >>
> >> +"""System under test (DPDK + hardware) node.
> >> +
> >> +A system under test (SUT) is the combination of DPDK
> >> +and the hardware we're testing with DPDK (NICs, crypto and other devices).
> >> +An SUT node is where this SUT runs.
> >> +"""
> >
> >
> > I think this should just be "A SUT node"
> >
>
> I always spell it out which is why I used "an" (an es, ju:, ti: node).
> From what I understand, the article is based on how the word is
> pronounced. If it's an initialism (it's spelled), we should use "an"
> and if it's an abbreviation (pronounced as the whole word), we should
> use "a". It always made sense to me as an initialism - I think that's
> the common usage.
+1 for using "an" instead of "a" in front of "SUT".
* [PATCH v8 19/21] dts: base traffic generators docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (17 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 18/21] dts: sut and tg nodes " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-12-01 18:05 ` Jeremy Spewock
2023-11-23 15:13 ` [PATCH v8 20/21] dts: scapy tg " Juraj Linkeš
` (3 subsequent siblings)
22 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../traffic_generator/__init__.py | 22 ++++++++-
.../capturing_traffic_generator.py | 45 +++++++++++--------
.../traffic_generator/traffic_generator.py | 33 ++++++++------
3 files changed, 67 insertions(+), 33 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index 52888d03fa..11e2bd7d97 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -1,6 +1,19 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""DTS traffic generators.
+
+A traffic generator is capable of generating traffic and then monitoring returning traffic.
+All traffic generators must count the number of received packets. Some may additionally capture
+individual packets.
+
+A traffic generator may be software running on generic hardware or it could be specialized hardware.
+
+The traffic generators that only count the number of received packets are suitable only for
+performance testing. In functional testing, we need to be able to dissect each arrived packet
+and a capturing traffic generator is required.
+"""
+
from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
from framework.exception import ConfigurationError
from framework.testbed_model.node import Node
@@ -12,8 +25,15 @@
def create_traffic_generator(
tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
+ """The factory function for creating traffic generator objects from the test run configuration.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ traffic_generator_config: The traffic generator config.
+ Returns:
+ A traffic generator capable of capturing received packets.
+ """
match traffic_generator_config.traffic_generator_type:
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGenerator(tg_node, traffic_generator_config)
diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index 1fc7f98c05..0246590333 100644
--- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -23,19 +23,21 @@
def _get_default_capture_name() -> str:
- """
- This is the function used for the default implementation of capture names.
- """
return str(uuid.uuid4())
class CapturingTrafficGenerator(TrafficGenerator):
"""Capture packets after sending traffic.
- A mixin interface which enables a packet generator to declare that it can capture
+ The intermediary interface which enables a packet generator to declare that it can capture
packets and return them to the user.
+ Similarly to :class:`~.traffic_generator.TrafficGenerator`, this class exposes
+ the public methods specific to capturing traffic generators and defines a private method
+ that must implement the traffic generation and capturing logic in subclasses.
+
The methods of capturing traffic generators obey the following workflow:
+
1. send packets
2. capture packets
3. write the capture to a .pcap file
@@ -44,6 +46,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
@property
def is_capturing(self) -> bool:
+ """This traffic generator can capture traffic."""
return True
def send_packet_and_capture(
@@ -54,11 +57,12 @@ def send_packet_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet` and capture received traffic.
+
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ The captured traffic is recorded in the `capture_name`.pcap file.
Args:
packet: The packet to send.
@@ -68,7 +72,7 @@ def send_packet_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
return self.send_packets_and_capture(
[packet], send_port, receive_port, duration, capture_name
@@ -82,11 +86,14 @@ def send_packets_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send packets, return received traffic.
+ """Send `packets` and capture received traffic.
- Send packets on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ Send `packets` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
+
+ The captured traffic is recorded in the `capture_name`.pcap file. The target directory
+ can be configured with the :option:`--output-dir` command line argument or
+ the :envvar:`DTS_OUTPUT_DIR` environment variable.
Args:
packets: The packets to send.
@@ -96,7 +103,7 @@ def send_packets_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
self._logger.debug(get_packet_summaries(packets))
self._logger.debug(
@@ -121,10 +128,12 @@ def _send_packets_and_capture(
receive_port: Port,
duration: float,
) -> list[Packet]:
- """
- The extended classes must implement this method which
- sends packets on send_port and receives packets on the receive_port
- for the specified duration. It must be able to handle no received packets.
+ """The implementation of :method:`send_packets_and_capture`.
+
+ The subclasses must implement this method which sends `packets` on `send_port`
+ and receives packets on `receive_port` for the specified `duration`.
+
+ It must be able to handle receiving no packets.
"""
def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]) -> None:
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 0d9902ddb7..5fb9824568 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -22,7 +22,8 @@
class TrafficGenerator(ABC):
"""The base traffic generator.
- Defines the few basic methods that each traffic generator must implement.
+ Exposes the common public methods of all traffic generators and defines private methods
+ that must implement the traffic generation logic in subclasses.
"""
_config: TrafficGeneratorConfig
@@ -30,14 +31,20 @@ class TrafficGenerator(ABC):
_logger: DTSLOG
def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ """Initialize the traffic generator.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ config: The traffic generator's test run configuration.
+ """
self._config = config
self._tg_node = tg_node
self._logger = getLogger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
def send_packet(self, packet: Packet, port: Port) -> None:
- """Send a packet and block until it is fully sent.
+ """Send `packet` and block until it is fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packet` on `port`, then wait until `packet` is fully sent.
Args:
packet: The packet to send.
@@ -46,9 +53,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
self.send_packets([packet], port)
def send_packets(self, packets: list[Packet], port: Port) -> None:
- """Send packets and block until they are fully sent.
+ """Send `packets` and block until they are fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packets` on `port`, then wait until `packets` are fully sent.
Args:
packets: The packets to send.
@@ -60,19 +67,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
@abstractmethod
def _send_packets(self, packets: list[Packet], port: Port) -> None:
- """
- The extended classes must implement this method which
- sends packets on send_port. The method should block until all packets
- are fully sent.
+ """The implementation of :method:`send_packets`.
+
+ The subclasses must implement this method which sends `packets` on `port`.
+ The method should block until all `packets` are fully sent.
+
+ What full sent means is defined by the traffic generator.
"""
@property
def is_capturing(self) -> bool:
- """Whether this traffic generator can capture traffic.
-
- Returns:
- True if the traffic generator can capture traffic, False otherwise.
- """
+ """This traffic generator can't capture traffic."""
return False
@abstractmethod
--
2.34.1
* Re: [PATCH v8 19/21] dts: base traffic generators docstring update
2023-11-23 15:13 ` [PATCH v8 19/21] dts: base traffic generators " Juraj Linkeš
@ 2023-12-01 18:05 ` Jeremy Spewock
2023-12-04 10:03 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-12-01 18:05 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> .../traffic_generator/__init__.py | 22 ++++++++-
> .../capturing_traffic_generator.py | 45 +++++++++++--------
> .../traffic_generator/traffic_generator.py | 33 ++++++++------
> 3 files changed, 67 insertions(+), 33 deletions(-)
>
> <snip>
> @@ -60,19 +67,17 @@ def send_packets(self, packets: list[Packet], port:
> Port) -> None:
>
> @abstractmethod
> def _send_packets(self, packets: list[Packet], port: Port) -> None:
> - """
> - The extended classes must implement this method which
> - sends packets on send_port. The method should block until all
> packets
> - are fully sent.
> + """The implementation of :method:`send_packets`.
> +
> + The subclasses must implement this method which sends `packets`
> on `port`.
> + The method should block until all `packets` are fully sent.
> +
> + What full sent means is defined by the traffic generator.
> """
>
I think this should be "what fully sent means"
> @property
> def is_capturing(self) -> bool:
> - """Whether this traffic generator can capture traffic.
> -
> - Returns:
> - True if the traffic generator can capture traffic, False
> otherwise.
> - """
> + """This traffic generator can't capture traffic."""
> return False
>
> @abstractmethod
> --
> 2.34.1
>
>
* Re: [PATCH v8 19/21] dts: base traffic generators docstring update
2023-12-01 18:05 ` Jeremy Spewock
@ 2023-12-04 10:03 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:03 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Fri, Dec 1, 2023 at 7:05 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
>
>
> On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> Format according to the Google format and PEP257, with slight
>> deviations.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> .../traffic_generator/__init__.py | 22 ++++++++-
>> .../capturing_traffic_generator.py | 45 +++++++++++--------
>> .../traffic_generator/traffic_generator.py | 33 ++++++++------
>> 3 files changed, 67 insertions(+), 33 deletions(-)
>>
>> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
>> index 52888d03fa..11e2bd7d97 100644
>> --- a/dts/framework/testbed_model/traffic_generator/__init__.py
>> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
<snip>
>> @@ -60,19 +67,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
>>
>> @abstractmethod
>> def _send_packets(self, packets: list[Packet], port: Port) -> None:
>> - """
>> - The extended classes must implement this method which
>> - sends packets on send_port. The method should block until all packets
>> - are fully sent.
>> + """The implementation of :method:`send_packets`.
>> +
>> + The subclasses must implement this method which sends `packets` on `port`.
>> + The method should block until all `packets` are fully sent.
>> +
>> + What full sent means is defined by the traffic generator.
>> """
>
>
> I think this should be "what fully sent means"
>
Thanks, Yoan also caught this.
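The split this patch documents — a public `send_packets` that handles the common work and delegates to an abstract `_send_packets` that subclasses must implement — is the template-method pattern. A minimal sketch with hypothetical names (the real DTS classes also carry config, logging, and `Port` arguments):

```python
from abc import ABC, abstractmethod


class TrafficGenerator(ABC):
    """Base class exposing the public API; subclasses supply the sending logic."""

    def send_packets(self, packets: list) -> None:
        # Public entry point: common work (logging, validation, ...) would go
        # here, then the call is delegated to the subclass implementation.
        self._send_packets(packets)

    @abstractmethod
    def _send_packets(self, packets: list) -> None:
        """The implementation of :meth:`send_packets`; block until fully sent."""


class FakeTrafficGenerator(TrafficGenerator):
    """A trivial subclass that just records what it was asked to send."""

    def __init__(self) -> None:
        self.sent: list = []

    def _send_packets(self, packets: list) -> None:
        self.sent.extend(packets)
```

Because `_send_packets` is abstract, the base class cannot be instantiated directly, so every concrete traffic generator is forced to provide the sending logic while callers only ever use the documented public method.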
* [PATCH v8 20/21] dts: scapy tg docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (18 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 19/21] dts: base traffic generators " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-12-01 18:17 ` Jeremy Spewock
2023-11-23 15:13 ` [PATCH v8 21/21] dts: test suites " Juraj Linkeš
` (2 subsequent siblings)
22 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
1 file changed, 54 insertions(+), 37 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
index c88cf28369..30ea3914ee 100644
--- a/dts/framework/testbed_model/traffic_generator/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -2,14 +2,15 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""Scapy traffic generator.
+"""The Scapy traffic generator.
-Traffic generator used for functional testing, implemented using the Scapy library.
+A traffic generator used for functional testing, implemented with
+`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
-The XML-RPC server runs in an interactive remote SSH session running Python console,
-where we start the server. The communication with the server is facilitated with
-a local server proxy.
+The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
+in an interactive remote Python SSH session. The communication with the server is facilitated
+with a local server proxy from the :mod:`xmlrpc.client` module.
"""
import inspect
@@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
recv_iface: str,
duration: float,
) -> list[bytes]:
- """RPC function to send and capture packets.
+ """The RPC function to send and capture packets.
- The function is meant to be executed on the remote TG node.
+ The function is meant to be executed on the remote TG node via the server proxy.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
recv_iface: The logical name of the ingress interface.
duration: Capture for this amount of time, in seconds.
Returns:
A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
+ to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
sniffer = scapy.all.AsyncSniffer(
@@ -96,19 +97,15 @@ def scapy_send_packets_and_capture(
def scapy_send_packets(xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str) -> None:
- """RPC function to send packets.
+ """The RPC function to send packets.
- The function is meant to be executed on the remote TG node.
- It doesn't return anything, only sends packets.
+ The function is meant to be executed on the remote TG node via the server proxy.
+ It only sends `xmlrpc_packets`, without capturing them.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
-
- Returns:
- A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
@@ -128,11 +125,19 @@ def scapy_send_packets(xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: s
class QuittableXMLRPCServer(SimpleXMLRPCServer):
- """Basic XML-RPC server that may be extended
- by functions serializable by the marshal module.
+ """Basic XML-RPC server.
+
+ The server may be augmented by functions serializable by the :mod:`marshal` module.
"""
def __init__(self, *args, **kwargs):
+ """Extend the XML-RPC server initialization.
+
+ Args:
+ args: The positional arguments that will be passed to the superclass's constructor.
+ kwargs: The keyword arguments that will be passed to the superclass's constructor.
+ The `allow_none` argument will be set to :data:`True`.
+ """
kwargs["allow_none"] = True
super().__init__(*args, **kwargs)
self.register_introspection_functions()
@@ -140,13 +145,12 @@ def __init__(self, *args, **kwargs):
self.register_function(self.add_rpc_function)
def quit(self) -> None:
+ """Quit the server."""
self._BaseServer__shutdown_request = True
return None
def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
- """Add a function to the server.
-
- This is meant to be executed remotely.
+ """Add a function to the server from the local server proxy.
Args:
name: The name of the function.
@@ -157,6 +161,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
self.register_function(function)
def serve_forever(self, poll_interval: float = 0.5) -> None:
+ """Extend the superclass method with an additional print.
+
+ Once executed in the local server proxy, the print gives us a clear string to expect
+ when starting the server. The print means the function was executed on the XML-RPC server.
+ """
print("XMLRPC OK")
super().serve_forever(poll_interval)
@@ -164,19 +173,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
class ScapyTrafficGenerator(CapturingTrafficGenerator):
"""Provides access to scapy functions via an RPC interface.
- The traffic generator first starts an XML-RPC on the remote TG node.
- Then it populates the server with functions which use the Scapy library
- to send/receive traffic.
-
- Any packets sent to the remote server are first converted to bytes.
- They are received as xmlrpc.client.Binary objects on the server side.
- When the server sends the packets back, they are also received as
- xmlrpc.client.Binary object on the client side, are converted back to Scapy
- packets and only then returned from the methods.
+ The class extends the base with remote execution of scapy functions.
- Arguments:
- tg_node: The node where the traffic generator resides.
- config: The user configuration of the traffic generator.
+ Any packets sent to the remote server are first converted to bytes. They are received as
+ :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
+ back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
+ converted back to :class:`~scapy.packet.Packet` objects and only then returned from the methods.
Attributes:
session: The exclusive interactive remote session created by the Scapy
@@ -190,6 +192,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
_config: ScapyTrafficGeneratorConfig
def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ """Extend the constructor with Scapy TG specifics.
+
+ The traffic generator first starts an XML-RPC server on the remote `tg_node`.
+ Then it populates the server with functions which use the Scapy library
+ to send/receive traffic:
+
+ * :func:`scapy_send_packets_and_capture`
+ * :func:`scapy_send_packets`
+
+ To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
+ command line argument or the :envvar:`DTS_VERBOSE` environment variable.
+
+ Args:
+ tg_node: The node where the traffic generator resides.
+ config: The traffic generator's test run configuration.
+ """
super().__init__(tg_node, config)
assert (
@@ -231,10 +249,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
# or class, so strip all lines containing only whitespace
src = "\n".join([line for line in src.splitlines() if not line.isspace() and line != ""])
- spacing = "\n" * 4
-
# execute it in the python terminal
- self.session.send_command(spacing + src + spacing)
+ self.session.send_command(src + "\n")
self.session.send_command(
f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));server.serve_forever()",
"XMLRPC OK",
@@ -267,6 +283,7 @@ def _send_packets_and_capture(
return scapy_packets
def close(self) -> None:
+ """Close the traffic generator."""
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
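The `add_rpc_function` hook in the patch above accepts a marshalled code object wrapped in an XML-RPC Binary and rebuilds it into a callable on the server. A minimal local sketch of that round trip, with no network involved (the `greet` function is a hypothetical stand-in, not part of the patch):

```python
import marshal
import types
import xmlrpc.client


def greet(name):
    """A function to be shipped to the server."""
    return "hello, " + name


# The client side wraps the marshalled code object in a Binary,
# as the local server proxy does before calling add_rpc_function.
function_bytes = xmlrpc.client.Binary(marshal.dumps(greet.__code__))

# The server side reconstructs the function from the raw bytes.
code = marshal.loads(function_bytes.data)
rebuilt = types.FunctionType(code, globals(), "greet")
assert rebuilt("world") == "hello, world"
```

Only the code object travels this way; any globals the shipped function references (such as the `scapy.all` module in the real RPC functions) must already be importable on the server side.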
* Re: [PATCH v8 20/21] dts: scapy tg docstring update
2023-11-23 15:13 ` [PATCH v8 20/21] dts: scapy tg " Juraj Linkeš
@ 2023-12-01 18:17 ` Jeremy Spewock
2023-12-04 10:07 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2023-12-01 18:17 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> Format according to the Google format and PEP257, with slight
> deviations.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> .../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
> 1 file changed, 54 insertions(+), 37 deletions(-)
>
> diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py
> b/dts/framework/testbed_model/traffic_generator/scapy.py
> index c88cf28369..30ea3914ee 100644
> --- a/dts/framework/testbed_model/traffic_generator/scapy.py
> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> @@ -2,14 +2,15 @@
> # Copyright(c) 2022 University of New Hampshire
> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> -"""Scapy traffic generator.
> +"""The Scapy traffic generator.
>
> -Traffic generator used for functional testing, implemented using the
> Scapy library.
> +A traffic generator used for functional testing, implemented with
> +`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
> The traffic generator uses an XML-RPC server to run Scapy on the remote
> TG node.
>
> -The XML-RPC server runs in an interactive remote SSH session running
> Python console,
> -where we start the server. The communication with the server is
> facilitated with
> -a local server proxy.
> +The traffic generator uses the :mod:`xmlrpc.server` module to run an
> XML-RPC server
> +in an interactive remote Python SSH session. The communication with the
> server is facilitated
> +with a local server proxy from the :mod:`xmlrpc.client` module.
> """
>
> import inspect
> @@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
> recv_iface: str,
> duration: float,
> ) -> list[bytes]:
> - """RPC function to send and capture packets.
> + """The RPC function to send and capture packets.
>
> - The function is meant to be executed on the remote TG node.
> + The function is meant to be executed on the remote TG node via the
> server proxy.
>
>
Should this maybe be "This function is meant" instead? I'm not completely
sure if it should be; I feel like it might be able to go either way.
> Args:
> xmlrpc_packets: The packets to send. These need to be converted to
> - xmlrpc.client.Binary before sending to the remote server.
> + :class:`~xmlrpc.client.Binary` objects before sending to the
> remote server.
> send_iface: The logical name of the egress interface.
> recv_iface: The logical name of the ingress interface.
> duration: Capture for this amount of time, in seconds.
>
> Returns:
> A list of bytes. Each item in the list represents one packet,
> which needs
> - to be converted back upon transfer from the remote node.
> + to be converted back upon transfer from the remote node.
> """
> scapy_packets = [scapy.all.Packet(packet.data) for packet in
> xmlrpc_packets]
> sniffer = scapy.all.AsyncSniffer(
> @@ -96,19 +97,15 @@ def scapy_send_packets_and_capture(
>
>
> def scapy_send_packets(xmlrpc_packets: list[xmlrpc.client.Binary],
> send_iface: str) -> None:
> - """RPC function to send packets.
> + """The RPC function to send packets.
>
> - The function is meant to be executed on the remote TG node.
> - It doesn't return anything, only sends packets.
> + The function is meant to be executed on the remote TG node via the
> server proxy.
>
Same thing here. I don't think it matters that much since you refer to it
as being "the RPC function" for sending packets, but it feels like you are
referring instead to this specific function on this line.
> + It only sends `xmlrpc_packets`, without capturing them.
>
> Args:
> xmlrpc_packets: The packets to send. These need to be converted to
> - xmlrpc.client.Binary before sending to the remote server.
> + :class:`~xmlrpc.client.Binary` objects before sending to the
> remote server.
> send_iface: The logical name of the egress interface.
> -
> - Returns:
> - A list of bytes. Each item in the list represents one packet,
> which needs
> - to be converted back upon transfer from the remote node.
> """
> scapy_packets = [scapy.all.Packet(packet.data) for packet in
> xmlrpc_packets]
> scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True,
> verbose=True)
> @@ -128,11 +125,19 @@ def scapy_send_packets(xmlrpc_packets:
> list[xmlrpc.client.Binary], send_iface: s
>
>
> class QuittableXMLRPCServer(SimpleXMLRPCServer):
> - """Basic XML-RPC server that may be extended
> - by functions serializable by the marshal module.
> + """Basic XML-RPC server.
> +
> + The server may be augmented by functions serializable by the
> :mod:`marshal` module.
> """
>
> def __init__(self, *args, **kwargs):
> + """Extend the XML-RPC server initialization.
> +
> + Args:
> + args: The positional arguments that will be passed to the
> superclass's constructor.
> + kwargs: The keyword arguments that will be passed to the
> superclass's constructor.
> + The `allow_none` argument will be set to :data:`True`.
> + """
> kwargs["allow_none"] = True
> super().__init__(*args, **kwargs)
> self.register_introspection_functions()
> @@ -140,13 +145,12 @@ def __init__(self, *args, **kwargs):
> self.register_function(self.add_rpc_function)
>
> def quit(self) -> None:
> + """Quit the server."""
> self._BaseServer__shutdown_request = True
> return None
>
> def add_rpc_function(self, name: str, function_bytes:
> xmlrpc.client.Binary) -> None:
> - """Add a function to the server.
> -
> - This is meant to be executed remotely.
> + """Add a function to the server from the local server proxy.
>
> Args:
> name: The name of the function.
> @@ -157,6 +161,11 @@ def add_rpc_function(self, name: str, function_bytes:
> xmlrpc.client.Binary) -> N
> self.register_function(function)
>
> def serve_forever(self, poll_interval: float = 0.5) -> None:
> + """Extend the superclass method with an additional print.
> +
> + Once executed in the local server proxy, the print gives us a
> clear string to expect
> + when starting the server. The print means the function was
> executed on the XML-RPC server.
> + """
> print("XMLRPC OK")
> super().serve_forever(poll_interval)
>
> @@ -164,19 +173,12 @@ def serve_forever(self, poll_interval: float = 0.5)
> -> None:
> class ScapyTrafficGenerator(CapturingTrafficGenerator):
> """Provides access to scapy functions via an RPC interface.
>
> - The traffic generator first starts an XML-RPC on the remote TG node.
> - Then it populates the server with functions which use the Scapy
> library
> - to send/receive traffic.
> -
> - Any packets sent to the remote server are first converted to bytes.
> - They are received as xmlrpc.client.Binary objects on the server side.
> - When the server sends the packets back, they are also received as
> - xmlrpc.client.Binary object on the client side, are converted back to
> Scapy
> - packets and only then returned from the methods.
> + The class extends the base with remote execution of scapy functions.
>
Same thing here if the above end up getting changed.
>
> - Arguments:
> - tg_node: The node where the traffic generator resides.
> - config: The user configuration of the traffic generator.
> + Any packets sent to the remote server are first converted to bytes.
> They are received as
> + :class:`~xmlrpc.client.Binary` objects on the server side. When the
> server sends the packets
> + back, they are also received as :class:`~xmlrpc.client.Binary`
> objects on the client side, are
> + converted back to :class:`~scapy.packet.Packet` objects and only then
> returned from the methods.
>
> Attributes:
> session: The exclusive interactive remote session created by the
> Scapy
> @@ -190,6 +192,22 @@ class
> ScapyTrafficGenerator(CapturingTrafficGenerator):
> _config: ScapyTrafficGeneratorConfig
>
> def __init__(self, tg_node: Node, config:
> ScapyTrafficGeneratorConfig):
> + """Extend the constructor with Scapy TG specifics.
> +
> + The traffic generator first starts an XML-RPC server on the remote
> `tg_node`.
> + Then it populates the server with functions which use the Scapy
> library
> + to send/receive traffic:
> +
> + * :func:`scapy_send_packets_and_capture`
> + * :func:`scapy_send_packets`
> +
> + To enable verbose logging from the xmlrpc client, use the
> :option:`--verbose`
> + command line argument or the :envvar:`DTS_VERBOSE` environment
> variable.
> +
> + Args:
> + tg_node: The node where the traffic generator resides.
> + config: The traffic generator's test run configuration.
> + """
> super().__init__(tg_node, config)
>
> assert (
> @@ -231,10 +249,8 @@ def _start_xmlrpc_server_in_remote_python(self,
> listen_port: int) -> None:
> # or class, so strip all lines containing only whitespace
> src = "\n".join([line for line in src.splitlines() if not
> line.isspace() and line != ""])
>
> - spacing = "\n" * 4
> -
> # execute it in the python terminal
> - self.session.send_command(spacing + src + spacing)
> + self.session.send_command(src + "\n")
> self.session.send_command(
> f"server = QuittableXMLRPCServer(('0.0.0.0',
> {listen_port}));server.serve_forever()",
> "XMLRPC OK",
> @@ -267,6 +283,7 @@ def _send_packets_and_capture(
> return scapy_packets
>
> def close(self) -> None:
> + """Close the traffic generator."""
> try:
> self.rpc_server_proxy.quit()
> except ConnectionRefusedError:
> --
> 2.34.1
>
>
^ permalink raw reply [flat|nested] 393+ messages in thread
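The Binary conversions discussed in the review above can be exercised locally with the stdlib marshalling helpers alone; `raw` below is a hypothetical stand-in for `bytes(scapy_packet)`, and the method name merely mirrors the patch's RPC function:

```python
import xmlrpc.client

# hypothetical raw bytes standing in for a serialized Scapy packet
raw = b"\xff\xff\xff\xff\xff\xff\x00\x01"
wrapped = xmlrpc.client.Binary(raw)

# what the client-side proxy would put on the wire
payload = xmlrpc.client.dumps((wrapped,), "scapy_send_packets")

# what the server-side dispatcher recovers before calling the function
params, method = xmlrpc.client.loads(payload)
assert method == "scapy_send_packets"
assert params[0].data == raw  # ready to convert back with Packet(...)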
* Re: [PATCH v8 20/21] dts: scapy tg docstring update
2023-12-01 18:17 ` Jeremy Spewock
@ 2023-12-04 10:07 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:07 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
On Fri, Dec 1, 2023 at 7:18 PM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
>
>
>
> On Thu, Nov 23, 2023 at 10:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>>
>> Format according to the Google format and PEP257, with slight
>> deviations.
>>
>> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>> ---
>> .../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
>> 1 file changed, 54 insertions(+), 37 deletions(-)
>>
>> diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
>> index c88cf28369..30ea3914ee 100644
>> --- a/dts/framework/testbed_model/traffic_generator/scapy.py
>> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
>> @@ -2,14 +2,15 @@
>> # Copyright(c) 2022 University of New Hampshire
>> # Copyright(c) 2023 PANTHEON.tech s.r.o.
>>
>> -"""Scapy traffic generator.
>> +"""The Scapy traffic generator.
>>
>> -Traffic generator used for functional testing, implemented using the Scapy library.
>> +A traffic generator used for functional testing, implemented with
>> +`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
>> The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
>>
>> -The XML-RPC server runs in an interactive remote SSH session running Python console,
>> -where we start the server. The communication with the server is facilitated with
>> -a local server proxy.
>> +The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
>> +in an interactive remote Python SSH session. The communication with the server is facilitated
>> +with a local server proxy from the :mod:`xmlrpc.client` module.
>> """
>>
>> import inspect
>> @@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
>> recv_iface: str,
>> duration: float,
>> ) -> list[bytes]:
>> - """RPC function to send and capture packets.
>> + """The RPC function to send and capture packets.
>>
>> - The function is meant to be executed on the remote TG node.
>> + The function is meant to be executed on the remote TG node via the server proxy.
>>
>
> Should this maybe be "This function is meant" instead? I'm not completely sure if it should be, I feel like it might be able to go either way.
>
There is something to this. It's a bit more explicit and as such less
confusing, which feels better, so I'll change it in all three
instances.
>>
>> Args:
>> xmlrpc_packets: The packets to send. These need to be converted to
>> - xmlrpc.client.Binary before sending to the remote server.
>> + :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
>> send_iface: The logical name of the egress interface.
>> recv_iface: The logical name of the ingress interface.
>> duration: Capture for this amount of time, in seconds.
>>
>> Returns:
>> A list of bytes. Each item in the list represents one packet, which needs
>> - to be converted back upon transfer from the remote node.
>> + to be converted back upon transfer from the remote node.
>> """
>> scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
>> sniffer = scapy.all.AsyncSniffer(
>> @@ -96,19 +97,15 @@ def scapy_send_packets_and_capture(
>>
>>
>> def scapy_send_packets(xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str) -> None:
>> - """RPC function to send packets.
>> + """The RPC function to send packets.
>>
>> - The function is meant to be executed on the remote TG node.
>> - It doesn't return anything, only sends packets.
>> + The function is meant to be executed on the remote TG node via the server proxy.
>
>
> Same thing here. I don't think it matters that much since you refer to it as being "the RPC function" for sending packets, but it feels like you are referring instead to this specific function on this line.
>
>>
>> + It only sends `xmlrpc_packets`, without capturing them.
>>
>> Args:
>> xmlrpc_packets: The packets to send. These need to be converted to
>> - xmlrpc.client.Binary before sending to the remote server.
>> + :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
>> send_iface: The logical name of the egress interface.
>> -
>> - Returns:
>> - A list of bytes. Each item in the list represents one packet, which needs
>> - to be converted back upon transfer from the remote node.
>> """
>> scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
>> scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
>> @@ -128,11 +125,19 @@ def scapy_send_packets(xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: s
>>
>>
>> class QuittableXMLRPCServer(SimpleXMLRPCServer):
>> - """Basic XML-RPC server that may be extended
>> - by functions serializable by the marshal module.
>> + """Basic XML-RPC server.
>> +
>> + The server may be augmented by functions serializable by the :mod:`marshal` module.
>> """
>>
>> def __init__(self, *args, **kwargs):
>> + """Extend the XML-RPC server initialization.
>> +
>> + Args:
>> + args: The positional arguments that will be passed to the superclass's constructor.
>> + kwargs: The keyword arguments that will be passed to the superclass's constructor.
>> + The `allow_none` argument will be set to :data:`True`.
>> + """
>> kwargs["allow_none"] = True
>> super().__init__(*args, **kwargs)
>> self.register_introspection_functions()
>> @@ -140,13 +145,12 @@ def __init__(self, *args, **kwargs):
>> self.register_function(self.add_rpc_function)
>>
>> def quit(self) -> None:
>> + """Quit the server."""
>> self._BaseServer__shutdown_request = True
>> return None
>>
>> def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
>> - """Add a function to the server.
>> -
>> - This is meant to be executed remotely.
>> + """Add a function to the server from the local server proxy.
>>
>> Args:
>> name: The name of the function.
>> @@ -157,6 +161,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
>> self.register_function(function)
>>
>> def serve_forever(self, poll_interval: float = 0.5) -> None:
>> + """Extend the superclass method with an additional print.
>> +
>> + Once executed in the local server proxy, the print gives us a clear string to expect
>> + when starting the server. The print means the function was executed on the XML-RPC server.
>> + """
>> print("XMLRPC OK")
>> super().serve_forever(poll_interval)
>>
>> @@ -164,19 +173,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
>> class ScapyTrafficGenerator(CapturingTrafficGenerator):
>> """Provides access to scapy functions via an RPC interface.
>>
>> - The traffic generator first starts an XML-RPC on the remote TG node.
>> - Then it populates the server with functions which use the Scapy library
>> - to send/receive traffic.
>> -
>> - Any packets sent to the remote server are first converted to bytes.
>> - They are received as xmlrpc.client.Binary objects on the server side.
>> - When the server sends the packets back, they are also received as
>> - xmlrpc.client.Binary object on the client side, are converted back to Scapy
>> - packets and only then returned from the methods.
>> + The class extends the base with remote execution of scapy functions.
>
>
> Same thing here if the above end up getting changed.
>
>>
>>
>> - Arguments:
>> - tg_node: The node where the traffic generator resides.
>> - config: The user configuration of the traffic generator.
>> + Any packets sent to the remote server are first converted to bytes. They are received as
>> + :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
>> + back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
>> + converted back to :class:`~scapy.packet.Packet` objects and only then returned from the methods.
>>
>> Attributes:
>> session: The exclusive interactive remote session created by the Scapy
>> @@ -190,6 +192,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
>> _config: ScapyTrafficGeneratorConfig
>>
>> def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
>> + """Extend the constructor with Scapy TG specifics.
>> +
>> + The traffic generator first starts an XML-RPC server on the remote `tg_node`.
>> + Then it populates the server with functions which use the Scapy library
>> + to send/receive traffic:
>> +
>> + * :func:`scapy_send_packets_and_capture`
>> + * :func:`scapy_send_packets`
>> +
>> + To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
>> + command line argument or the :envvar:`DTS_VERBOSE` environment variable.
>> +
>> + Args:
>> + tg_node: The node where the traffic generator resides.
>> + config: The traffic generator's test run configuration.
>> + """
>> super().__init__(tg_node, config)
>>
>> assert (
>> @@ -231,10 +249,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
>> # or class, so strip all lines containing only whitespace
>> src = "\n".join([line for line in src.splitlines() if not line.isspace() and line != ""])
>>
>> - spacing = "\n" * 4
>> -
>> # execute it in the python terminal
>> - self.session.send_command(spacing + src + spacing)
>> + self.session.send_command(src + "\n")
>> self.session.send_command(
>> f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));server.serve_forever()",
>> "XMLRPC OK",
>> @@ -267,6 +283,7 @@ def _send_packets_and_capture(
>> return scapy_packets
>>
>> def close(self) -> None:
>> + """Close the traffic generator."""
>> try:
>> self.rpc_server_proxy.quit()
>> except ConnectionRefusedError:
>> --
>> 2.34.1
>>
^ permalink raw reply [flat|nested] 393+ messages in thread
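The `quit()` mechanism quoted above works by flipping `socketserver.BaseServer`'s name-mangled shutdown flag so that `serve_forever()` returns. A self-contained sketch of the pattern, assuming an ephemeral localhost port (the class name is illustrative, not the DTS implementation):

```python
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer


class QuittableServer(SimpleXMLRPCServer):
    """Minimal sketch: a quit() RPC that ends serve_forever()."""

    def __init__(self, *args, **kwargs):
        kwargs["allow_none"] = True  # quit() returns None over XML-RPC
        super().__init__(*args, **kwargs)
        self.register_function(self.quit)

    def quit(self):
        # set BaseServer's private flag; the poll loop then exits
        self._BaseServer__shutdown_request = True
        return None


server = QuittableServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]
thread = threading.Thread(target=server.serve_forever, args=(0.1,))
thread.start()

proxy = xmlrpc.client.ServerProxy(f"http://127.0.0.1:{port}")
proxy.quit()
thread.join(timeout=5)
assert not thread.is_alive()
server.server_close()
```

Setting `allow_none=True` matters here: without it, the server cannot marshal the `None` that `quit()` returns back to the proxy.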
* [PATCH v8 21/21] dts: test suites docstring update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (19 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 20/21] dts: scapy tg " Juraj Linkeš
@ 2023-11-23 15:13 ` Juraj Linkeš
2023-12-01 16:00 ` [PATCH v8 00/21] dts: docstrings update Yoan Picchi
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
22 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-23 15:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/tests/TestSuite_hello_world.py | 16 +++++---
dts/tests/TestSuite_os_udp.py | 20 ++++++----
dts/tests/TestSuite_smoke_tests.py | 61 ++++++++++++++++++++++++------
3 files changed, 72 insertions(+), 25 deletions(-)
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 768ba1cfa8..fd7ff1534d 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
-"""
+"""The DPDK hello world app test suite.
+
Run the helloworld example app and verify it prints a message for each used core.
No other EAL parameters apart from cores are used.
"""
@@ -15,22 +16,25 @@
class TestHelloWorld(TestSuite):
+ """DPDK hello world app test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Build the app we're about to test - helloworld.
"""
self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
def test_hello_world_single_core(self) -> None:
- """
+ """Single core test case.
+
Steps:
Run the helloworld app on the first usable logical core.
Verify:
The app prints a message from the used core:
"hello from core <core_id>"
"""
-
# get the first usable core
lcore_amount = LogicalCoreCount(1, 1, 1)
lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
@@ -42,14 +46,14 @@ def test_hello_world_single_core(self) -> None:
)
def test_hello_world_all_cores(self) -> None:
- """
+ """All cores test case.
+
Steps:
Run the helloworld app on all usable logical cores.
Verify:
The app prints a message from all used cores:
"hello from core <core_id>"
"""
-
# get the maximum logical core number
eal_para = self.sut_node.create_eal_parameters(
lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
index bf6b93deb5..2cf29d37bb 100644
--- a/dts/tests/TestSuite_os_udp.py
+++ b/dts/tests/TestSuite_os_udp.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
+"""Basic IPv4 OS routing test suite.
+
Configure SUT node to route traffic from if1 to if2.
Send a packet to the SUT node, verify it comes back on the second port on the TG node.
"""
@@ -13,24 +14,26 @@
class TestOSUdp(TestSuite):
+ """IPv4 UDP OS routing test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
- Configure SUT ports and SUT to route traffic from if1 to if2.
+ Bind the SUT ports to the OS driver, configure the ports and configure the SUT
+ to route traffic from if1 to if2.
"""
-
- # This test uses kernel drivers
self.sut_node.bind_ports_to_driver(for_dpdk=False)
self.configure_testbed_ipv4()
def test_os_udp(self) -> None:
- """
+ """Basic UDP IPv4 traffic test case.
+
Steps:
Send a UDP packet.
Verify:
The packet with proper addresses arrives at the other TG port.
"""
-
packet = Ether() / IP() / UDP()
received_packets = self.send_packet_and_capture(packet)
@@ -40,7 +43,8 @@ def test_os_udp(self) -> None:
self.verify_packets(expected_packet, received_packets)
def tear_down_suite(self) -> None:
- """
+ """Tear down the test suite.
+
Teardown:
Remove the SUT port configuration configured in setup.
"""
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index 8958f58dac..5e2bac14bd 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -1,6 +1,17 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Smoke test suite.
+
+Smoke tests are a class of tests which are used for validating a minimal set of important features.
+These are the most important features without which (or when they're faulty) the software wouldn't
+work properly. Thus, if any failure occurs while testing these features,
+there isn't that much of a reason to continue testing, as the software is fundamentally broken.
+
+These tests don't have to include only DPDK tests, as the reason for failures could be
+in the infrastructure (a faulty link between NICs or a misconfiguration).
+"""
+
import re
from framework.config import PortConfig
@@ -11,23 +22,39 @@
class SmokeTests(TestSuite):
+ """DPDK and infrastructure smoke test suite.
+
+ The test cases validate the most basic DPDK functionality needed for all other test suites.
+ The infrastructure also needs to be tested, as that is also used by all other test suites.
+
+ Attributes:
+ is_blocking: This test suite will block the execution of all other test suites
+ in the build target after it.
+ nics_in_node: The NICs present on the SUT node.
+ """
+
is_blocking = True
# dicts in this list are expected to have two keys:
# "pci_address" and "current_driver"
nics_in_node: list[PortConfig] = []
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
- Set the build directory path and generate a list of NICs in the SUT node.
+ Set the build directory path and a list of NICs in the SUT node.
"""
self.dpdk_build_dir_path = self.sut_node.remote_dpdk_build_dir
self.nics_in_node = self.sut_node.config.ports
def test_unit_tests(self) -> None:
- """
+ """DPDK meson ``fast-tests`` unit tests.
+
+ Test that all unit tests from the ``fast-tests`` suite pass.
+ The suite is a subset with only the most basic tests.
+
Test:
- Run the fast-test unit-test suite through meson.
+ Run the ``fast-tests`` unit test suite through meson.
"""
self.sut_node.main_session.send_command(
f"meson test -C {self.dpdk_build_dir_path} --suite fast-tests -t 60",
@@ -37,9 +64,14 @@ def test_unit_tests(self) -> None:
)
def test_driver_tests(self) -> None:
- """
+ """DPDK meson ``driver-tests`` unit tests.
+
+ Test that all unit tests from the ``driver-tests`` suite pass.
+ The suite is a subset with driver tests. This suite may be run with virtual devices
+ configured in the test run configuration.
+
Test:
- Run the driver-test unit-test suite through meson.
+ Run the ``driver-tests`` unit test suite through meson.
"""
vdev_args = ""
for dev in self.sut_node.virtual_devices:
@@ -60,9 +92,12 @@ def test_driver_tests(self) -> None:
)
def test_devices_listed_in_testpmd(self) -> None:
- """
+ """Testpmd device discovery.
+
+ Test that the devices configured in the test run configuration are found in testpmd.
+
Test:
- Uses testpmd driver to verify that devices have been found by testpmd.
+ List all devices found in testpmd and verify the configured devices are among them.
"""
testpmd_driver = self.sut_node.create_interactive_shell(TestPmdShell, privileged=True)
dev_list = [str(x) for x in testpmd_driver.get_devices()]
@@ -74,10 +109,14 @@ def test_devices_listed_in_testpmd(self) -> None:
)
def test_device_bound_to_driver(self) -> None:
- """
+ """Device driver in OS.
+
+ Test that the devices configured in the test run configuration are bound to
+ the proper driver.
+
Test:
- Ensure that all drivers listed in the config are bound to the correct
- driver.
+ List all devices with the ``dpdk-devbind.py`` script and verify that
+ the configured devices are bound to the proper driver.
"""
path_to_devbind = self.sut_node.path_to_devbind_script
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v8 00/21] dts: docstrings update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (20 preceding siblings ...)
2023-11-23 15:13 ` [PATCH v8 21/21] dts: test suites " Juraj Linkeš
@ 2023-12-01 16:00 ` Yoan Picchi
2023-12-01 18:23 ` Jeremy Spewock
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
22 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-12-01 16:00 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro
Cc: dev
On 11/23/23 15:13, Juraj Linkeš wrote:
> The first commit makes changes to the code. These code changes mainly
> change the structure of the code so that the actual API docs generation
> works. There are also some code changes which get reflected in the
> documentation, such as making functions/methods/attributes private or
> public.
>
> The rest of the commits deal with the actual docstring documentation
> (from which the API docs are generated). The format of the docstrings
> is the Google format [0] with PEP257 [1] and some guidelines captured
> in the last commit of this group covering what the Google format
> doesn't.
> The docstring updates are split into many commits to make review
> possible. When accepted, they may be squashed.
> The docstrings have been composed in anticipation of [2], adhering to
> maximum line length of 100. We don't have a tool for automatic docstring
> formatting, hence the usage of 100 right away to save time.
>
> NOTE: The logger.py module is not fully documented, as it's being
> refactored and the refactor will be submitted in the near future.
> Documenting it now seems unnecessary.
>
> [0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
> [1] https://peps.python.org/pep-0257/
> [2] https://patches.dpdk.org/project/dpdk/list/?series=29844
>
> v7:
> Split the series into docstrings and api docs generation and addressed
> comments.
>
> v8:
> Addressed review comments, all of which were pretty minor - small
> grammatical changes, a little bit of rewording to remove confusion here
> and there, additional explanations and so on.
>
> Juraj Linkeš (21):
> dts: code adjustments for doc generation
> dts: add docstring checker
> dts: add basic developer docs
> dts: exceptions docstring update
> dts: settings docstring update
> dts: logger and utils docstring update
> dts: dts runner and main docstring update
> dts: test suite docstring update
> dts: test result docstring update
> dts: config docstring update
> dts: remote session docstring update
> dts: interactive remote session docstring update
> dts: port and virtual device docstring update
> dts: cpu docstring update
> dts: os session docstring update
> dts: posix and linux sessions docstring update
> dts: node docstring update
> dts: sut and tg nodes docstring update
> dts: base traffic generators docstring update
> dts: scapy tg docstring update
> dts: test suites docstring update
>
> doc/guides/tools/dts.rst | 73 +++
> dts/framework/__init__.py | 12 +-
> dts/framework/config/__init__.py | 375 +++++++++++++---
> dts/framework/config/types.py | 132 ++++++
> dts/framework/dts.py | 162 +++++--
> dts/framework/exception.py | 156 ++++---
> dts/framework/logger.py | 72 ++-
> dts/framework/remote_session/__init__.py | 80 ++--
> .../interactive_remote_session.py | 36 +-
> .../remote_session/interactive_shell.py | 150 +++++++
> dts/framework/remote_session/os_session.py | 284 ------------
> dts/framework/remote_session/python_shell.py | 32 ++
> .../remote_session/remote/__init__.py | 27 --
> .../remote/interactive_shell.py | 131 ------
> .../remote_session/remote/python_shell.py | 12 -
> .../remote_session/remote/remote_session.py | 168 -------
> .../remote_session/remote/testpmd_shell.py | 45 --
> .../remote_session/remote_session.py | 230 ++++++++++
> .../{remote => }/ssh_session.py | 28 +-
> dts/framework/remote_session/testpmd_shell.py | 83 ++++
> dts/framework/settings.py | 188 ++++++--
> dts/framework/test_result.py | 301 ++++++++++---
> dts/framework/test_suite.py | 236 +++++++---
> dts/framework/testbed_model/__init__.py | 29 +-
> dts/framework/testbed_model/{hw => }/cpu.py | 209 ++++++---
> dts/framework/testbed_model/hw/__init__.py | 27 --
> dts/framework/testbed_model/hw/port.py | 60 ---
> .../testbed_model/hw/virtual_device.py | 16 -
> .../linux_session.py | 70 ++-
> dts/framework/testbed_model/node.py | 214 ++++++---
> dts/framework/testbed_model/os_session.py | 422 ++++++++++++++++++
> dts/framework/testbed_model/port.py | 93 ++++
> .../posix_session.py | 85 +++-
> dts/framework/testbed_model/sut_node.py | 238 ++++++----
> dts/framework/testbed_model/tg_node.py | 69 ++-
> .../testbed_model/traffic_generator.py | 72 ---
> .../traffic_generator/__init__.py | 43 ++
> .../capturing_traffic_generator.py | 49 +-
> .../{ => traffic_generator}/scapy.py | 110 +++--
> .../traffic_generator/traffic_generator.py | 85 ++++
> dts/framework/testbed_model/virtual_device.py | 29 ++
> dts/framework/utils.py | 122 ++---
> dts/main.py | 19 +-
> dts/poetry.lock | 12 +-
> dts/pyproject.toml | 6 +-
> dts/tests/TestSuite_hello_world.py | 16 +-
> dts/tests/TestSuite_os_udp.py | 20 +-
> dts/tests/TestSuite_smoke_tests.py | 61 ++-
> 48 files changed, 3506 insertions(+), 1683 deletions(-)
> create mode 100644 dts/framework/config/types.py
> rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (76%)
> create mode 100644 dts/framework/remote_session/interactive_shell.py
> delete mode 100644 dts/framework/remote_session/os_session.py
> create mode 100644 dts/framework/remote_session/python_shell.py
> delete mode 100644 dts/framework/remote_session/remote/__init__.py
> delete mode 100644 dts/framework/remote_session/remote/interactive_shell.py
> delete mode 100644 dts/framework/remote_session/remote/python_shell.py
> delete mode 100644 dts/framework/remote_session/remote/remote_session.py
> delete mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
> create mode 100644 dts/framework/remote_session/remote_session.py
> rename dts/framework/remote_session/{remote => }/ssh_session.py (82%)
> create mode 100644 dts/framework/remote_session/testpmd_shell.py
> rename dts/framework/testbed_model/{hw => }/cpu.py (50%)
> delete mode 100644 dts/framework/testbed_model/hw/__init__.py
> delete mode 100644 dts/framework/testbed_model/hw/port.py
> delete mode 100644 dts/framework/testbed_model/hw/virtual_device.py
> rename dts/framework/{remote_session => testbed_model}/linux_session.py (77%)
> create mode 100644 dts/framework/testbed_model/os_session.py
> create mode 100644 dts/framework/testbed_model/port.py
> rename dts/framework/{remote_session => testbed_model}/posix_session.py (73%)
> delete mode 100644 dts/framework/testbed_model/traffic_generator.py
> create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
> rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (68%)
> rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (71%)
> create mode 100644 dts/framework/testbed_model/traffic_generator/traffic_generator.py
> create mode 100644 dts/framework/testbed_model/virtual_device.py
>
Reviewed-by: Yoan Picchi <yoan.picchi@arm.com>
* Re: [PATCH v8 00/21] dts: docstrings update
2023-12-01 16:00 ` [PATCH v8 00/21] dts: docstrings update Yoan Picchi
@ 2023-12-01 18:23 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2023-12-01 18:23 UTC (permalink / raw)
To: Yoan Picchi
Cc: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
Luca.Vizzarro, dev
Hey Juraj,
I looked through all the patches and left a few comments. All of the
comments I left though were very minor comments about spelling/grammar on a
few patches. Otherwise this all looks good to me.
* [PATCH v9 00/21] dts: docstrings update
2023-11-23 15:13 ` [PATCH v8 00/21] dts: docstrings update Juraj Linkeš
` (21 preceding siblings ...)
2023-12-01 16:00 ` [PATCH v8 00/21] dts: docstrings update Yoan Picchi
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 01/21] dts: code adjustments for doc generation Juraj Linkeš
` (21 more replies)
22 siblings, 22 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The first commit makes changes to the code. These code changes mainly
change the structure of the code so that the actual API docs generation
works. There are also some code changes which get reflected in the
documentation, such as making functions/methods/attributes private or
public.
The rest of the commits deal with the actual docstring documentation
(from which the API docs are generated). The format of the docstrings
is the Google format [0] with PEP257 [1] and some guidelines captured
in the last commit of this group covering what the Google format
doesn't.
The docstring updates are split into many commits to make review
possible. When accepted, they may be squashed.
The docstrings have been composed in anticipation of [2], adhering to
maximum line length of 100. We don't have a tool for automatic docstring
formatting, hence the usage of 100 right away to save time.
NOTE: The logger.py module is not fully documented, as it's being
refactored and the refactor will be submitted in the near future.
Documenting it now seems unnecessary.
[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
[1] https://peps.python.org/pep-0257/
[2] https://patches.dpdk.org/project/dpdk/list/?series=29844
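For illustration, a docstring in the Google format [0] with PEP 257 [1] conventions might look like the following. This is a hypothetical function sketched for this cover letter, not code from the series:

```python
def count_ports(node_name: str, active_only: bool = False) -> int:
    """Count the ports configured on a node.

    A one-line summary, then a blank line and a longer description,
    per PEP 257; the sections below follow the Google format.

    Args:
        node_name: The name of the node in the test run configuration.
        active_only: If :data:`True`, count only ports that are up.

    Returns:
        The number of matching ports.

    Raises:
        ValueError: If ``node_name`` is empty.
    """
    if not node_name:
        raise ValueError("node_name must not be empty")
    # Placeholder logic so the sketch is runnable.
    return 2 if active_only else 4
```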
v7:
Split the series into docstrings and api docs generation and addressed
comments.
v8:
Addressed review comments, all of which were pretty minor - small
gramatical changes, a little bit of rewording to remove confusion here
and there, additional explanations and so on.
v9:
Addressed review comments, again all minor grammar fixes.
Juraj Linkeš (21):
dts: code adjustments for doc generation
dts: add docstring checker
dts: add basic developer docs
dts: exceptions docstring update
dts: settings docstring update
dts: logger and utils docstring update
dts: dts runner and main docstring update
dts: test suite docstring update
dts: test result docstring update
dts: config docstring update
dts: remote session docstring update
dts: interactive remote session docstring update
dts: port and virtual device docstring update
dts: cpu docstring update
dts: os session docstring update
dts: posix and linux sessions docstring update
dts: node docstring update
dts: sut and tg nodes docstring update
dts: base traffic generators docstring update
dts: scapy tg docstring update
dts: test suites docstring update
doc/guides/tools/dts.rst | 73 +++
dts/framework/__init__.py | 12 +-
dts/framework/config/__init__.py | 375 +++++++++++++---
dts/framework/config/types.py | 132 ++++++
dts/framework/dts.py | 162 +++++--
dts/framework/exception.py | 156 ++++---
dts/framework/logger.py | 72 ++-
dts/framework/remote_session/__init__.py | 80 ++--
.../interactive_remote_session.py | 36 +-
.../remote_session/interactive_shell.py | 150 +++++++
dts/framework/remote_session/os_session.py | 284 ------------
dts/framework/remote_session/python_shell.py | 32 ++
.../remote_session/remote/__init__.py | 27 --
.../remote/interactive_shell.py | 131 ------
.../remote_session/remote/python_shell.py | 12 -
.../remote_session/remote/remote_session.py | 168 -------
.../remote_session/remote/testpmd_shell.py | 45 --
.../remote_session/remote_session.py | 230 ++++++++++
.../{remote => }/ssh_session.py | 28 +-
dts/framework/remote_session/testpmd_shell.py | 84 ++++
dts/framework/settings.py | 188 ++++++--
dts/framework/test_result.py | 301 ++++++++++---
dts/framework/test_suite.py | 236 +++++++---
dts/framework/testbed_model/__init__.py | 29 +-
dts/framework/testbed_model/{hw => }/cpu.py | 209 ++++++---
dts/framework/testbed_model/hw/__init__.py | 27 --
dts/framework/testbed_model/hw/port.py | 60 ---
.../testbed_model/hw/virtual_device.py | 16 -
.../linux_session.py | 70 ++-
dts/framework/testbed_model/node.py | 214 ++++++---
dts/framework/testbed_model/os_session.py | 422 ++++++++++++++++++
dts/framework/testbed_model/port.py | 93 ++++
.../posix_session.py | 85 +++-
dts/framework/testbed_model/sut_node.py | 238 ++++++----
dts/framework/testbed_model/tg_node.py | 69 ++-
.../traffic_generator/__init__.py | 43 ++
.../capturing_traffic_generator.py | 49 +-
.../{ => traffic_generator}/scapy.py | 110 +++--
.../traffic_generator.py | 47 +-
dts/framework/testbed_model/virtual_device.py | 29 ++
dts/framework/utils.py | 122 ++---
dts/main.py | 19 +-
dts/poetry.lock | 12 +-
dts/pyproject.toml | 6 +-
dts/tests/TestSuite_hello_world.py | 16 +-
dts/tests/TestSuite_os_udp.py | 20 +-
dts/tests/TestSuite_smoke_tests.py | 61 ++-
47 files changed, 3452 insertions(+), 1628 deletions(-)
create mode 100644 dts/framework/config/types.py
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (76%)
create mode 100644 dts/framework/remote_session/interactive_shell.py
delete mode 100644 dts/framework/remote_session/os_session.py
create mode 100644 dts/framework/remote_session/python_shell.py
delete mode 100644 dts/framework/remote_session/remote/__init__.py
delete mode 100644 dts/framework/remote_session/remote/interactive_shell.py
delete mode 100644 dts/framework/remote_session/remote/python_shell.py
delete mode 100644 dts/framework/remote_session/remote/remote_session.py
delete mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
create mode 100644 dts/framework/remote_session/remote_session.py
rename dts/framework/remote_session/{remote => }/ssh_session.py (82%)
create mode 100644 dts/framework/remote_session/testpmd_shell.py
rename dts/framework/testbed_model/{hw => }/cpu.py (50%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
delete mode 100644 dts/framework/testbed_model/hw/port.py
delete mode 100644 dts/framework/testbed_model/hw/virtual_device.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (77%)
create mode 100644 dts/framework/testbed_model/os_session.py
create mode 100644 dts/framework/testbed_model/port.py
rename dts/framework/{remote_session => testbed_model}/posix_session.py (73%)
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (68%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (71%)
rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (51%)
create mode 100644 dts/framework/testbed_model/virtual_device.py
--
2.34.1
* [PATCH v9 01/21] dts: code adjustments for doc generation
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 02/21] dts: add docstring checker Juraj Linkeš
` (20 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The standard Python tool for generating API documentation, Sphinx,
imports modules one-by-one when generating the documentation. This
requires code changes:
* properly guarding argument parsing in the if __name__ == '__main__'
block,
* the logger used by DTS runner underwent the same treatment so that it
doesn't create log files outside of a DTS run,
* however, DTS uses the arguments to construct an object holding global
variables, so the defaults for those globals had to be moved out of
argument parsing into the object itself,
* importing the remote_session module from framework resulted in
circular imports because of one module trying to import another
module. This is fixed by reorganizing the code,
* some code reorganization was done because the resulting structure
makes more sense, improving documentation clarity.
There are some other documentation-related changes:
* added missing type annotations so they appear in the generated docs,
* reordered arguments in some methods,
* removed superfluous arguments and attributes,
* changed public functions/methods/attributes to private and vice-versa.
All of the above appear in the generated documentation, and with these
changes, the documentation is improved.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 8 +-
dts/framework/dts.py | 31 +++++--
dts/framework/exception.py | 54 +++++-------
dts/framework/remote_session/__init__.py | 41 +++++----
.../interactive_remote_session.py | 0
.../{remote => }/interactive_shell.py | 0
.../{remote => }/python_shell.py | 0
.../remote_session/remote/__init__.py | 27 ------
.../{remote => }/remote_session.py | 0
.../{remote => }/ssh_session.py | 12 +--
.../{remote => }/testpmd_shell.py | 0
dts/framework/settings.py | 85 +++++++++++--------
dts/framework/test_result.py | 4 +-
dts/framework/test_suite.py | 7 +-
dts/framework/testbed_model/__init__.py | 12 +--
dts/framework/testbed_model/{hw => }/cpu.py | 13 +++
dts/framework/testbed_model/hw/__init__.py | 27 ------
.../linux_session.py | 6 +-
dts/framework/testbed_model/node.py | 23 +++--
.../os_session.py | 22 ++---
dts/framework/testbed_model/{hw => }/port.py | 0
.../posix_session.py | 4 +-
dts/framework/testbed_model/sut_node.py | 8 +-
dts/framework/testbed_model/tg_node.py | 29 +------
.../traffic_generator/__init__.py | 23 +++++
.../capturing_traffic_generator.py | 4 +-
.../{ => traffic_generator}/scapy.py | 19 ++---
.../traffic_generator.py | 14 ++-
.../testbed_model/{hw => }/virtual_device.py | 0
dts/framework/utils.py | 40 +++------
dts/main.py | 9 +-
31 files changed, 244 insertions(+), 278 deletions(-)
rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
delete mode 100644 dts/framework/remote_session/remote/__init__.py
rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
delete mode 100644 dts/framework/testbed_model/hw/__init__.py
rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
rename dts/framework/testbed_model/{hw => }/port.py (100%)
rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (98%)
rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (81%)
rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index 9b32cf0532..ef25a463c0 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -17,6 +17,7 @@
import warlock # type: ignore[import]
import yaml
+from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict):
+ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
# This looks useless now, but is designed to allow expansion to traffic
# generators that require more configuration later.
match TrafficGeneratorType(d["type"]):
@@ -97,6 +98,8 @@ def from_dict(d: dict):
return ScapyTrafficGeneratorConfig(
traffic_generator_type=TrafficGeneratorType.SCAPY
)
+ case _:
+ raise ConfigurationError(f'Unknown traffic generator type "{d["type"]}".')
@dataclass(slots=True, frozen=True)
@@ -314,6 +317,3 @@ def load_config() -> Configuration:
config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
config_obj: Configuration = Configuration.from_dict(dict(config))
return config_obj
-
-
-CONFIGURATION = load_config()
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 25d6942d81..356368ef10 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -6,19 +6,19 @@
import sys
from .config import (
- CONFIGURATION,
BuildTargetConfiguration,
ExecutionConfiguration,
TestSuiteConfig,
+ load_config,
)
from .exception import BlockingTestSuiteError
from .logger import DTSLOG, getLogger
from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
from .test_suite import get_test_suites
from .testbed_model import SutNode, TGNode
-from .utils import check_dts_python_version
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG = None # type: ignore[assignment]
result: DTSResult = DTSResult(dts_logger)
@@ -30,14 +30,18 @@ def run_all() -> None:
global dts_logger
global result
+ # create a regular DTS logger and create a new result with it
+ dts_logger = getLogger("DTSRunner")
+ result = DTSResult(dts_logger)
+
# check the python version of the server that run dts
- check_dts_python_version()
+ _check_dts_python_version()
sut_nodes: dict[str, SutNode] = {}
tg_nodes: dict[str, TGNode] = {}
try:
# for all Execution sections
- for execution in CONFIGURATION.executions:
+ for execution in load_config().executions:
sut_node = sut_nodes.get(execution.system_under_test_node.name)
tg_node = tg_nodes.get(execution.traffic_generator_node.name)
@@ -82,6 +86,23 @@ def run_all() -> None:
_exit_dts()
+def _check_dts_python_version() -> None:
+ def RED(text: str) -> str:
+ return f"\u001B[31;1m{str(text)}\u001B[0m"
+
+ if sys.version_info.major < 3 or (sys.version_info.major == 3 and sys.version_info.minor < 10):
+ print(
+ RED(
+ (
"WARNING: DTS execution node's Python version is lower than "
"Python 3.10; this is deprecated and will not work in future releases."
+ )
+ ),
+ file=sys.stderr,
+ )
+ print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+
+
def _run_execution(
sut_node: SutNode,
tg_node: TGNode,
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index b362e42924..151e4d3aa9 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
Command execution timeout.
"""
- command: str
- output: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _command: str
- def __init__(self, command: str, output: str):
- self.command = command
- self.output = output
+ def __init__(self, command: str):
+ self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self.command}"
-
- def get_output(self) -> str:
- return self.output
+ return f"TIMEOUT on {self._command}"
class SSHConnectionError(DTSError):
@@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
SSH connection error.
"""
- host: str
- errors: list[str]
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
+ _errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
- self.host = host
- self.errors = [] if errors is None else errors
+ self._host = host
+ self._errors = [] if errors is None else errors
def __str__(self) -> str:
- message = f"Error trying to connect with {self.host}."
- if self.errors:
- message += f" Errors encountered while retrying: {', '.join(self.errors)}"
+ message = f"Error trying to connect with {self._host}."
+ if self._errors:
+ message += f" Errors encountered while retrying: {', '.join(self._errors)}"
return message
@@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
It can no longer be used.
"""
- host: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+ _host: str
def __init__(self, host: str):
- self.host = host
+ self._host = host
def __str__(self) -> str:
- return f"SSH session with {self.host} has died"
+ return f"SSH session with {self._host} has died"
class ConfigurationError(DTSError):
@@ -107,16 +102,16 @@ class RemoteCommandExecutionError(DTSError):
Raised when a command executed on a Node returns a non-zero exit status.
"""
- command: str
- command_return_code: int
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ command: str
+ _command_return_code: int
def __init__(self, command: str, command_return_code: int):
self.command = command
- self.command_return_code = command_return_code
+ self._command_return_code = command_return_code
def __str__(self) -> str:
- return f"Command {self.command} returned a non-zero exit code: {self.command_return_code}"
+ return f"Command {self.command} returned a non-zero exit code: {self._command_return_code}"
class RemoteDirectoryExistsError(DTSError):
@@ -140,22 +135,15 @@ class TestCaseVerifyError(DTSError):
Used in test cases to verify the expected behavior.
"""
- value: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
- def __init__(self, value: str):
- self.value = value
-
- def __str__(self) -> str:
- return repr(self.value)
-
class BlockingTestSuiteError(DTSError):
- suite_name: str
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
+ _suite_name: str
def __init__(self, suite_name: str) -> None:
- self.suite_name = suite_name
+ self._suite_name = suite_name
def __str__(self) -> str:
- return f"Blocking suite {self.suite_name} failed."
+ return f"Blocking suite {self._suite_name} failed."
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 6124417bd7..5e7ddb2b05 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -12,27 +12,24 @@
# pylama:ignore=W0611
-from framework.config import OS, NodeConfiguration
-from framework.exception import ConfigurationError
+from framework.config import NodeConfiguration
from framework.logger import DTSLOG
-from .linux_session import LinuxSession
-from .os_session import InteractiveShellType, OSSession
-from .remote import (
- CommandResult,
- InteractiveRemoteSession,
- InteractiveShell,
- PythonShell,
- RemoteSession,
- SSHSession,
- TestPmdDevice,
- TestPmdShell,
-)
-
-
-def create_session(node_config: NodeConfiguration, name: str, logger: DTSLOG) -> OSSession:
- match node_config.os:
- case OS.linux:
- return LinuxSession(node_config, name, logger)
- case _:
- raise ConfigurationError(f"Unsupported OS {node_config.os}")
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
+from .python_shell import PythonShell
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
+
+
+def create_remote_session(
+ node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> RemoteSession:
+ return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+ node_config: NodeConfiguration, logger: DTSLOG
+) -> InteractiveRemoteSession:
+ return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_remote_session.py
rename to dts/framework/remote_session/interactive_remote_session.py
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_shell.py
rename to dts/framework/remote_session/interactive_shell.py
diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/python_shell.py
rename to dts/framework/remote_session/python_shell.py
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 06403691a5..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-# Copyright(c) 2023 University of New Hampshire
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .interactive_remote_session import InteractiveRemoteSession
-from .interactive_shell import InteractiveShell
-from .python_shell import PythonShell
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-from .testpmd_shell import TestPmdDevice, TestPmdShell
-
-
-def create_remote_session(
- node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
- return SSHSession(node_config, name, logger)
-
-
-def create_interactive_session(
- node_config: NodeConfiguration, logger: DTSLOG
-) -> InteractiveRemoteSession:
- return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 91%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
index 1a7ee649ab..a467033a13 100644
--- a/dts/framework/remote_session/remote/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -18,9 +18,7 @@
SSHException,
)
-from framework.config import NodeConfiguration
from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
-from framework.logger import DTSLOG
from .remote_session import CommandResult, RemoteSession
@@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
session: Connection
- def __init__(
- self,
- node_config: NodeConfiguration,
- session_name: str,
- logger: DTSLOG,
- ):
- super(SSHSession, self).__init__(node_config, session_name, logger)
-
def _connect(self) -> None:
errors = []
retry_attempts = 10
@@ -111,7 +101,7 @@ def _send_command(self, command: str, timeout: float, env: dict | None) -> Comma
except CommandTimedOut as e:
self._logger.exception(e)
- raise SSHTimeoutError(command, e.result.stderr) from e
+ raise SSHTimeoutError(command) from e
return CommandResult(self.name, command, output.stdout, output.stderr, output.return_code)
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/testpmd_shell.py
rename to dts/framework/remote_session/testpmd_shell.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 974793a11a..25b5dcff22 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, TypeVar
@@ -22,8 +22,8 @@ def __init__(
option_strings: Sequence[str],
dest: str,
nargs: str | int | None = None,
- const: str | None = None,
- default: str = None,
+ const: bool | None = None,
+ default: Any = None,
type: Callable[[str], _T | argparse.FileType | None] = None,
choices: Iterable[_T] | None = None,
required: bool = False,
@@ -32,6 +32,12 @@ def __init__(
) -> None:
env_var_value = os.environ.get(env_var)
default = env_var_value or default
+ if const is not None:
+ nargs = 0
+ default = const if env_var_value else default
+ type = None
+ choices = None
+ metavar = None
super(_EnvironmentArgument, self).__init__(
option_strings,
dest,
@@ -52,22 +58,28 @@ def __call__(
values: Any,
option_string: str = None,
) -> None:
- setattr(namespace, self.dest, values)
+ if self.const is not None:
+ setattr(namespace, self.dest, self.const)
+ else:
+ setattr(namespace, self.dest, values)
return _EnvironmentArgument
-@dataclass(slots=True, frozen=True)
-class _Settings:
- config_file_path: str
- output_dir: str
- timeout: float
- verbose: bool
- skip_setup: bool
- dpdk_tarball_path: Path
- compile_timeout: float
- test_cases: list
- re_run: int
+@dataclass(slots=True)
+class Settings:
+ config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ output_dir: str = "output"
+ timeout: float = 15
+ verbose: bool = False
+ skip_setup: bool = False
+ dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ compile_timeout: float = 1200
+ test_cases: list[str] = field(default_factory=list)
+ re_run: int = 0
+
+
+SETTINGS: Settings = Settings()
def _get_parser() -> argparse.ArgumentParser:
@@ -80,7 +92,8 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--config-file",
action=_env_arg("DTS_CFG_FILE"),
- default="conf.yaml",
+ default=SETTINGS.config_file_path,
+ type=Path,
help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs and targets.",
)
@@ -88,7 +101,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--output-dir",
"--output",
action=_env_arg("DTS_OUTPUT_DIR"),
- default="output",
+ default=SETTINGS.output_dir,
help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
)
@@ -96,7 +109,7 @@ def _get_parser() -> argparse.ArgumentParser:
"-t",
"--timeout",
action=_env_arg("DTS_TIMEOUT"),
- default=15,
+ default=SETTINGS.timeout,
type=float,
help="[DTS_TIMEOUT] The default timeout for all DTS operations except for compiling DPDK.",
)
@@ -105,8 +118,9 @@ def _get_parser() -> argparse.ArgumentParser:
"-v",
"--verbose",
action=_env_arg("DTS_VERBOSE"),
- default="N",
- help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
+ default=SETTINGS.verbose,
+ const=True,
+ help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
"to the console.",
)
@@ -114,8 +128,8 @@ def _get_parser() -> argparse.ArgumentParser:
"-s",
"--skip-setup",
action=_env_arg("DTS_SKIP_SETUP"),
- default="N",
- help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
+ const=True,
+ help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
)
parser.add_argument(
@@ -123,7 +137,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--snapshot",
"--git-ref",
action=_env_arg("DTS_DPDK_TARBALL"),
- default="dpdk.tar.xz",
+ default=SETTINGS.dpdk_tarball_path,
type=Path,
help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
"tag ID or tree ID to test. To test local changes, first commit them, "
@@ -133,7 +147,7 @@ def _get_parser() -> argparse.ArgumentParser:
parser.add_argument(
"--compile-timeout",
action=_env_arg("DTS_COMPILE_TIMEOUT"),
- default=1200,
+ default=SETTINGS.compile_timeout,
type=float,
help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
)
@@ -150,7 +164,7 @@ def _get_parser() -> argparse.ArgumentParser:
"--re-run",
"--re_run",
action=_env_arg("DTS_RERUN"),
- default=0,
+ default=SETTINGS.re_run,
type=int,
help="[DTS_RERUN] Re-run each test case the specified amount of times "
"if a test failure occurs",
@@ -159,21 +173,20 @@ def _get_parser() -> argparse.ArgumentParser:
return parser
-def _get_settings() -> _Settings:
+def get_settings() -> Settings:
parsed_args = _get_parser().parse_args()
- return _Settings(
+ return Settings(
config_file_path=parsed_args.config_file,
output_dir=parsed_args.output_dir,
timeout=parsed_args.timeout,
- verbose=(parsed_args.verbose == "Y"),
- skip_setup=(parsed_args.skip_setup == "Y"),
- dpdk_tarball_path=Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
- if not os.path.exists(parsed_args.tarball)
- else Path(parsed_args.tarball),
+ verbose=parsed_args.verbose,
+ skip_setup=parsed_args.skip_setup,
+ dpdk_tarball_path=Path(
+ Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
+ if not os.path.exists(parsed_args.tarball)
+ else Path(parsed_args.tarball)
+ ),
compile_timeout=parsed_args.compile_timeout,
- test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+ test_cases=(parsed_args.test_cases.split(",") if parsed_args.test_cases else []),
re_run=parsed_args.re_run,
)
-
-
-SETTINGS: _Settings = _get_settings()
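The settings.py hunks above move defaults into a Settings dataclass and teach the custom argparse action to fall back to an environment variable. A minimal standalone sketch of that env-var-backed action pattern (all names here are illustrative, not the DTS API):

```python
import argparse
import os


def env_arg(env_var: str) -> type[argparse.Action]:
    """Build an argparse Action whose default falls back to an environment variable."""

    class _EnvArg(argparse.Action):
        def __init__(self, option_strings, dest, **kwargs):
            # If the environment variable is set, it overrides the static default.
            env_value = os.environ.get(env_var)
            if env_value is not None:
                kwargs["default"] = env_value
            super().__init__(option_strings, dest, **kwargs)

        def __call__(self, parser, namespace, values, option_string=None):
            setattr(namespace, self.dest, values)

    return _EnvArg


parser = argparse.ArgumentParser()
parser.add_argument("--timeout", action=env_arg("DEMO_TIMEOUT"), type=float, default=15)
args = parser.parse_args([])
print(args.timeout)  # 15 unless DEMO_TIMEOUT is set in the environment
```

Note that argparse applies ``type`` to string defaults, so an env-var value like "20" is converted to 20.0 just as a command-line value would be.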
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 4c2e7e2418..57090feb04 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -246,7 +246,7 @@ def add_build_target(self, build_target: BuildTargetConfiguration) -> BuildTarge
self._inner_results.append(build_target_result)
return build_target_result
- def add_sut_info(self, sut_info: NodeInfo):
+ def add_sut_info(self, sut_info: NodeInfo) -> None:
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
@@ -289,7 +289,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
self._inner_results.append(execution_result)
return execution_result
- def add_error(self, error) -> None:
+ def add_error(self, error: Exception) -> None:
self._errors.append(error)
def process(self) -> None:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 4a7907ec33..f9e66e814a 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -11,7 +11,7 @@
import re
from ipaddress import IPv4Interface, IPv6Interface, ip_interface
from types import MethodType
-from typing import Union
+from typing import Any, Union
from scapy.layers.inet import IP # type: ignore[import]
from scapy.layers.l2 import Ether # type: ignore[import]
@@ -26,8 +26,7 @@
from .logger import DTSLOG, getLogger
from .settings import SETTINGS
from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
-from .testbed_model import SutNode, TGNode
-from .testbed_model.hw.port import Port, PortLink
+from .testbed_model import Port, PortLink, SutNode, TGNode
from .utils import get_packet_summaries
@@ -426,7 +425,7 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
- def is_test_suite(object) -> bool:
+ def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
return True
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 5cbb859e47..8ced05653b 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,15 +9,9 @@
# pylama:ignore=W0611
-from .hw import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
- VirtualDevice,
- lcore_filter,
-)
+from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
from .node import Node
+from .port import Port, PortLink
from .sut_node import SutNode
from .tg_node import TGNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/testbed_model/cpu.py
index cbc5fe7fff..1b392689f5 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -262,3 +262,16 @@ def filter(self) -> list[LogicalCore]:
)
return filtered_lcores
+
+
+def lcore_filter(
+ core_list: list[LogicalCore],
+ filter_specifier: LogicalCoreCount | LogicalCoreList,
+ ascending: bool,
+) -> LogicalCoreFilter:
+ if isinstance(filter_specifier, LogicalCoreList):
+ return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+ elif isinstance(filter_specifier, LogicalCoreCount):
+ return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+ else:
+ raise ValueError(f"Unsupported filter {filter_specifier!r}")
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
- LogicalCore,
- LogicalCoreCount,
- LogicalCoreCountFilter,
- LogicalCoreFilter,
- LogicalCoreList,
- LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
- core_list: list[LogicalCore],
- filter_specifier: LogicalCoreCount | LogicalCoreList,
- ascending: bool,
-) -> LogicalCoreFilter:
- if isinstance(filter_specifier, LogicalCoreList):
- return LogicalCoreListFilter(core_list, filter_specifier, ascending)
- elif isinstance(filter_specifier, LogicalCoreCount):
- return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
- else:
- raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
similarity index 97%
rename from dts/framework/remote_session/linux_session.py
rename to dts/framework/testbed_model/linux_session.py
index fd877fbfae..055765ba2d 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -9,10 +9,10 @@
from typing_extensions import NotRequired
from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
from framework.utils import expand_range
+from .cpu import LogicalCore
+from .port import Port
from .posix_session import PosixSession
@@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
lcores.append(LogicalCore(lcore, core, socket, node))
return lcores
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return dpdk_prefix
def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index ef700d8114..b313b5ad54 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,23 +12,26 @@
from typing import Any, Callable, Type, Union
from framework.config import (
+ OS,
BuildTargetConfiguration,
ExecutionConfiguration,
NodeConfiguration,
)
+from framework.exception import ConfigurationError
from framework.logger import DTSLOG, getLogger
-from framework.remote_session import InteractiveShellType, OSSession, create_session
from framework.settings import SETTINGS
-from .hw import (
+from .cpu import (
LogicalCore,
LogicalCoreCount,
LogicalCoreList,
LogicalCoreListFilter,
- VirtualDevice,
lcore_filter,
)
-from .hw.port import Port
+from .linux_session import LinuxSession
+from .os_session import InteractiveShellType, OSSession
+from .port import Port
+from .virtual_device import VirtualDevice
class Node(ABC):
@@ -168,9 +171,9 @@ def create_interactive_shell(
return self.main_session.create_interactive_shell(
shell_cls,
- app_args,
timeout,
privileged,
+ app_args,
)
def filter_lcores(
@@ -201,7 +204,7 @@ def _get_remote_cpus(self) -> None:
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
- def _setup_hugepages(self):
+ def _setup_hugepages(self) -> None:
"""
Setup hugepages on the Node. Different architectures can supply different
amounts of memory for hugepages and numa-based hugepage allocation may need
@@ -245,3 +248,11 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
return lambda *args: None
else:
return func
+
+
+def create_session(node_config: NodeConfiguration, name: str, logger: DTSLOG) -> OSSession:
+ match node_config.os:
+ case OS.linux:
+ return LinuxSession(node_config, name, logger)
+ case _:
+ raise ConfigurationError(f"Unsupported OS {node_config.os}")
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
similarity index 95%
rename from dts/framework/remote_session/os_session.py
rename to dts/framework/testbed_model/os_session.py
index 8a709eac1c..76e595a518 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -10,19 +10,19 @@
from framework.config import Architecture, NodeConfiguration, NodeInfo
from framework.logger import DTSLOG
-from framework.remote_session.remote import InteractiveShell
-from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
-from framework.utils import MesonArgs
-
-from .remote import (
+from framework.remote_session import (
CommandResult,
InteractiveRemoteSession,
+ InteractiveShell,
RemoteSession,
create_interactive_session,
create_remote_session,
)
+from framework.settings import SETTINGS
+from framework.utils import MesonArgs
+
+from .cpu import LogicalCore
+from .port import Port
InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
@@ -85,9 +85,9 @@ def send_command(
def create_interactive_shell(
self,
shell_cls: Type[InteractiveShellType],
- eal_parameters: str,
timeout: float,
privileged: bool,
+ app_args: str,
) -> InteractiveShellType:
"""
See "create_interactive_shell" in SutNode
@@ -96,7 +96,7 @@ def create_interactive_shell(
self.interactive_session.session,
self._logger,
self._get_privileged_command if privileged else None,
- eal_parameters,
+ app_args,
timeout,
)
@@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
"""
@abstractmethod
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
"""
Try to find DPDK remote dir in remote_dir.
"""
@@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
"""
@abstractmethod
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
"""
Get the DPDK file prefix that will be used when running DPDK apps.
"""
diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
similarity index 100%
rename from dts/framework/testbed_model/hw/port.py
rename to dts/framework/testbed_model/port.py
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
similarity index 98%
rename from dts/framework/remote_session/posix_session.py
rename to dts/framework/testbed_model/posix_session.py
index a29e2e8280..5657cc0bc9 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
- def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
+ def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
@@ -207,7 +207,7 @@ def _remove_dpdk_runtime_dirs(self, dpdk_runtime_dirs: Iterable[str | PurePath])
for dpdk_runtime_dir in dpdk_runtime_dirs:
self.remove_remote_dir(dpdk_runtime_dir)
- def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+ def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
return ""
def get_compiler_version(self, compiler_name: str) -> str:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 7f75043bd3..5ce9446dba 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,12 +15,14 @@
NodeInfo,
SutNodeConfiguration,
)
-from framework.remote_session import CommandResult, InteractiveShellType, OSSession
+from framework.remote_session import CommandResult
from framework.settings import SETTINGS
from framework.utils import MesonArgs
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
+from .cpu import LogicalCoreCount, LogicalCoreList
from .node import Node
+from .os_session import InteractiveShellType, OSSession
+from .virtual_device import VirtualDevice
class EalParameters(object):
@@ -293,7 +295,7 @@ def create_eal_parameters(
prefix: str = "dpdk",
append_prefix_timestamp: bool = True,
no_pci: bool = False,
- vdevs: list[VirtualDevice] = None,
+ vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
"""
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 79a55663b5..8a8f0019f3 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -16,16 +16,11 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.config import (
- ScapyTrafficGeneratorConfig,
- TGNodeConfiguration,
- TrafficGeneratorType,
-)
-from framework.exception import ConfigurationError
-
-from .capturing_traffic_generator import CapturingTrafficGenerator
-from .hw.port import Port
+from framework.config import TGNodeConfiguration
+
from .node import Node
+from .port import Port
+from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
class TGNode(Node):
@@ -78,19 +73,3 @@ def close(self) -> None:
"""Free all resources used by the node"""
self.traffic_generator.close()
super(TGNode, self).close()
-
-
-def create_traffic_generator(
- tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
-) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
-
- from .scapy import ScapyTrafficGenerator
-
- match traffic_generator_config.traffic_generator_type:
- case TrafficGeneratorType.SCAPY:
- return ScapyTrafficGenerator(tg_node, traffic_generator_config)
- case _:
- raise ConfigurationError(
- f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
- )
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
new file mode 100644
index 0000000000..52888d03fa
--- /dev/null
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -0,0 +1,23 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
+from framework.exception import ConfigurationError
+from framework.testbed_model.node import Node
+
+from .capturing_traffic_generator import CapturingTrafficGenerator
+from .scapy import ScapyTrafficGenerator
+
+
+def create_traffic_generator(
+ tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
+) -> CapturingTrafficGenerator:
+ """A factory function for creating traffic generator object from user config."""
+
+ match traffic_generator_config.traffic_generator_type:
+ case TrafficGeneratorType.SCAPY:
+ return ScapyTrafficGenerator(tg_node, traffic_generator_config)
+ case _:
+ raise ConfigurationError(
+ f"Unknown traffic generator: {traffic_generator_config.traffic_generator_type}"
+ )
diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
similarity index 98%
rename from dts/framework/testbed_model/capturing_traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index e6512061d7..1fc7f98c05 100644
--- a/dts/framework/testbed_model/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -16,9 +16,9 @@
from scapy.packet import Packet # type: ignore[import]
from framework.settings import SETTINGS
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
from .traffic_generator import TrafficGenerator
@@ -127,7 +127,7 @@ def _send_packets_and_capture(
for the specified duration. It must be able to handle no received packets.
"""
- def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
+ def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]) -> None:
file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
self._logger.debug(f"Writing packets to {file_name}.")
scapy.utils.wrpcap(file_name, packets)
diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
similarity index 95%
rename from dts/framework/testbed_model/scapy.py
rename to dts/framework/testbed_model/traffic_generator/scapy.py
index 9083e92b3d..c88cf28369 100644
--- a/dts/framework/testbed_model/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -24,16 +24,15 @@
from scapy.packet import Packet # type: ignore[import]
from framework.config import OS, ScapyTrafficGeneratorConfig
-from framework.logger import DTSLOG, getLogger
from framework.remote_session import PythonShell
from framework.settings import SETTINGS
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from .capturing_traffic_generator import (
CapturingTrafficGenerator,
_get_default_capture_name,
)
-from .hw.port import Port
-from .tg_node import TGNode
"""
========= BEGIN RPC FUNCTIONS =========
@@ -144,7 +143,7 @@ def quit(self) -> None:
self._BaseServer__shutdown_request = True
return None
- def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
+ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
"""Add a function to the server.
This is meant to be executed remotely.
@@ -189,13 +188,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
session: PythonShell
rpc_server_proxy: xmlrpc.client.ServerProxy
_config: ScapyTrafficGeneratorConfig
- _tg_node: TGNode
- _logger: DTSLOG
- def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
- self._config = config
- self._tg_node = tg_node
- self._logger = getLogger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+ def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ super().__init__(tg_node, config)
assert (
self._tg_node.config.os == OS.linux
@@ -229,7 +224,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
function_bytes = marshal.dumps(function.__code__)
self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
- def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
+ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
# load the source of the function
src = inspect.getsource(QuittableXMLRPCServer)
# Lines with only whitespace break the repl if in the middle of a function
@@ -271,7 +266,7 @@ def _send_packets_and_capture(
scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
return scapy_packets
- def close(self):
+ def close(self) -> None:
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
similarity index 81%
rename from dts/framework/testbed_model/traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 28c35d3ce4..0d9902ddb7 100644
--- a/dts/framework/testbed_model/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -12,11 +12,12 @@
from scapy.packet import Packet # type: ignore[import]
-from framework.logger import DTSLOG
+from framework.config import TrafficGeneratorConfig
+from framework.logger import DTSLOG, getLogger
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
from framework.utils import get_packet_summaries
-from .hw.port import Port
-
class TrafficGenerator(ABC):
"""The base traffic generator.
@@ -24,8 +25,15 @@ class TrafficGenerator(ABC):
Defines the few basic methods that each traffic generator must implement.
"""
+ _config: TrafficGeneratorConfig
+ _tg_node: Node
_logger: DTSLOG
+ def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ self._config = config
+ self._tg_node = tg_node
+ self._logger = getLogger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
+
def send_packet(self, packet: Packet, port: Port) -> None:
"""Send a packet and block until it is fully sent.
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index d098d364ff..a0f2173949 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -7,7 +7,6 @@
import json
import os
import subprocess
-import sys
from enum import Enum
from pathlib import Path
from subprocess import SubprocessError
@@ -16,31 +15,7 @@
from .exception import ConfigurationError
-
-class StrEnum(Enum):
- @staticmethod
- def _generate_next_value_(name: str, start: int, count: int, last_values: object) -> str:
- return name
-
- def __str__(self) -> str:
- return self.name
-
-
-REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
-
-
-def check_dts_python_version() -> None:
- if sys.version_info.major < 3 or (sys.version_info.major == 3 and sys.version_info.minor < 10):
- print(
- RED(
- (
- "WARNING: DTS execution node's python version is lower than"
- "python 3.10, is deprecated and will not work in future releases."
- )
- ),
- file=sys.stderr,
- )
- print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
def expand_range(range_str: str) -> list[int]:
@@ -61,7 +36,7 @@ def expand_range(range_str: str) -> list[int]:
return expanded_range
-def get_packet_summaries(packets: list[Packet]):
+def get_packet_summaries(packets: list[Packet]) -> str:
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -69,8 +44,13 @@ def get_packet_summaries(packets: list[Packet]):
return f"Packet contents: \n{packet_summaries}"
-def RED(text: str) -> str:
- return f"\u001B[31;1m{str(text)}\u001B[0m"
+class StrEnum(Enum):
+ @staticmethod
+ def _generate_next_value_(name: str, start: int, count: int, last_values: object) -> str:
+ return name
+
+ def __str__(self) -> str:
+ return self.name
class MesonArgs(object):
@@ -215,5 +195,5 @@ def _delete_tarball(self) -> None:
if self._tarball_path and os.path.exists(self._tarball_path):
os.remove(self._tarball_path)
- def __fspath__(self):
+ def __fspath__(self) -> str:
return str(self._tarball_path)
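The StrEnum helper that utils.py relocates above makes ``auto()`` produce the member's own name as its value, which is handy for enums that round-trip through config files. An illustrative usage (the ``Compiler`` enum is hypothetical, not part of DTS):

```python
from enum import Enum, auto


class StrEnum(Enum):
    @staticmethod
    def _generate_next_value_(name: str, start: int, count: int, last_values: object) -> str:
        # auto() yields the member name itself as the value.
        return name

    def __str__(self) -> str:
        return self.name


class Compiler(StrEnum):  # hypothetical example enum
    gcc = auto()
    clang = auto()


print(Compiler.gcc)        # gcc
print(Compiler.clang.value)  # clang
```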
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..5d4714b0c3 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,17 @@
import logging
-from framework import dts
+from framework import settings
def main() -> None:
+ """Set DTS settings, then run DTS.
+
+ The DTS settings are taken from the command line arguments and the environment variables.
+ """
+ settings.SETTINGS = settings.get_settings()
+ from framework import dts
+
dts.run_all()
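The deferred ``from framework import dts`` in main() above exists because modules under framework read SETTINGS at import time, so SETTINGS must be populated first. A minimal simulation of that ordering constraint, using throwaway module names rather than the real DTS modules:

```python
# Sketch: a consumer module reads SETTINGS at import time, so the
# settings module must be populated before the consumer is imported.
import sys
import types

settings = types.ModuleType("demo_settings")
settings.SETTINGS = None  # placeholder default, as in framework.settings
sys.modules["demo_settings"] = settings

consumer_src = (
    "import demo_settings\n"
    "TIMEOUT = demo_settings.SETTINGS['timeout']\n"
)

# Populate settings first (what main() does via settings.get_settings())...
settings.SETTINGS = {"timeout": 15}

# ...and only then "import" the consumer, which reads SETTINGS immediately.
consumer = types.ModuleType("demo_consumer")
exec(consumer_src, consumer.__dict__)
print(consumer.TIMEOUT)  # 15
```

Had the consumer been imported before SETTINGS was assigned, it would have captured the placeholder ``None`` instead.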
--
2.34.1
* [PATCH v9 02/21] dts: add docstring checker
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 01/21] dts: code adjustments for doc generation Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 03/21] dts: add basic developer docs Juraj Linkeš
` (19 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Python docstrings are the in-code way to document the code. The
docstring checker of choice is pydocstyle which we're executing from
Pylama, but the current latest versions are not complatible due to [0],
so pin the pydocstyle version to the latest working version.
[0] https://github.com/klen/pylama/issues/232
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 12 ++++++------
dts/pyproject.toml | 6 +++++-
2 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index f7b3b6d602..a734fa71f0 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -489,20 +489,20 @@ files = [
[[package]]
name = "pydocstyle"
-version = "6.3.0"
+version = "6.1.1"
description = "Python docstring style checker"
optional = false
python-versions = ">=3.6"
files = [
- {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
- {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+ {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
+ {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
]
[package.dependencies]
-snowballstemmer = ">=2.2.0"
+snowballstemmer = "*"
[package.extras]
-toml = ["tomli (>=1.2.3)"]
+toml = ["toml"]
[[package]]
name = "pyflakes"
@@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
+content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 980ac3c7db..37a692d655 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -25,6 +25,7 @@ PyYAML = "^6.0"
types-PyYAML = "^6.0.8"
fabric = "^2.7.1"
scapy = "^2.5.0"
+pydocstyle = "6.1.1"
[tool.poetry.group.dev.dependencies]
mypy = "^0.961"
@@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
[tool.pylama]
-linters = "mccabe,pycodestyle,pyflakes"
+linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
format = "pylint"
max_line_length = 100
+[tool.pylama.linter.pydocstyle]
+convention = "google"
+
[tool.mypy]
python_version = "3.10"
enable_error_code = ["ignore-without-code"]
--
2.34.1
* [PATCH v9 03/21] dts: add basic developer docs
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 01/21] dts: code adjustments for doc generation Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 02/21] dts: add docstring checker Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 04/21] dts: exceptions docstring update Juraj Linkeš
` (18 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Expand the framework contribution guidelines and add how to document the
code with Python docstrings.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
1 file changed, 73 insertions(+)
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 32c18ee472..cd771a428c 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -264,6 +264,65 @@ which be changed with the ``--output-dir`` command line argument.
The results contain basic statistics of passed/failed test cases and DPDK version.
+Contributing to DTS
+-------------------
+
+There are two areas of contribution: The DTS framework and DTS test suites.
+
+The framework contains the logic needed to run test cases, such as connecting to nodes,
+running DPDK apps and collecting results.
+
+The test cases call APIs from the framework to test their scenarios. Adding test cases may
+require adding code to the framework as well.
+
+
+Framework Coding Guidelines
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When adding code to the DTS framework, pay attention to the rest of the code
+and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
+warnings when some of the basics are not met.
+
+The code must be properly documented with docstrings. The style must conform to
+the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
+See an example of the style
+`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
+For cases which are not covered by the Google style, refer
+to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
+the two style guides, where we deviate or where some additional clarification is helpful:
+
+ * The __init__() methods of classes are documented separately from the docstring of the class
+ itself.
+ * The docstrings of implemented abstract methods should refer to the superclass's definition
+ if there's no deviation.
+ * Instance variables/attributes should be documented in the docstring of the class
+ in the ``Attributes:`` section.
+ * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
+ attributes which result in instance variables/attributes should also be recorded
+ in the ``Attributes:`` section.
+ * Class variables/attributes, on the other hand, should be documented with ``#:`` above
+ the type annotated line. The description may be omitted if the meaning is obvious.
+ * The Enum and TypedDict also process the attributes in particular ways and should be documented
+ with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
+ * When referencing a parameter of a function or a method in their docstring, don't use
+ any articles and put the parameter into single backticks. This mimics the style of
+ `Python's documentation <https://docs.python.org/3/index.html>`_.
+ * When specifying a value, use double backticks::
+
+ def foo(greet: bool) -> None:
+ """Demonstration of single and double backticks.
+
+ `greet` controls whether ``Hello World`` is printed.
+
+ Args:
+ greet: Whether to print the ``Hello World`` message.
+ """
+ if greet:
+ print("Hello World")
+
+ * The docstring maximum line length is the same as the code maximum line length.
+
+
How To Write a Test Suite
-------------------------
@@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
| These methods don't need to be implemented if there's no need for them in a test suite.
In that case, nothing will happen when they're executed.
+#. **Configuration, traffic and other logic**
+
+ The ``TestSuite`` class contains a variety of methods for anything that
+ a test suite setup, a teardown, or a test case may need to do.
+
+ The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
+ and use the interactive shell instances directly.
+
+ These are the two main ways to call the framework logic in test suites. If there's any
+ functionality or logic missing from the framework, it should be implemented so that
+ the test suites can use one of these two ways.
+
#. **Test case verification**
Test case verification should be done with the ``verify`` method, which records the result.
@@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
and used by the test suite via the ``sut_node`` field.
+.. _dts_dev_tools:
+
DTS Developer Tools
-------------------
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 04/21] dts: exceptions docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (2 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 03/21] dts: add basic developer docs Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 05/21] dts: settings " Juraj Linkeš
` (17 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
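The severity scheme these docstrings describe — every exception carries an integer severity and DTS exits with the highest severity raised — can be sketched as follows; ``exit_code`` is a hypothetical helper for illustration, not part of the patch:

```python
from enum import IntEnum, unique
from typing import ClassVar


@unique
class ErrorSeverity(IntEnum):
    """A subset of the severities defined in the patch."""

    NO_ERR = 0
    GENERIC_ERR = 1
    TESTCASE_VERIFY_ERR = 20


class DTSError(Exception):
    """The base exception; only subclasses should be raised."""

    #:
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR


class TestCaseVerifyError(DTSError):
    """A test case failure."""

    #:
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR


def exit_code(errors: list[DTSError]) -> int:
    # The run's exit code is the highest severity among all caught errors.
    return max((e.severity for e in errors), default=ErrorSeverity.NO_ERR)
```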
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/__init__.py | 12 ++++-
dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
2 files changed, 83 insertions(+), 35 deletions(-)
diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
index d551ad4bf0..662e6ccad2 100644
--- a/dts/framework/__init__.py
+++ b/dts/framework/__init__.py
@@ -1,3 +1,13 @@
# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+
+"""Libraries and utilities for running DPDK Test Suite (DTS).
+
+The various modules in the DTS framework offer:
+
+* Connections to nodes, both interactive and non-interactive,
+* A straightforward way to add support for different operating systems of remote nodes,
+* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
+* Pre-test suite setup and post-test suite teardown.
+"""
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 151e4d3aa9..658eee2c38 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -3,8 +3,10 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-User-defined exceptions used across the framework.
+"""DTS exceptions.
+
+The exceptions all have different severities expressed as an integer.
+The highest severity of all raised exceptions is used as the exit code of DTS.
"""
from enum import IntEnum, unique
@@ -13,59 +15,79 @@
@unique
class ErrorSeverity(IntEnum):
- """
- The severity of errors that occur during DTS execution.
+ """The severity of errors that occur during DTS execution.
+
All exceptions are caught and the most severe error is used as return code.
"""
+ #:
NO_ERR = 0
+ #:
GENERIC_ERR = 1
+ #:
CONFIG_ERR = 2
+ #:
REMOTE_CMD_EXEC_ERR = 3
+ #:
SSH_ERR = 4
+ #:
DPDK_BUILD_ERR = 10
+ #:
TESTCASE_VERIFY_ERR = 20
+ #:
BLOCKING_TESTSUITE_ERR = 25
class DTSError(Exception):
- """
- The base exception from which all DTS exceptions are derived.
- Stores error severity.
+ """The base exception from which all DTS exceptions are subclassed.
+
+ Do not use this exception, only use subclassed exceptions.
"""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
class SSHTimeoutError(DTSError):
- """
- Command execution timeout.
- """
+ """The SSH execution of a command timed out."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_command: str
def __init__(self, command: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ command: The executed command.
+ """
self._command = command
def __str__(self) -> str:
- return f"TIMEOUT on {self._command}"
+ """Add some context to the string representation."""
+ return f"{self._command} execution timed out."
class SSHConnectionError(DTSError):
- """
- SSH connection error.
- """
+ """An unsuccessful SSH connection."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
_errors: list[str]
def __init__(self, host: str, errors: list[str] | None = None):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ host: The hostname to which we're trying to connect.
+ errors: Any errors that occurred during the connection attempt.
+ """
self._host = host
self._errors = [] if errors is None else errors
def __str__(self) -> str:
+ """Include the errors in the string representation."""
message = f"Error trying to connect with {self._host}."
if self._errors:
message += f" Errors encountered while retrying: {', '.join(self._errors)}"
@@ -74,76 +96,92 @@ def __str__(self) -> str:
class SSHSessionDeadError(DTSError):
- """
- SSH session is not alive.
- It can no longer be used.
- """
+ """The SSH session is no longer alive."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
_host: str
def __init__(self, host: str):
+ """Define the meaning of the first argument.
+
+ Args:
+ host: The hostname of the disconnected node.
+ """
self._host = host
def __str__(self) -> str:
- return f"SSH session with {self._host} has died"
+ """Add some context to the string representation."""
+ return f"SSH session with {self._host} has died."
class ConfigurationError(DTSError):
- """
- Raised when an invalid configuration is encountered.
- """
+ """An invalid configuration."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
class RemoteCommandExecutionError(DTSError):
- """
- Raised when a command executed on a Node returns a non-zero exit status.
- """
+ """An unsuccessful execution of a remote command."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+ #: The executed command.
command: str
_command_return_code: int
def __init__(self, command: str, command_return_code: int):
+ """Define the meaning of the first two arguments.
+
+ Args:
+ command: The executed command.
+ command_return_code: The return code of the executed command.
+ """
self.command = command
self._command_return_code = command_return_code
def __str__(self) -> str:
+ """Include both the command and return code in the string representation."""
return f"Command {self.command} returned a non-zero exit code: {self._command_return_code}"
class RemoteDirectoryExistsError(DTSError):
- """
- Raised when a remote directory to be created already exists.
- """
+ """A directory that exists on a remote node."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
class DPDKBuildError(DTSError):
- """
- Raised when DPDK build fails for any reason.
- """
+ """A DPDK build failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
class TestCaseVerifyError(DTSError):
- """
- Used in test cases to verify the expected behavior.
- """
+ """A test case failure."""
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
class BlockingTestSuiteError(DTSError):
+ """A failure in a blocking test suite."""
+
+ #:
severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
_suite_name: str
def __init__(self, suite_name: str) -> None:
+ """Define the meaning of the first argument.
+
+ Args:
+ suite_name: The blocking test suite.
+ """
self._suite_name = suite_name
def __str__(self) -> str:
+ """Add some context to the string representation."""
return f"Blocking suite {self._suite_name} failed."
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 05/21] dts: settings docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (3 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 04/21] dts: exceptions docstring update Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 06/21] dts: logger and utils " Juraj Linkeš
` (16 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
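The precedence the new module docstring documents (command line value > environment variable > default) can be sketched with plain argparse; ``_env_default`` is a hypothetical simplification of the framework's ``_env_arg`` action:

```python
import argparse
import os


def _env_default(env_var: str, fallback: str) -> str:
    # The environment variable overrides the hard-coded default;
    # a command line value, when given, overrides both.
    return os.environ.get(env_var, fallback)


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--output-dir",
        default=_env_default("DTS_OUTPUT_DIR", "output"),
        help="[DTS_OUTPUT_DIR] Output directory where logs and results are saved.",
    )
    return parser
```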
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/settings.py | 103 +++++++++++++++++++++++++++++++++++++-
1 file changed, 102 insertions(+), 1 deletion(-)
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 25b5dcff22..41f98e8519 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -3,6 +3,72 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
+"""Environment variables and command line arguments parsing.
+
+This is a simple module utilizing the built-in argparse module to parse command line arguments,
+augment them with values from environment variables and make them available across the framework.
+
+The command line value takes precedence, followed by the environment variable value,
+followed by the default value defined in this module.
+
+The command line arguments along with the supported environment variables are:
+
+.. option:: --config-file
+.. envvar:: DTS_CFG_FILE
+
+ The path to the YAML test run configuration file.
+
+.. option:: --output-dir, --output
+.. envvar:: DTS_OUTPUT_DIR
+
+ The directory where DTS logs and results are saved.
+
+.. option:: --compile-timeout
+.. envvar:: DTS_COMPILE_TIMEOUT
+
+ The timeout for compiling DPDK.
+
+.. option:: -t, --timeout
+.. envvar:: DTS_TIMEOUT
+
+ The timeout for all DTS operations except for compiling DPDK.
+
+.. option:: -v, --verbose
+.. envvar:: DTS_VERBOSE
+
+ Set to any value to enable logging everything to the console.
+
+.. option:: -s, --skip-setup
+.. envvar:: DTS_SKIP_SETUP
+
+ Set to any value to skip building DPDK.
+
+.. option:: --tarball, --snapshot, --git-ref
+.. envvar:: DTS_DPDK_TARBALL
+
+ The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
+
+.. option:: --test-cases
+.. envvar:: DTS_TESTCASES
+
+ A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
+
+.. option:: --re-run, --re_run
+.. envvar:: DTS_RERUN
+
+ Re-run each test case this many times in case of a failure.
+
+The module provides one key module-level variable:
+
+Attributes:
+ SETTINGS: The module level variable storing framework-wide DTS settings.
+
+Typical usage example::
+
+ from framework.settings import SETTINGS
+ foo = SETTINGS.foo
+"""
+
import argparse
import os
from collections.abc import Callable, Iterable, Sequence
@@ -16,6 +82,23 @@
def _env_arg(env_var: str) -> Any:
+ """A helper function augmenting the argparse Action with environment variables.
+
+ If the supplied environment variable is defined, then the default value
+ of the argument is modified. This satisfies the priority order of
+ command line argument > environment variable > default value.
+
+ Arguments with no values (flags) should be defined using the const keyword argument
+ (True or False). When the argument is specified, it will be set to const, if not specified,
+ the default will be stored (possibly modified by the corresponding environment variable).
+
+ Other arguments work the same as default argparse arguments, that is using
+ the default 'store' action.
+
+ Returns:
+ The modified argparse.Action.
+ """
+
class _EnvironmentArgument(argparse.Action):
def __init__(
self,
@@ -68,14 +151,28 @@ def __call__(
@dataclass(slots=True)
class Settings:
+ """Default framework-wide user settings.
+
+ The defaults may be modified at the start of the run.
+ """
+
+ #:
config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+ #:
output_dir: str = "output"
+ #:
timeout: float = 15
+ #:
verbose: bool = False
+ #:
skip_setup: bool = False
+ #:
dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+ #:
compile_timeout: float = 1200
+ #:
test_cases: list[str] = field(default_factory=list)
+ #:
re_run: int = 0
@@ -166,7 +263,7 @@ def _get_parser() -> argparse.ArgumentParser:
action=_env_arg("DTS_RERUN"),
default=SETTINGS.re_run,
type=int,
- help="[DTS_RERUN] Re-run each test case the specified amount of times "
+ help="[DTS_RERUN] Re-run each test case the specified number of times "
"if a test failure occurs",
)
@@ -174,6 +271,10 @@ def _get_parser() -> argparse.ArgumentParser:
def get_settings() -> Settings:
+ """Create new settings with inputs from the user.
+
+ The inputs are taken from the command line and from environment variables.
+ """
parsed_args = _get_parser().parse_args()
return Settings(
config_file_path=parsed_args.config_file,
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 06/21] dts: logger and utils docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (4 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 05/21] dts: settings " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 07/21] dts: dts runner and main " Juraj Linkeš
` (15 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
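As an illustration of the contract now documented for ``expand_range``, here is a simplified reimplementation (a sketch, not the patch's exact code) covering both documented formats:

```python
def expand_range(range_str: str) -> list[int]:
    """Process `range_str` into a list of integers.

    Supports the two formats from the docstring:

    * ``n`` - a single integer,
    * ``n-m`` - an inclusive range of integers.
    """
    expanded: list[int] = []
    if range_str:
        boundaries = range_str.split("-")
        # A single number is treated as a range with identical boundaries.
        expanded.extend(range(int(boundaries[0]), int(boundaries[-1]) + 1))
    return expanded
```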
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/logger.py | 72 ++++++++++++++++++++++-----------
dts/framework/utils.py | 88 +++++++++++++++++++++++++++++------------
2 files changed, 113 insertions(+), 47 deletions(-)
diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index bb2991e994..cfa6e8cd72 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -3,9 +3,9 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-DTS logger module with several log level. DTS framework and TestSuite logs
-are saved in different log files.
+"""DTS logger module.
+
+DTS framework and TestSuite logs are saved in different log files.
"""
import logging
@@ -18,19 +18,21 @@
stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
-class LoggerDictType(TypedDict):
- logger: "DTSLOG"
- name: str
- node: str
-
+class DTSLOG(logging.LoggerAdapter):
+ """DTS logger adapter class for framework and testsuites.
-# List for saving all using loggers
-Loggers: list[LoggerDictType] = []
+ The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
+ variable control the verbosity of output. If enabled, all messages will be emitted to the
+ console.
+ The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+ variable modify the directory where the logs will be stored.
-class DTSLOG(logging.LoggerAdapter):
- """
- DTS log class for framework and testsuite.
+ Attributes:
+ node: The additional identifier. Currently unused.
+ sh: The handler which emits logs to console.
+ fh: The handler which emits logs to a file.
+ verbose_fh: Like fh, but logs with a different, more verbose format.
"""
_logger: logging.Logger
@@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
verbose_fh: logging.FileHandler
def __init__(self, logger: logging.Logger, node: str = "suite"):
+ """Extend the constructor with additional handlers.
+
+ One handler logs to the console, the other one to a file, with either a regular or verbose
+ format.
+
+ Args:
+ logger: The logger from which to create the logger adapter.
+ node: An additional identifier. Currently unused.
+ """
self._logger = logger
# 1 means log everything, this will be used by file handlers if their level
# is not set
@@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
def logger_exit(self) -> None:
- """
- Remove stream handler and logfile handler.
- """
+ """Remove the stream handler and the logfile handler."""
for handler in (self.sh, self.fh, self.verbose_fh):
handler.flush()
self._logger.removeHandler(handler)
+class _LoggerDictType(TypedDict):
+ logger: DTSLOG
+ name: str
+ node: str
+
+
+# List for saving all loggers in use
+_Loggers: list[_LoggerDictType] = []
+
+
def getLogger(name: str, node: str = "suite") -> DTSLOG:
+ """Get DTS logger adapter identified by name and node.
+
+ An existing logger will be returned if one with the exact name and node already exists.
+ A new one will be created and stored otherwise.
+
+ Args:
+ name: The name of the logger.
+ node: An additional identifier for the logger.
+
+ Returns:
+ A logger uniquely identified by both name and node.
"""
- Get logger handler and if there's no handler for specified Node will create one.
- """
- global Loggers
+ global _Loggers
# return saved logger
- logger: LoggerDictType
- for logger in Loggers:
+ logger: _LoggerDictType
+ for logger in _Loggers:
if logger["name"] == name and logger["node"] == node:
return logger["logger"]
# return new logger
dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
- Loggers.append({"logger": dts_logger, "name": name, "node": node})
+ _Loggers.append({"logger": dts_logger, "name": name, "node": node})
return dts_logger
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index a0f2173949..cc5e458cc8 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -3,6 +3,16 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Various utility classes and functions.
+
+These are used in multiple modules across the framework. They're here because
+they provide some non-specific functionality, greatly simplify imports or just don't
+fit elsewhere.
+
+Attributes:
+ REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
+"""
+
import atexit
import json
import os
@@ -19,12 +29,20 @@
def expand_range(range_str: str) -> list[int]:
- """
- Process range string into a list of integers. There are two possible formats:
- n - a single integer
- n-m - a range of integers
+ """Process `range_str` into a list of integers.
+
+ There are two possible formats of `range_str`:
+
+ * ``n`` - a single integer,
+ * ``n-m`` - a range of integers.
- The returned range includes both n and m. Empty string returns an empty list.
+ The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
+
+ Args:
+ range_str: The range to expand.
+
+ Returns:
+ All the numbers from the range.
"""
expanded_range: list[int] = []
if range_str:
@@ -37,6 +55,14 @@ def expand_range(range_str: str) -> list[int]:
def get_packet_summaries(packets: list[Packet]) -> str:
+ """Format a string summary from `packets`.
+
+ Args:
+ packets: The packets to format.
+
+ Returns:
+ The summary of `packets`.
+ """
if len(packets) == 1:
packet_summaries = packets[0].summary()
else:
@@ -45,27 +71,36 @@ def get_packet_summaries(packets: list[Packet]) -> str:
class StrEnum(Enum):
+ """Enum with members stored as strings."""
+
@staticmethod
def _generate_next_value_(name: str, start: int, count: int, last_values: object) -> str:
return name
def __str__(self) -> str:
+ """The string representation is the name of the member."""
return self.name
class MesonArgs(object):
- """
- Aggregate the arguments needed to build DPDK:
- default_library: Default library type, Meson allows "shared", "static" and "both".
- Defaults to None, in which case the argument won't be used.
- Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
- Do not use -D with them, for example:
- meson_args = MesonArgs(enable_kmods=True).
- """
+ """Aggregate the arguments needed to build DPDK."""
_default_library: str
def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
+ """Initialize the meson arguments.
+
+ Args:
+ default_library: The default library type, Meson supports ``shared``, ``static`` and
+ ``both``. Defaults to :data:`None`, in which case the argument won't be used.
+ dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Example:
+ ::
+
+ meson_args = MesonArgs(enable_kmods=True)
+ """
self._default_library = f"--default-library={default_library}" if default_library else ""
self._dpdk_args = " ".join(
(
@@ -75,6 +110,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
)
def __str__(self) -> str:
+ """The actual args."""
return " ".join(f"{self._default_library} {self._dpdk_args}".split())
@@ -96,24 +132,14 @@ class _TarCompressionFormat(StrEnum):
class DPDKGitTarball(object):
- """Create a compressed tarball of DPDK from the repository.
-
- The DPDK version is specified with git object git_ref.
- The tarball will be compressed with _TarCompressionFormat,
- which must be supported by the DTS execution environment.
- The resulting tarball will be put into output_dir.
+ """Compressed tarball of DPDK from the repository.
- The class supports the os.PathLike protocol,
+ The class supports the :class:`os.PathLike` protocol,
which is used to get the Path of the tarball::
from pathlib import Path
tarball = DPDKGitTarball("HEAD", "output")
tarball_path = Path(tarball)
-
- Arguments:
- git_ref: A git commit ID, tag ID or tree ID.
- output_dir: The directory where to put the resulting tarball.
- tar_compression_format: The compression format to use.
"""
_git_ref: str
@@ -128,6 +154,17 @@ def __init__(
output_dir: str,
tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
):
+ """Create the tarball during initialization.
+
+ The DPDK version is specified with `git_ref`. The tarball will be compressed with
+ `tar_compression_format`, which must be supported by the DTS execution environment.
+ The resulting tarball will be put into `output_dir`.
+
+ Args:
+ git_ref: A git commit ID, tag ID or tree ID.
+ output_dir: The directory where to put the resulting tarball.
+ tar_compression_format: The compression format to use.
+ """
self._git_ref = git_ref
self._tar_compression_format = tar_compression_format
@@ -196,4 +233,5 @@ def _delete_tarball(self) -> None:
os.remove(self._tarball_path)
def __fspath__(self) -> str:
+ """The os.PathLike protocol implementation."""
return str(self._tarball_path)
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 07/21] dts: dts runner and main docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (5 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 06/21] dts: logger and utils " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 08/21] dts: test suite " Juraj Linkeš
` (14 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
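The staged flow this patch documents — an error aborts only the current stage and the run continues with the next iteration — can be sketched with plain loops; the names and the returned error list are hypothetical, as the real runner records results in ``DTSResult`` objects:

```python
def run_all(executions: list[dict]) -> list[str]:
    # Each test suite call is wrapped so a failure aborts only that suite;
    # the run then continues with the next suite, build target and execution.
    errors: list[str] = []
    for execution in executions:
        for build_target in execution["build_targets"]:
            for suite in execution["test_suites"]:
                try:
                    suite()
                except Exception as e:
                    errors.append(f"{build_target}: {e}")
    return errors
```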
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/dts.py | 131 ++++++++++++++++++++++++++++++++++++-------
dts/main.py | 10 ++--
2 files changed, 116 insertions(+), 25 deletions(-)
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 356368ef10..e16d4578a0 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,33 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+r"""Test suite runner module.
+
+A DTS run is split into stages:
+
+ #. Execution stage,
+ #. Build target stage,
+ #. Test suite stage,
+ #. Test case stage.
+
+The module is responsible for running tests on testbeds defined in the test run configuration.
+Each setup or teardown of each stage is recorded in a :class:`~.test_result.DTSResult` or
+one of its subclasses. The test case results are also recorded.
+
+If an error occurs, the current stage is aborted, the error is recorded and the run continues in
+the next iteration of the same stage. The return code is the highest `severity` of all
+:class:`~.exception.DTSError`\s.
+
+Example:
+ An error occurs in a build target setup. The current build target is aborted and the run
+ continues with the next build target. If the errored build target was the last one in the given
+ execution, the next execution begins.
+
+Attributes:
+ dts_logger: The logger instance used in this module.
+ result: The top level result used in the module.
+"""
+
import sys
from .config import (
@@ -23,9 +50,38 @@
def run_all() -> None:
- """
- The main process of DTS. Runs all build targets in all executions from the main
- config file.
+ """Run all build targets in all executions from the test run configuration.
+
+ Before running test suites, executions and build targets are first set up.
+ The executions and build targets defined in the test run configuration are iterated over.
+ The executions define which tests to run and where to run them and build targets define
+ the DPDK build setup.
+
+ The tests suites are set up for each execution/build target tuple and each scheduled
+ test case within the test suite is set up, executed and torn down. After all test cases
+ have been executed, the test suite is torn down and the next build target will be tested.
+
+ All the nested steps look like this:
+
+ #. Execution setup
+
+ #. Build target setup
+
+ #. Test suite setup
+
+ #. Test case setup
+ #. Test case logic
+ #. Test case teardown
+
+ #. Test suite teardown
+
+ #. Build target teardown
+
+ #. Execution teardown
+
+ The test cases are filtered according to the specification in the test run configuration and
+ the :option:`--test-cases` command line argument or
+ the :envvar:`DTS_TESTCASES` environment variable.
"""
global dts_logger
global result
@@ -87,6 +143,8 @@ def run_all() -> None:
def _check_dts_python_version() -> None:
+ """Check the required Python version - v3.10."""
+
def RED(text: str) -> str:
return f"\u001B[31;1m{str(text)}\u001B[0m"
@@ -109,9 +167,16 @@ def _run_execution(
execution: ExecutionConfiguration,
result: DTSResult,
) -> None:
- """
- Run the given execution. This involves running the execution setup as well as
- running all build targets in the given execution.
+ """Run the given execution.
+
+ This involves running the execution setup as well as running all build targets
+ in the given execution. After that, execution teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: An execution's test run configuration.
+ result: The top level result object.
"""
dts_logger.info(f"Running execution with SUT '{execution.system_under_test_node.name}'.")
execution_result = result.add_execution(sut_node.config)
@@ -144,8 +209,18 @@ def _run_build_target(
execution: ExecutionConfiguration,
execution_result: ExecutionResult,
) -> None:
- """
- Run the given build target.
+ """Run the given build target.
+
+ This involves running the build target setup as well as running all test suites
+ in the given execution the build target is defined in.
+ After that, build target teardown is run.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ build_target: A build target's test run configuration.
+ execution: The build target's execution's test run configuration.
+ execution_result: The execution level result object associated with the execution.
"""
dts_logger.info(f"Running build target '{build_target.name}'.")
build_target_result = execution_result.add_build_target(build_target)
@@ -177,10 +252,20 @@ def _run_all_suites(
execution: ExecutionConfiguration,
build_target_result: BuildTargetResult,
) -> None:
- """
- Use the given build_target to run execution's test suites
- with possibly only a subset of test cases.
- If no subset is specified, run all test cases.
+ """Run the execution's (possibly a subset) test suites using the current build target.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
+
+ If a blocking test suite (such as the smoke test suite) fails, the rest of the test suites
+ in the current build target won't be executed.
+
+ Args:
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
"""
end_build_target = False
if not execution.skip_smoke_tests:
@@ -206,16 +291,22 @@ def _run_single_suite(
build_target_result: BuildTargetResult,
test_suite_config: TestSuiteConfig,
) -> None:
- """Runs a single test suite.
+ """Run all test suite in a single test suite module.
+
+ The function assumes the build target we're testing has already been built on the SUT node.
+ The current build target thus corresponds to the current DPDK build present on the SUT node.
Args:
- sut_node: Node to run tests on.
- execution: Execution the test case belongs to.
- build_target_result: Build target configuration test case is run on
- test_suite_config: Test suite configuration
+ sut_node: The execution's SUT node.
+ tg_node: The execution's TG node.
+ execution: The execution's test run configuration associated with the current build target.
+ build_target_result: The build target level result object associated
+ with the current build target.
+ test_suite_config: Test suite test run configuration specifying the test suite module
+ and possibly a subset of test cases of test suites in that module.
Raises:
- BlockingTestSuiteError: If a test suite that was marked as blocking fails.
+ BlockingTestSuiteError: If a blocking test suite fails.
"""
try:
full_suite_path = f"tests.TestSuite_{test_suite_config.test_suite}"
@@ -239,9 +330,7 @@ def _run_single_suite(
def _exit_dts() -> None:
- """
- Process all errors and exit with the proper exit code.
- """
+ """Process all errors and exit with the proper exit code."""
result.process()
if dts_logger:
diff --git a/dts/main.py b/dts/main.py
index 5d4714b0c3..b856ba86be 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -1,12 +1,10 @@
#!/usr/bin/env python3
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022 University of New Hampshire
-"""
-A test framework for testing DPDK.
-"""
+"""The DTS executable."""
import logging
@@ -17,6 +15,10 @@ def main() -> None:
"""Set DTS settings, then run DTS.
The DTS settings are taken from the command line arguments and the environment variables.
+ The settings object is stored in the module-level variable settings.SETTINGS which the entire
+ framework uses. After importing the module (or the variable), any changes to the variable are
+ not going to be reflected without a re-import. This means that the SETTINGS variable must
+ be modified before the settings module is imported anywhere else in the framework.
"""
settings.SETTINGS = settings.get_settings()
from framework import dts
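
[Editorial note, not part of the patch: the docstring above warns that rebinding the module-level variable is invisible to code that imported the name directly. A minimal, self-contained sketch of that Python import semantic, using a stand-in module rather than the real DTS `settings`:]

```python
import types

# A stand-in for the framework's settings module (hypothetical, for illustration).
settings = types.ModuleType("settings")
settings.SETTINGS = "defaults"

# 'from settings import SETTINGS' binds the value at import time...
snapshot = settings.SETTINGS

# ...so a later reassignment of the module attribute is not seen
# through that earlier binding,
settings.SETTINGS = "from-cli"
assert snapshot == "defaults"

# while access through the module attribute always sees the update.
assert settings.SETTINGS == "from-cli"
```

This is why the patch sets `settings.SETTINGS` before importing the rest of the framework.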
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 08/21] dts: test suite docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (6 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 07/21] dts: dts runner and main " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 09/21] dts: test result " Juraj Linkeš
` (13 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
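
[Editorial note, not part of the patch: for readers unfamiliar with the target style, a hedged sketch of what a Google-format, PEP 257-compliant docstring looks like. The function and its names are invented for illustration only:]

```python
def send(packet: bytes, duration: float = 1.0) -> list[bytes]:
    """Send a packet and capture the response.

    The summary line is imperative and ends with a period; a longer
    description may follow after a blank line.

    Args:
        packet: The packet to send.
        duration: How long to capture traffic after sending.

    Returns:
        The list of received packets.

    Raises:
        ValueError: If `duration` is negative.
    """
    if duration < 0:
        raise ValueError("duration must be non-negative")
    # Echo the packet back as a trivial stand-in for real capture logic.
    return [packet]
```

Sphinx's napoleon extension parses the `Args:`/`Returns:`/`Raises:` sections into structured API documentation.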
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_suite.py | 231 +++++++++++++++++++++++++++---------
1 file changed, 175 insertions(+), 56 deletions(-)
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index f9e66e814a..dfb391ffbd 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -2,8 +2,19 @@
# Copyright(c) 2010-2014 Intel Corporation
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Base class for creating DTS test cases.
+"""Features common to all test suites.
+
+The module defines the :class:`TestSuite` class which doesn't contain any test cases, and as such
+must be extended by subclasses which add test cases. The :class:`TestSuite` contains the basics
+needed by subclasses:
+
+ * Test suite and test case execution flow,
+ * Testbed (SUT, TG) configuration,
+ * Packet sending and verification,
+ * Test case verification.
+
+The module also defines a function, :func:`get_test_suites`,
+for gathering test suites from a Python module.
"""
import importlib
@@ -11,7 +22,7 @@
import re
from ipaddress import IPv4Interface, IPv6Interface, ip_interface
from types import MethodType
-from typing import Any, Union
+from typing import Any, ClassVar, Union
from scapy.layers.inet import IP # type: ignore[import]
from scapy.layers.l2 import Ether # type: ignore[import]
@@ -31,25 +42,44 @@
class TestSuite(object):
- """
- The base TestSuite class provides methods for handling basic flow of a test suite:
- * test case filtering and collection
- * test suite setup/cleanup
- * test setup/cleanup
- * test case execution
- * error handling and results storage
- Test cases are implemented by derived classes. Test cases are all methods
- starting with test_, further divided into performance test cases
- (starting with test_perf_) and functional test cases (all other test cases).
- By default, all test cases will be executed. A list of testcase str names
- may be specified in conf.yaml or on the command line
- to filter which test cases to run.
- The methods named [set_up|tear_down]_[suite|test_case] should be overridden
- in derived classes if the appropriate suite/test case fixtures are needed.
+ """The base class with methods for handling the basic flow of a test suite.
+
+ * Test case filtering and collection,
+ * Test suite setup/cleanup,
+ * Test setup/cleanup,
+ * Test case execution,
+ * Error handling and results storage.
+
+ Test cases are implemented by subclasses. Test cases are all methods starting with ``test_``,
+ further divided into performance test cases (starting with ``test_perf_``)
+ and functional test cases (all other test cases).
+
+ By default, all test cases will be executed. A list of test case names may be specified
+ in the YAML test run configuration file and in the :option:`--test-cases` command line argument
+ or in the :envvar:`DTS_TESTCASES` environment variable to filter which test cases to run.
+ The union of both lists will be used. Any unknown test cases from the latter lists
+ will be silently ignored.
+
+ If the :option:`--re-run` command line argument or the :envvar:`DTS_RERUN` environment variable
+ is set, in case of a test case failure, the test case will be executed again until it passes
+ or it fails that many times in addition to the first failure.
+
+ The methods named ``[set_up|tear_down]_[suite|test_case]`` should be overridden in subclasses
+ if the appropriate test suite/test case fixtures are needed.
+
+ The test suite is aware of the testbed (the SUT and TG) it's running on. From this, it can
+ properly choose the IP addresses and other configuration that must be tailored to the testbed.
+
+ Attributes:
+ sut_node: The SUT node where the test suite is running.
+ tg_node: The TG node where the test suite is running.
"""
sut_node: SutNode
- is_blocking = False
+ tg_node: TGNode
+ #: Whether the test suite is blocking. A failure of a blocking test suite
+ #: will block the execution of all subsequent test suites in the current build target.
+ is_blocking: ClassVar[bool] = False
_logger: DTSLOG
_test_cases_to_run: list[str]
_func: bool
@@ -72,6 +102,20 @@ def __init__(
func: bool,
build_target_result: BuildTargetResult,
):
+ """Initialize the test suite testbed information and basic configuration.
+
+ Process what test cases to run, create the associated
+ :class:`~.test_result.TestSuiteResult`, find links between ports
+ and set up default IP addresses to be used when configuring them.
+
+ Args:
+ sut_node: The SUT node where the test suite will run.
+ tg_node: The TG node where the test suite will run.
+ test_cases: The list of test cases to execute.
+ If empty, all test cases will be executed.
+ func: Whether to run functional tests.
+ build_target_result: The build target result this test suite is run in.
+ """
self.sut_node = sut_node
self.tg_node = tg_node
self._logger = getLogger(self.__class__.__name__)
@@ -95,6 +139,7 @@ def __init__(
self._tg_ip_address_ingress = ip_interface("192.168.101.3/24")
def _process_links(self) -> None:
+ """Construct links between SUT and TG ports."""
for sut_port in self.sut_node.ports:
for tg_port in self.tg_node.ports:
if (sut_port.identifier, sut_port.peer) == (
@@ -104,27 +149,42 @@ def _process_links(self) -> None:
self._port_links.append(PortLink(sut_port=sut_port, tg_port=tg_port))
def set_up_suite(self) -> None:
- """
- Set up test fixtures common to all test cases; this is done before
- any test case is run.
+ """Set up test fixtures common to all test cases.
+
+ This is done before any test case has been run.
"""
def tear_down_suite(self) -> None:
- """
- Tear down the previously created test fixtures common to all test cases.
+ """Tear down the previously created test fixtures common to all test cases.
+
+ This is done after all tests have been run.
"""
def set_up_test_case(self) -> None:
- """
- Set up test fixtures before each test case.
+ """Set up test fixtures before each test case.
+
+ This is done before *each* test case.
"""
def tear_down_test_case(self) -> None:
- """
- Tear down the previously created test fixtures after each test case.
+ """Tear down the previously created test fixtures after each test case.
+
+ This is done after *each* test case.
"""
def configure_testbed_ipv4(self, restore: bool = False) -> None:
+ """Configure IPv4 addresses on all testbed ports.
+
+ The configured ports are:
+
+ * SUT ingress port,
+ * SUT egress port,
+ * TG ingress port,
+ * TG egress port.
+
+ Args:
+ restore: If :data:`True`, will remove the configuration instead.
+ """
delete = True if restore else False
enable = False if restore else True
self._configure_ipv4_forwarding(enable)
@@ -149,11 +209,17 @@ def _configure_ipv4_forwarding(self, enable: bool) -> None:
self.sut_node.configure_ipv4_forwarding(enable)
def send_packet_and_capture(self, packet: Packet, duration: float = 1) -> list[Packet]:
- """
- Send a packet through the appropriate interface and
- receive on the appropriate interface.
- Modify the packet with l3/l2 addresses corresponding
- to the testbed and desired traffic.
+ """Send and receive `packet` using the associated TG.
+
+ Send `packet` through the appropriate interface and receive on the appropriate interface.
+ Modify the packet with l3/l2 addresses corresponding to the testbed and desired traffic.
+
+ Args:
+ packet: The packet to send.
+ duration: Capture traffic for this amount of time after sending `packet`.
+
+ Returns:
+ A list of received packets.
"""
packet = self._adjust_addresses(packet)
return self.tg_node.send_packet_and_capture(
@@ -161,13 +227,26 @@ def send_packet_and_capture(self, packet: Packet, duration: float = 1) -> list[P
)
def get_expected_packet(self, packet: Packet) -> Packet:
+ """Inject the proper L2/L3 addresses into `packet`.
+
+ Args:
+ packet: The packet to modify.
+
+ Returns:
+ `packet` with injected L2/L3 addresses.
+ """
return self._adjust_addresses(packet, expected=True)
def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
- """
+ """L2 and L3 address additions in both directions.
+
Assumptions:
- Two links between SUT and TG, one link is TG -> SUT,
- the other SUT -> TG.
+ Two links between SUT and TG, one link is TG -> SUT, the other SUT -> TG.
+
+ Args:
+ packet: The packet to modify.
+ expected: If :data:`True`, the direction is SUT -> TG,
+ otherwise the direction is TG -> SUT.
"""
if expected:
# The packet enters the TG from SUT
@@ -193,6 +272,19 @@ def _adjust_addresses(self, packet: Packet, expected: bool = False) -> Packet:
return Ether(packet.build())
def verify(self, condition: bool, failure_description: str) -> None:
+ """Verify `condition` and handle failures.
+
+ When `condition` is :data:`False`, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ condition: The condition to check.
+ failure_description: A short description of the failure
+ that will be stored in the raised exception.
+
+ Raises:
+ TestCaseVerifyError: `condition` is :data:`False`.
+ """
if not condition:
self._fail_test_case_verify(failure_description)
@@ -206,6 +298,19 @@ def _fail_test_case_verify(self, failure_description: str) -> None:
raise TestCaseVerifyError(failure_description)
def verify_packets(self, expected_packet: Packet, received_packets: list[Packet]) -> None:
+ """Verify that `expected_packet` has been received.
+
+ Go through `received_packets` and check that `expected_packet` is among them.
+ If not, raise an exception and log the last 10 commands
+ executed on both the SUT and TG.
+
+ Args:
+ expected_packet: The packet we're expecting to receive.
+ received_packets: The packets where we're looking for `expected_packet`.
+
+ Raises:
+ TestCaseVerifyError: `expected_packet` is not among `received_packets`.
+ """
for received_packet in received_packets:
if self._compare_packets(expected_packet, received_packet):
break
@@ -280,10 +385,14 @@ def _verify_l3_packet(self, received_packet: IP, expected_packet: IP) -> bool:
return True
def run(self) -> None:
- """
- Setup, execute and teardown the whole suite.
- Suite execution consists of running all test cases scheduled to be executed.
- A test cast run consists of setup, execution and teardown of said test case.
+ """Set up, execute and tear down the whole suite.
+
+ Test suite execution consists of running all test cases scheduled to be executed.
+ A test case run consists of setup, execution and teardown of said test case.
+
+ Record the setup and the teardown and handle failures.
+
+ The list of scheduled test cases is constructed when creating the :class:`TestSuite` object.
"""
test_suite_name = self.__class__.__name__
@@ -315,9 +424,7 @@ def run(self) -> None:
raise BlockingTestSuiteError(test_suite_name)
def _execute_test_suite(self) -> None:
- """
- Execute all test cases scheduled to be executed in this suite.
- """
+ """Execute all test cases scheduled to be executed in this suite."""
if self._func:
for test_case_method in self._get_functional_test_cases():
test_case_name = test_case_method.__name__
@@ -334,14 +441,18 @@ def _execute_test_suite(self) -> None:
self._run_test_case(test_case_method, test_case_result)
def _get_functional_test_cases(self) -> list[MethodType]:
- """
- Get all functional test cases.
+ """Get all functional test cases defined in this TestSuite.
+
+ Returns:
+ The list of functional test cases of this TestSuite.
"""
return self._get_test_cases(r"test_(?!perf_)")
def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
- """
- Return a list of test cases matching test_case_regex.
+ """Return a list of test cases matching test_case_regex.
+
+ Returns:
+ The list of test cases matching test_case_regex of this TestSuite.
"""
self._logger.debug(f"Searching for test cases in {self.__class__.__name__}.")
filtered_test_cases = []
@@ -353,9 +464,7 @@ def _get_test_cases(self, test_case_regex: str) -> list[MethodType]:
return filtered_test_cases
def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool:
- """
- Check whether the test case should be executed.
- """
+ """Check whether the test case should be scheduled to be executed."""
match = bool(re.match(test_case_regex, test_case_name))
if self._test_cases_to_run:
return match and test_case_name in self._test_cases_to_run
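
[Editorial note, not part of the patch: the `test_(?!perf_)` pattern used above relies on a negative lookahead to separate functional from performance test cases. A small self-contained sketch of how that filtering behaves on method names:]

```python
import re

# Hypothetical method names as they might appear on a TestSuite subclass.
names = ["test_perf_throughput", "test_mtu", "test_checksum", "set_up_suite"]

# Functional: starts with test_ but NOT test_perf_ (negative lookahead).
functional = [n for n in names if re.match(r"test_(?!perf_)", n)]
# Performance: starts with test_perf_.
performance = [n for n in names if re.match(r"test_perf_", n)]

assert functional == ["test_mtu", "test_checksum"]
assert performance == ["test_perf_throughput"]
```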
@@ -365,9 +474,9 @@ def _should_be_executed(self, test_case_name: str, test_case_regex: str) -> bool
def _run_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Setup, execute and teardown a test case in this suite.
- Exceptions are caught and recorded in logs and results.
+ """Setup, execute and teardown a test case in this suite.
+
+ Record the result of the setup and the teardown and handle failures.
"""
test_case_name = test_case_method.__name__
@@ -402,9 +511,7 @@ def _run_test_case(
def _execute_test_case(
self, test_case_method: MethodType, test_case_result: TestCaseResult
) -> None:
- """
- Execute one test case and handle failures.
- """
+ """Execute one test case, record the result and handle failures."""
test_case_name = test_case_method.__name__
try:
self._logger.info(f"Starting test case execution: {test_case_name}")
@@ -425,6 +532,18 @@ def _execute_test_case(
def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
+ r"""Find all :class:`TestSuite`\s in a Python module.
+
+ Args:
+ testsuite_module_path: The path to the Python module.
+
+ Returns:
+ The list of :class:`TestSuite`\s found within the Python module.
+
+ Raises:
+ ConfigurationError: The test suite module was not found.
+ """
+
def is_test_suite(object: Any) -> bool:
try:
if issubclass(object, TestSuite) and object is not TestSuite:
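
[Editorial note, not part of the patch: `get_test_suites` discovers `TestSuite` subclasses in an imported module; the `try` block above guards `issubclass` against non-class objects. A simplified sketch of the discovery logic, scanning a plain dict instead of a real module:]

```python
class TestSuite:
    """Stand-in for the framework base class."""


class TestSuiteHelloWorld(TestSuite):
    """A hypothetical concrete suite."""


def is_test_suite(obj) -> bool:
    # A strict subclass of TestSuite; issubclass raises TypeError
    # for non-class objects, which we treat as "not a suite".
    try:
        return issubclass(obj, TestSuite) and obj is not TestSuite
    except TypeError:
        return False


# The real code inspects an imported module's attributes; a dict of
# names stands in for that namespace here.
namespace = {"TestSuite": TestSuite, "TestSuiteHelloWorld": TestSuiteHelloWorld, "x": 42}
found = [obj for obj in namespace.values() if is_test_suite(obj)]
assert found == [TestSuiteHelloWorld]
```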
--
2.34.1
* [PATCH v9 09/21] dts: test result docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (7 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 08/21] dts: test suite " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 10/21] dts: config " Juraj Linkeš
` (12 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/test_result.py | 297 ++++++++++++++++++++++++++++-------
1 file changed, 239 insertions(+), 58 deletions(-)
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index 57090feb04..4467749a9d 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -2,8 +2,25 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-Generic result container and reporters
+r"""Record and process DTS results.
+
+The results are recorded in a hierarchical manner:
+
+ * :class:`DTSResult` contains
+ * :class:`ExecutionResult` contains
+ * :class:`BuildTargetResult` contains
+ * :class:`TestSuiteResult` contains
+ * :class:`TestCaseResult`
+
+Each result may contain multiple lower level results, e.g. there are multiple
+:class:`TestSuiteResult`\s in a :class:`BuildTargetResult`.
+The results have common parts, such as setup and teardown results, captured in :class:`BaseResult`,
+which also defines some common behaviors in its methods.
+
+Each result class has its own idiosyncrasies which they implement in overridden methods.
+
+The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+variable modify the directory where the files with results will be stored.
"""
import os.path
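
[Editorial note, not part of the patch: the docstring above describes results nested hierarchically, with errors collated recursively from inner results. A stripped-down sketch of that containment pattern; the real classes additionally track setup/teardown results:]

```python
class BaseResult:
    """Simplified common container: inner results are walked recursively."""

    def __init__(self):
        self._inner_results = []
        self._errors = []

    def add_error(self, error: Exception) -> None:
        self._errors.append(error)

    def get_errors(self) -> list[Exception]:
        # Own errors first, then everything from the levels below.
        errors = list(self._errors)
        for inner in self._inner_results:
            errors.extend(inner.get_errors())
        return errors


# Build a tiny DTSResult -> ExecutionResult -> BuildTargetResult chain.
dts_result = BaseResult()
execution_result = BaseResult()
build_target_result = BaseResult()
dts_result._inner_results.append(execution_result)
execution_result._inner_results.append(build_target_result)

build_target_result.add_error(RuntimeError("suite failed"))
assert len(dts_result.get_errors()) == 1
```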
@@ -26,26 +43,34 @@
class Result(Enum):
- """
- An Enum defining the possible states that
- a setup, a teardown or a test case may end up in.
- """
+ """The possible states that a setup, a teardown or a test case may end up in."""
+ #:
PASS = auto()
+ #:
FAIL = auto()
+ #:
ERROR = auto()
+ #:
SKIP = auto()
def __bool__(self) -> bool:
+ """Only PASS is True."""
return self is self.PASS
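
[Editorial note, not part of the patch: a runnable sketch of the `Result` enum above, showing the effect of the `__bool__` override — only `PASS` is truthy, so results can be checked directly in conditions:]

```python
from enum import Enum, auto


class Result(Enum):
    """Possible end states of a setup, a teardown or a test case."""

    PASS = auto()
    FAIL = auto()
    ERROR = auto()
    SKIP = auto()

    def __bool__(self) -> bool:
        # Only PASS is True; every other state is falsy.
        return self is Result.PASS


assert bool(Result.PASS)
assert not bool(Result.FAIL) and not bool(Result.SKIP)
```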
class FixtureResult(object):
- """
- A record that stored the result of a setup or a teardown.
- The default is FAIL because immediately after creating the object
- the setup of the corresponding stage will be executed, which also guarantees
- the execution of teardown.
+ """A record that stores the result of a setup or a teardown.
+
+ :attr:`~Result.FAIL` is a sensible default since it prevents false positives (which could happen
+ if the default was :attr:`~Result.PASS`).
+
+ Preventing false positives or other false results is preferable since a failure
+ is most likely to be investigated (the other false results may not be investigated at all).
+
+ Attributes:
+ result: The associated result.
+ error: The error in case of a failure.
"""
result: Result
@@ -56,21 +81,37 @@ def __init__(
result: Result = Result.FAIL,
error: Exception | None = None,
):
+ """Initialize the constructor with the fixture result and store a possible error.
+
+ Args:
+ result: The result to store.
+ error: The error which happened when a failure occurred.
+ """
self.result = result
self.error = error
def __bool__(self) -> bool:
+ """A wrapper around the stored :class:`Result`."""
return bool(self.result)
class Statistics(dict):
- """
- A helper class used to store the number of test cases by its result
- along a few other basic information.
- Using a dict provides a convenient way to format the data.
+ """How many test cases ended in which result state along some other basic information.
+
+ Subclassing :class:`dict` provides a convenient way to format the data.
+
+ The data are stored in the following keys:
+
+ * **PASS RATE** (:class:`int`) -- The percentage of test cases that passed.
+ * **DPDK VERSION** (:class:`str`) -- The tested DPDK version.
"""
def __init__(self, dpdk_version: str | None):
+ """Extend the constructor with keys in which the data are stored.
+
+ Args:
+ dpdk_version: The version of tested DPDK.
+ """
super(Statistics, self).__init__()
for result in Result:
self[result.name] = 0
@@ -78,8 +119,17 @@ def __init__(self, dpdk_version: str | None):
self["DPDK VERSION"] = dpdk_version
def __iadd__(self, other: Result) -> "Statistics":
- """
- Add a Result to the final count.
+ """Add a Result to the final count.
+
+ Example:
+ stats: Statistics = Statistics() # empty Statistics
+ stats += Result.PASS # add a Result to `stats`
+
+ Args:
+ other: The Result to add to this statistics object.
+
+ Returns:
+ The modified statistics object.
"""
self[other.name] += 1
self["PASS RATE"] = (
@@ -88,9 +138,7 @@ def __iadd__(self, other: Result) -> "Statistics":
return self
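
[Editorial note, not part of the patch: a self-contained sketch of the `Statistics` dict subclass and its `__iadd__` hook; the pass-rate formula shown (pass count × 100 / total) is an assumption, since the diff truncates the real expression:]

```python
from enum import Enum, auto


class Result(Enum):
    PASS = auto()
    FAIL = auto()


class Statistics(dict):
    """Count results per state; 'stats += result' updates the counts."""

    def __init__(self):
        super().__init__()
        for result in Result:
            self[result.name] = 0

    def __iadd__(self, other: Result) -> "Statistics":
        self[other.name] += 1
        total = sum(self[r.name] for r in Result)
        # Assumed formula: percentage of PASS among all recorded results.
        self["PASS RATE"] = self[Result.PASS.name] * 100 / total
        return self


stats = Statistics()
stats += Result.PASS
stats += Result.FAIL
assert stats["PASS"] == 1 and stats["FAIL"] == 1
assert stats["PASS RATE"] == 50.0
```

Subclassing `dict` is what makes the per-key formatting in `__str__` a simple loop over `self.items()`.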
def __str__(self) -> str:
- """
- Provide a string representation of the data.
- """
+ """Each line contains the formatted key = value pair."""
stats_str = ""
for key, value in self.items():
stats_str += f"{key:<12} = {value}\n"
@@ -100,10 +148,16 @@ def __str__(self) -> str:
class BaseResult(object):
- """
- The Base class for all results. Stores the results of
- the setup and teardown portions of the corresponding stage
- and a list of results from each inner stage in _inner_results.
+ """Common data and behavior of DTS results.
+
+ Stores the results of the setup and teardown portions of the corresponding stage.
+ The hierarchical nature of DTS results is captured recursively in an internal list.
+ A stage is each level in this particular hierarchy (pre-execution or the top-most level,
+ execution, build target, test suite and test case).
+
+ Attributes:
+ setup_result: The result of the setup of the particular stage.
+ teardown_result: The results of the teardown of the particular stage.
"""
setup_result: FixtureResult
@@ -111,15 +165,28 @@ class BaseResult(object):
_inner_results: MutableSequence["BaseResult"]
def __init__(self):
+ """Initialize the constructor."""
self.setup_result = FixtureResult()
self.teardown_result = FixtureResult()
self._inner_results = []
def update_setup(self, result: Result, error: Exception | None = None) -> None:
+ """Store the setup result.
+
+ Args:
+ result: The result of the setup.
+ error: The error that occurred in case of a failure.
+ """
self.setup_result.result = result
self.setup_result.error = error
def update_teardown(self, result: Result, error: Exception | None = None) -> None:
+ """Store the teardown result.
+
+ Args:
+ result: The result of the teardown.
+ error: The error that occurred in case of a failure.
+ """
self.teardown_result.result = result
self.teardown_result.error = error
@@ -137,27 +204,55 @@ def _get_inner_errors(self) -> list[Exception]:
]
def get_errors(self) -> list[Exception]:
+ """Compile errors from the whole result hierarchy.
+
+ Returns:
+ The errors from setup, teardown and all errors found in the whole result hierarchy.
+ """
return self._get_setup_teardown_errors() + self._get_inner_errors()
def add_stats(self, statistics: Statistics) -> None:
+ """Collate stats from the whole result hierarchy.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be collated.
+ """
for inner_result in self._inner_results:
inner_result.add_stats(statistics)
class TestCaseResult(BaseResult, FixtureResult):
- """
- The test case specific result.
- Stores the result of the actual test case.
- Also stores the test case name.
+ r"""The test case specific result.
+
+ Stores the result of the actual test case. This is done by adding an extra superclass
+ in :class:`FixtureResult`. The setup and teardown results are :class:`FixtureResult`\s and
+ the class is itself a record of the test case.
+
+ Attributes:
+ test_case_name: The test case name.
"""
test_case_name: str
def __init__(self, test_case_name: str):
+ """Extend the constructor with `test_case_name`.
+
+ Args:
+ test_case_name: The test case's name.
+ """
super(TestCaseResult, self).__init__()
self.test_case_name = test_case_name
def update(self, result: Result, error: Exception | None = None) -> None:
+ """Update the test case result.
+
+ This updates the result of the test case itself and doesn't affect
+ the results of the setup and teardown steps in any way.
+
+ Args:
+ result: The result of the test case.
+ error: The error that occurred in case of a failure.
+ """
self.result = result
self.error = error
@@ -167,36 +262,64 @@ def _get_inner_errors(self) -> list[Exception]:
return []
def add_stats(self, statistics: Statistics) -> None:
+ r"""Add the test case result to statistics.
+
+ The base method goes through the hierarchy recursively and this method is here to stop
+ the recursion, as the :class:`TestCaseResult`\s are the leaves of the hierarchy tree.
+
+ Args:
+ statistics: The :class:`Statistics` object where the stats will be added.
+ """
statistics += self.result
def __bool__(self) -> bool:
+ """The test case passed only if setup, teardown and the test case itself passed."""
return bool(self.setup_result) and bool(self.teardown_result) and bool(self.result)
class TestSuiteResult(BaseResult):
- """
- The test suite specific result.
- The _inner_results list stores results of test cases in a given test suite.
- Also stores the test suite name.
+ """The test suite specific result.
+
+ The internal list stores the results of all test cases in a given test suite.
+
+ Attributes:
+ suite_name: The test suite name.
"""
suite_name: str
def __init__(self, suite_name: str):
+ """Extend the constructor with `suite_name`.
+
+ Args:
+ suite_name: The test suite's name.
+ """
super(TestSuiteResult, self).__init__()
self.suite_name = suite_name
def add_test_case(self, test_case_name: str) -> TestCaseResult:
+ """Add and return the inner result (test case).
+
+ Returns:
+ The test case's result.
+ """
test_case_result = TestCaseResult(test_case_name)
self._inner_results.append(test_case_result)
return test_case_result
class BuildTargetResult(BaseResult):
- """
- The build target specific result.
- The _inner_results list stores results of test suites in a given build target.
- Also stores build target specifics, such as compiler used to build DPDK.
+ """The build target specific result.
+
+ The internal list stores the results of all test suites in a given build target.
+
+ Attributes:
+ arch: The DPDK build target architecture.
+ os: The DPDK build target operating system.
+ cpu: The DPDK build target CPU.
+ compiler: The DPDK build target compiler.
+ compiler_version: The DPDK build target compiler version.
+ dpdk_version: The built DPDK version.
"""
arch: Architecture
@@ -207,6 +330,11 @@ class BuildTargetResult(BaseResult):
dpdk_version: str | None
def __init__(self, build_target: BuildTargetConfiguration):
+ """Extend the constructor with the `build_target`'s build target config.
+
+ Args:
+ build_target: The build target's test run configuration.
+ """
super(BuildTargetResult, self).__init__()
self.arch = build_target.arch
self.os = build_target.os
@@ -216,20 +344,35 @@ def __init__(self, build_target: BuildTargetConfiguration):
self.dpdk_version = None
def add_build_target_info(self, versions: BuildTargetInfo) -> None:
+ """Add information about the build target gathered at runtime.
+
+ Args:
+ versions: The additional information.
+ """
self.compiler_version = versions.compiler_version
self.dpdk_version = versions.dpdk_version
def add_test_suite(self, test_suite_name: str) -> TestSuiteResult:
+ """Add and return the inner result (test suite).
+
+ Returns:
+ The test suite's result.
+ """
test_suite_result = TestSuiteResult(test_suite_name)
self._inner_results.append(test_suite_result)
return test_suite_result
class ExecutionResult(BaseResult):
- """
- The execution specific result.
- The _inner_results list stores results of build targets in a given execution.
- Also stores the SUT node configuration.
+ """The execution specific result.
+
+ The internal list stores the results of all build targets in a given execution.
+
+ Attributes:
+ sut_node: The SUT node used in the execution.
+ sut_os_name: The operating system of the SUT node.
+ sut_os_version: The operating system version of the SUT node.
+ sut_kernel_version: The operating system kernel version of the SUT node.
"""
sut_node: NodeConfiguration
@@ -238,34 +381,53 @@ class ExecutionResult(BaseResult):
sut_kernel_version: str
def __init__(self, sut_node: NodeConfiguration):
+ """Extend the constructor with the `sut_node`'s config.
+
+ Args:
+ sut_node: The SUT node's test run configuration used in the execution.
+ """
super(ExecutionResult, self).__init__()
self.sut_node = sut_node
def add_build_target(self, build_target: BuildTargetConfiguration) -> BuildTargetResult:
+ """Add and return the inner result (build target).
+
+ Args:
+ build_target: The build target's test run configuration.
+
+ Returns:
+ The build target's result.
+ """
build_target_result = BuildTargetResult(build_target)
self._inner_results.append(build_target_result)
return build_target_result
def add_sut_info(self, sut_info: NodeInfo) -> None:
+ """Add SUT information gathered at runtime.
+
+ Args:
+ sut_info: The additional SUT node information.
+ """
self.sut_os_name = sut_info.os_name
self.sut_os_version = sut_info.os_version
self.sut_kernel_version = sut_info.kernel_version
class DTSResult(BaseResult):
- """
- Stores environment information and test results from a DTS run, which are:
- * Execution level information, such as SUT and TG hardware.
- * Build target level information, such as compiler, target OS and cpu.
- * Test suite results.
- * All errors that are caught and recorded during DTS execution.
+ """Stores environment information and test results from a DTS run.
- The information is stored in nested objects.
+ * Execution level information, such as testbed and the test suite list,
+ * Build target level information, such as compiler, target OS and cpu,
+ * Test suite and test case results,
+ * All errors that are caught and recorded during DTS execution.
- The class is capable of computing the return code used to exit DTS with
- from the stored error.
+ The information is stored hierarchically. This is the first level of the hierarchy
+ and as such is where the data from the whole hierarchy is collated or processed.
- It also provides a brief statistical summary of passed/failed test cases.
+ The internal list stores the results of all executions.
+
+ Attributes:
+ dpdk_version: The DPDK version to record.
"""
dpdk_version: str | None
@@ -276,6 +438,11 @@ class DTSResult(BaseResult):
_stats_filename: str
def __init__(self, logger: DTSLOG):
+ """Extend the constructor with top-level specifics.
+
+ Args:
+ logger: The logger instance the whole result will use.
+ """
super(DTSResult, self).__init__()
self.dpdk_version = None
self._logger = logger
@@ -285,21 +452,33 @@ def __init__(self, logger: DTSLOG):
self._stats_filename = os.path.join(SETTINGS.output_dir, "statistics.txt")
def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
+ """Add and return the inner result (execution).
+
+ Args:
+ sut_node: The SUT node's test run configuration.
+
+ Returns:
+ The execution's result.
+ """
execution_result = ExecutionResult(sut_node)
self._inner_results.append(execution_result)
return execution_result
def add_error(self, error: Exception) -> None:
+ """Record an error that occurred outside any execution.
+
+ Args:
+ error: The exception to record.
+ """
self._errors.append(error)
def process(self) -> None:
- """
- Process the data after a DTS run.
- The data is added to nested objects during runtime and this parent object
- is not updated at that time. This requires us to process the nested data
- after it's all been gathered.
+ """Process the data after a whole DTS run.
+
+ The data is added to inner objects during runtime and this object is not updated
+ at that time. This requires us to process the inner data after it's all been gathered.
- The processing gathers all errors and the result statistics of test cases.
+ The processing gathers all errors and the statistics of test case results.
"""
self._errors += self.get_errors()
if self._errors and self._logger:
@@ -313,8 +492,10 @@ def process(self) -> None:
stats_file.write(str(self._stats_result))
def get_return_code(self) -> int:
- """
- Go through all stored Exceptions and return the highest error code found.
+ """Go through all stored Exceptions and return the final DTS error code.
+
+ Returns:
+ The highest error code found.
"""
for error in self._errors:
error_return_code = ErrorSeverity.GENERIC_ERR
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
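The `get_return_code` logic described in the patch above, walking all stored exceptions and returning the highest error code found, can be sketched as follows. The severity values and the `severity` attribute are hypothetical stand-ins for DTS's actual `ErrorSeverity` enum, not the real API:

```python
from enum import IntEnum


class ErrorSeverity(IntEnum):
    """Hypothetical severity levels; DTS defines its own set."""

    NO_ERR = 0
    GENERIC_ERR = 1
    SSH_ERR = 2


def get_return_code(errors: list[Exception]) -> int:
    # Walk all recorded errors and keep the highest severity found,
    # mirroring the aggregation described in DTSResult.get_return_code().
    return_code = ErrorSeverity.NO_ERR
    for error in errors:
        # Exceptions without an explicit severity count as generic errors.
        severity = getattr(error, "severity", ErrorSeverity.GENERIC_ERR)
        if severity > return_code:
            return_code = severity
    return int(return_code)
```

The aggregation means a single severe error anywhere in the run dominates the process exit code, which is what makes the result usable in CI pipelines.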
* [PATCH v9 10/21] dts: config docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (8 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 09/21] dts: test result " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 11/21] dts: remote session " Juraj Linkeš
` (11 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
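For readers unfamiliar with the convention, a minimal Google-format docstring following PEP 257 (the function itself is an illustrative stand-in, not code from DTS) looks like this:

```python
def add_port(node: str, pci: str, ports: dict[str, list[str]]) -> list[str]:
    """Register `pci` under `node` and return the node's port list.

    A longer description may follow the summary line,
    separated by a blank line, per PEP 257.

    Args:
        node: The name of the node the port belongs to.
        pci: The PCI address of the port.
        ports: A mapping of node names to their registered ports.

    Returns:
        The updated list of PCI addresses registered for `node`.
    """
    # setdefault creates the node's list on first use.
    ports.setdefault(node, []).append(pci)
    return ports[node]
```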
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/config/__init__.py | 369 ++++++++++++++++++++++++++-----
dts/framework/config/types.py | 132 +++++++++++
2 files changed, 444 insertions(+), 57 deletions(-)
create mode 100644 dts/framework/config/types.py
diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index ef25a463c0..62eded7f04 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -3,8 +3,34 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-Yaml config parsing methods
+"""Testbed configuration and test suite specification.
+
+This package offers classes that hold real-time information about the testbed, hold test run
+configuration describing the tested testbed and a loader function, :func:`load_config`, which loads
+the YAML test run configuration file
+and validates it according to :download:`the schema <conf_yaml_schema.json>`.
+
+The YAML test run configuration file is parsed into a dictionary, parts of which are used throughout
+this package. The allowed keys and types inside this dictionary are defined in
+the :doc:`types <framework.config.types>` module.
+
+The test run configuration has two main sections:
+
+ * The :class:`ExecutionConfiguration` which defines what tests are going to be run
+ and how DPDK will be built. It also references the testbed where these tests and DPDK
+ are going to be run,
+ * The nodes of the testbed are defined in the other section,
+ a :class:`list` of :class:`NodeConfiguration` objects.
+
+The real-time information about the testbed is supposed to be gathered at runtime.
+
+The classes defined in this package make heavy use of :mod:`dataclasses`.
+All of them use slots and are frozen:
+
+ * Slots enable some optimizations by pre-allocating space for the defined
+ attributes in the underlying data structure,
+ * Frozen makes the object immutable. This enables further optimizations,
+ and makes it thread safe should we ever want to move in that direction.
"""
import json
@@ -12,11 +38,20 @@
import pathlib
from dataclasses import dataclass
from enum import auto, unique
-from typing import Any, TypedDict, Union
+from typing import Union
import warlock # type: ignore[import]
import yaml
+from framework.config.types import (
+ BuildTargetConfigDict,
+ ConfigurationDict,
+ ExecutionConfigDict,
+ NodeConfigDict,
+ PortConfigDict,
+ TestSuiteConfigDict,
+ TrafficGeneratorConfigDict,
+)
from framework.exception import ConfigurationError
from framework.settings import SETTINGS
from framework.utils import StrEnum
@@ -24,55 +59,97 @@
@unique
class Architecture(StrEnum):
+ r"""The supported architectures of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
i686 = auto()
+ #:
x86_64 = auto()
+ #:
x86_32 = auto()
+ #:
arm64 = auto()
+ #:
ppc64le = auto()
@unique
class OS(StrEnum):
+ r"""The supported operating systems of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
linux = auto()
+ #:
freebsd = auto()
+ #:
windows = auto()
@unique
class CPUType(StrEnum):
+ r"""The supported CPUs of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
native = auto()
+ #:
armv8a = auto()
+ #:
dpaa2 = auto()
+ #:
thunderx = auto()
+ #:
xgene1 = auto()
@unique
class Compiler(StrEnum):
+ r"""The supported compilers of :class:`~framework.testbed_model.node.Node`\s."""
+
+ #:
gcc = auto()
+ #:
clang = auto()
+ #:
icc = auto()
+ #:
msvc = auto()
@unique
class TrafficGeneratorType(StrEnum):
+ """The supported traffic generators."""
+
+ #:
SCAPY = auto()
-# Slots enables some optimizations, by pre-allocating space for the defined
-# attributes in the underlying data structure.
-#
-# Frozen makes the object immutable. This enables further optimizations,
-# and makes it thread safe should we every want to move in that direction.
@dataclass(slots=True, frozen=True)
class HugepageConfiguration:
+ r"""The hugepage configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ amount: The number of hugepages.
+ force_first_numa: If :data:`True`, the hugepages will be configured on the first NUMA node.
+ """
+
amount: int
force_first_numa: bool
@dataclass(slots=True, frozen=True)
class PortConfig:
+ r"""The port configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ node: The :class:`~framework.testbed_model.node.Node` where this port exists.
+ pci: The PCI address of the port.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK.
+ os_driver: The operating system driver name when the operating system controls the port.
+ peer_node: The :class:`~framework.testbed_model.node.Node` of the port
+ connected to this port.
+ peer_pci: The PCI address of the port connected to this port.
+ """
+
node: str
pci: str
os_driver_for_dpdk: str
@@ -81,18 +158,44 @@ class PortConfig:
peer_pci: str
@staticmethod
- def from_dict(node: str, d: dict) -> "PortConfig":
+ def from_dict(node: str, d: PortConfigDict) -> "PortConfig":
+ """A convenience method that creates the object from fewer inputs.
+
+ Args:
+ node: The node where this port exists.
+ d: The configuration dictionary.
+
+ Returns:
+ The port configuration instance.
+ """
return PortConfig(node=node, **d)
@dataclass(slots=True, frozen=True)
class TrafficGeneratorConfig:
+ """The configuration of traffic generators.
+
+ The class will be expanded when more configuration is needed.
+
+ Attributes:
+ traffic_generator_type: The type of the traffic generator.
+ """
+
traffic_generator_type: TrafficGeneratorType
@staticmethod
- def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
- # This looks useless now, but is designed to allow expansion to traffic
- # generators that require more configuration later.
+ def from_dict(d: TrafficGeneratorConfigDict) -> "ScapyTrafficGeneratorConfig":
+ """A convenience method that produces traffic generator config of the proper type.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The traffic generator configuration instance.
+
+ Raises:
+ ConfigurationError: An unknown traffic generator type was encountered.
+ """
match TrafficGeneratorType(d["type"]):
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGeneratorConfig(
@@ -104,11 +207,31 @@ def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
@dataclass(slots=True, frozen=True)
class ScapyTrafficGeneratorConfig(TrafficGeneratorConfig):
+ """Scapy traffic generator specific configuration."""
+
pass
@dataclass(slots=True, frozen=True)
class NodeConfiguration:
+ r"""The configuration of :class:`~framework.testbed_model.node.Node`\s.
+
+ Attributes:
+ name: The name of the :class:`~framework.testbed_model.node.Node`.
+ hostname: The hostname of the :class:`~framework.testbed_model.node.Node`.
+ Can be an IP or a domain name.
+ user: The name of the user used to connect to
+ the :class:`~framework.testbed_model.node.Node`.
+ password: The password of the user. The use of passwords is heavily discouraged.
+ Please use keys instead.
+ arch: The architecture of the :class:`~framework.testbed_model.node.Node`.
+ os: The operating system of the :class:`~framework.testbed_model.node.Node`.
+ lcores: A comma delimited list of logical cores to use when running DPDK.
+ use_first_core: If :data:`True`, the first logical core won't be used.
+ hugepages: An optional hugepage configuration.
+ ports: The ports that can be used in testing.
+ """
+
name: str
hostname: str
user: str
@@ -121,55 +244,89 @@ class NodeConfiguration:
ports: list[PortConfig]
@staticmethod
- def from_dict(d: dict) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
- hugepage_config = d.get("hugepages")
- if hugepage_config:
- if "force_first_numa" not in hugepage_config:
- hugepage_config["force_first_numa"] = False
- hugepage_config = HugepageConfiguration(**hugepage_config)
-
- common_config = {
- "name": d["name"],
- "hostname": d["hostname"],
- "user": d["user"],
- "password": d.get("password"),
- "arch": Architecture(d["arch"]),
- "os": OS(d["os"]),
- "lcores": d.get("lcores", "1"),
- "use_first_core": d.get("use_first_core", False),
- "hugepages": hugepage_config,
- "ports": [PortConfig.from_dict(d["name"], port) for port in d["ports"]],
- }
-
+ def from_dict(
+ d: NodeConfigDict,
+ ) -> Union["SutNodeConfiguration", "TGNodeConfiguration"]:
+ """A convenience method that processes the inputs before creating a specialized instance.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ Either an SUT or TG configuration instance.
+ """
+ hugepage_config = None
+ if "hugepages" in d:
+ hugepage_config_dict = d["hugepages"]
+ if "force_first_numa" not in hugepage_config_dict:
+ hugepage_config_dict["force_first_numa"] = False
+ hugepage_config = HugepageConfiguration(**hugepage_config_dict)
+
+ # The calls here contain duplicated code because Mypy doesn't
+ # properly support dictionary unpacking with TypedDicts
if "traffic_generator" in d:
return TGNodeConfiguration(
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
traffic_generator=TrafficGeneratorConfig.from_dict(d["traffic_generator"]),
- **common_config,
)
else:
return SutNodeConfiguration(
- memory_channels=d.get("memory_channels", 1), **common_config
+ name=d["name"],
+ hostname=d["hostname"],
+ user=d["user"],
+ password=d.get("password"),
+ arch=Architecture(d["arch"]),
+ os=OS(d["os"]),
+ lcores=d.get("lcores", "1"),
+ use_first_core=d.get("use_first_core", False),
+ hugepages=hugepage_config,
+ ports=[PortConfig.from_dict(d["name"], port) for port in d["ports"]],
+ memory_channels=d.get("memory_channels", 1),
)
@dataclass(slots=True, frozen=True)
class SutNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.sut_node.SutNode` specific configuration.
+
+ Attributes:
+ memory_channels: The number of memory channels to use when running DPDK.
+ """
+
memory_channels: int
@dataclass(slots=True, frozen=True)
class TGNodeConfiguration(NodeConfiguration):
+ """:class:`~framework.testbed_model.tg_node.TGNode` specific configuration.
+
+ Attributes:
+ traffic_generator: The configuration of the traffic generator present on the TG node.
+ """
+
traffic_generator: ScapyTrafficGeneratorConfig
@dataclass(slots=True, frozen=True)
class NodeInfo:
- """Class to hold important versions within the node.
-
- This class, unlike the NodeConfiguration class, cannot be generated at the start.
- This is because we need to initialize a connection with the node before we can
- collect the information needed in this class. Therefore, it cannot be a part of
- the configuration class above.
+ """Supplemental node information.
+
+ Attributes:
+ os_name: The name of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ os_version: The version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
+ kernel_version: The kernel version of the running operating system of
+ the :class:`~framework.testbed_model.node.Node`.
"""
os_name: str
@@ -179,6 +336,20 @@ class NodeInfo:
@dataclass(slots=True, frozen=True)
class BuildTargetConfiguration:
+ """DPDK build configuration.
+
+ The configuration used for building DPDK.
+
+ Attributes:
+ arch: The target architecture to build for.
+ os: The target os to build for.
+ cpu: The target CPU to build for.
+ compiler: The compiler executable to use.
+ compiler_wrapper: This string will be put in front of the compiler when
+ executing the build. Useful for adding wrapper commands, such as ``ccache``.
+ name: The name of the build target, constructed from `arch`, `os`, `cpu` and `compiler`.
+ """
+
arch: Architecture
os: OS
cpu: CPUType
@@ -187,7 +358,18 @@ class BuildTargetConfiguration:
name: str
@staticmethod
- def from_dict(d: dict) -> "BuildTargetConfiguration":
+ def from_dict(d: BuildTargetConfigDict) -> "BuildTargetConfiguration":
+ r"""A convenience method that processes the inputs before creating an instance.
+
+ `arch`, `os`, `cpu` and `compiler` are converted to :class:`Enum`\s and
+ `name` is constructed from `arch`, `os`, `cpu` and `compiler`.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The build target configuration instance.
+ """
return BuildTargetConfiguration(
arch=Architecture(d["arch"]),
os=OS(d["os"]),
@@ -200,23 +382,29 @@ def from_dict(d: dict) -> "BuildTargetConfiguration":
@dataclass(slots=True, frozen=True)
class BuildTargetInfo:
- """Class to hold important versions within the build target.
+ """Various versions and other information about a build target.
- This is very similar to the NodeInfo class, it just instead holds information
- for the build target.
+ Attributes:
+ dpdk_version: The DPDK version that was built.
+ compiler_version: The version of the compiler used to build DPDK.
"""
dpdk_version: str
compiler_version: str
-class TestSuiteConfigDict(TypedDict):
- suite: str
- cases: list[str]
-
-
@dataclass(slots=True, frozen=True)
class TestSuiteConfig:
+ """Test suite configuration.
+
+ Information about a single test suite to be executed.
+
+ Attributes:
+ test_suite: The name of the test suite module without the starting ``TestSuite_``.
+ test_cases: The names of test cases from this test suite to execute.
+ If empty, all test cases will be executed.
+ """
+
test_suite: str
test_cases: list[str]
@@ -224,6 +412,14 @@ class TestSuiteConfig:
def from_dict(
entry: str | TestSuiteConfigDict,
) -> "TestSuiteConfig":
+ """Create an instance from two different types.
+
+ Args:
+ entry: Either a suite name or a dictionary containing the config.
+
+ Returns:
+ The test suite configuration instance.
+ """
if isinstance(entry, str):
return TestSuiteConfig(test_suite=entry, test_cases=[])
elif isinstance(entry, dict):
@@ -234,19 +430,49 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class ExecutionConfiguration:
+ """The configuration of an execution.
+
+ The configuration contains testbed information, what tests to execute
+ and with what DPDK build.
+
+ Attributes:
+ build_targets: A list of DPDK builds to test.
+ perf: Whether to run performance tests.
+ func: Whether to run functional tests.
+ skip_smoke_tests: Whether to skip smoke tests.
+ test_suites: The names of test suites and/or test cases to execute.
+ system_under_test_node: The SUT node to use in this execution.
+ traffic_generator_node: The TG node to use in this execution.
+ vdevs: The names of virtual devices to test.
+ """
+
build_targets: list[BuildTargetConfiguration]
perf: bool
func: bool
+ skip_smoke_tests: bool
test_suites: list[TestSuiteConfig]
system_under_test_node: SutNodeConfiguration
traffic_generator_node: TGNodeConfiguration
vdevs: list[str]
- skip_smoke_tests: bool
@staticmethod
def from_dict(
- d: dict, node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]]
+ d: ExecutionConfigDict,
+ node_map: dict[str, Union[SutNodeConfiguration | TGNodeConfiguration]],
) -> "ExecutionConfiguration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The build target and the test suite config are transformed into their respective objects.
+ SUT and TG configurations are taken from `node_map`. The other (:class:`bool`) attributes
+ are just stored.
+
+ Args:
+ d: The configuration dictionary.
+ node_map: A dictionary mapping node names to their config objects.
+
+ Returns:
+ The execution configuration instance.
+ """
build_targets: list[BuildTargetConfiguration] = list(
map(BuildTargetConfiguration.from_dict, d["build_targets"])
)
@@ -283,10 +509,31 @@ def from_dict(
@dataclass(slots=True, frozen=True)
class Configuration:
+ """DTS testbed and test configuration.
+
+ The node configuration is not stored in this object. Rather, all used node configurations
+ are stored inside the execution configuration where the nodes are actually used.
+
+ Attributes:
+ executions: Execution configurations.
+ """
+
executions: list[ExecutionConfiguration]
@staticmethod
- def from_dict(d: dict) -> "Configuration":
+ def from_dict(d: ConfigurationDict) -> "Configuration":
+ """A convenience method that processes the inputs before creating an instance.
+
+ The node configurations are parsed first, then the execution configurations,
+ which reference the parsed nodes by name.
+
+ Args:
+ d: The configuration dictionary.
+
+ Returns:
+ The whole configuration instance.
+ """
nodes: list[Union[SutNodeConfiguration | TGNodeConfiguration]] = list(
map(NodeConfiguration.from_dict, d["nodes"])
)
@@ -303,9 +550,17 @@ def from_dict(d: dict) -> "Configuration":
def load_config() -> Configuration:
- """
- Loads the configuration file and the configuration file schema,
- validates the configuration file, and creates a configuration object.
+ """Load DTS test run configuration from a file.
+
+ Load the YAML test run configuration file
+ and :download:`the configuration file schema <conf_yaml_schema.json>`,
+ validate the test run configuration file, and create a test run configuration object.
+
+ The YAML test run configuration file is specified in the :option:`--config-file` command line
+ argument or the :envvar:`DTS_CFG_FILE` environment variable.
+
+ Returns:
+ The parsed test run configuration.
"""
with open(SETTINGS.config_file_path, "r") as f:
config_data = yaml.safe_load(f)
@@ -314,6 +569,6 @@ def load_config() -> Configuration:
with open(schema_path, "r") as f:
schema = json.load(f)
- config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
- config_obj: Configuration = Configuration.from_dict(dict(config))
+ config = warlock.model_factory(schema, name="_Config")(config_data)
+ config_obj: Configuration = Configuration.from_dict(dict(config)) # type: ignore[arg-type]
return config_obj
diff --git a/dts/framework/config/types.py b/dts/framework/config/types.py
new file mode 100644
index 0000000000..1927910d88
--- /dev/null
+++ b/dts/framework/config/types.py
@@ -0,0 +1,132 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+"""Configuration dictionary contents specification.
+
+These type definitions serve as documentation of the configuration dictionary contents.
+
+The definitions use the built-in :class:`~typing.TypedDict` construct.
+"""
+
+from typing import TypedDict
+
+
+class PortConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ pci: str
+ #:
+ os_driver_for_dpdk: str
+ #:
+ os_driver: str
+ #:
+ peer_node: str
+ #:
+ peer_pci: str
+
+
+class TrafficGeneratorConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ type: str
+
+
+class HugepageConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ amount: int
+ #:
+ force_first_numa: bool
+
+
+class NodeConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ hugepages: HugepageConfigurationDict
+ #:
+ name: str
+ #:
+ hostname: str
+ #:
+ user: str
+ #:
+ password: str
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ lcores: str
+ #:
+ use_first_core: bool
+ #:
+ ports: list[PortConfigDict]
+ #:
+ memory_channels: int
+ #:
+ traffic_generator: TrafficGeneratorConfigDict
+
+
+class BuildTargetConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ arch: str
+ #:
+ os: str
+ #:
+ cpu: str
+ #:
+ compiler: str
+ #:
+ compiler_wrapper: str
+
+
+class TestSuiteConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ suite: str
+ #:
+ cases: list[str]
+
+
+class ExecutionSUTConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ node_name: str
+ #:
+ vdevs: list[str]
+
+
+class ExecutionConfigDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ build_targets: list[BuildTargetConfigDict]
+ #:
+ perf: bool
+ #:
+ func: bool
+ #:
+ skip_smoke_tests: bool
+ #:
+ test_suites: TestSuiteConfigDict
+ #:
+ system_under_test_node: ExecutionSUTConfigDict
+ #:
+ traffic_generator_node: str
+
+
+class ConfigurationDict(TypedDict):
+ """Allowed keys and values."""
+
+ #:
+ nodes: list[NodeConfigDict]
+ #:
+ executions: list[ExecutionConfigDict]
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 11/21] dts: remote session docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (9 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 10/21] dts: config " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 12/21] dts: interactive " Juraj Linkeš
` (10 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
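The split this patch documents between the public `send_command` and the abstract `_send_command` is the template-method pattern: the base class handles history and verification, while subclasses supply the transport. A minimal runnable sketch with simplified, hypothetical signatures (not the real DTS API):

```python
from abc import ABC, abstractmethod


class RemoteSession(ABC):
    """Sketch of the template-method split used by the real class."""

    def __init__(self) -> None:
        # Keep a record of every command sent through this session.
        self.history: list[str] = []

    def send_command(self, command: str, verify: bool = False) -> int:
        """Record, delegate to the transport, and optionally verify."""
        self.history.append(command)
        return_code = self._send_command(command)
        if verify and return_code != 0:
            raise RuntimeError(f"Command '{command}' failed: {return_code}")
        return return_code

    @abstractmethod
    def _send_command(self, command: str) -> int:
        """Transport-specific execution (e.g. SSH) goes here."""


class FakeSession(RemoteSession):
    """A stand-in transport for illustration."""

    def _send_command(self, command: str) -> int:
        # Pretend every command except "bad" succeeds.
        return 0 if command != "bad" else 1
```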
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/__init__.py | 39 +++++-
.../remote_session/remote_session.py | 130 +++++++++++++-----
dts/framework/remote_session/ssh_session.py | 16 +--
3 files changed, 137 insertions(+), 48 deletions(-)
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 5e7ddb2b05..51a01d6b5e 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -2,12 +2,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
-"""
-The package provides modules for managing remote connections to a remote host (node),
-differentiated by OS.
-The package provides a factory function, create_session, that returns the appropriate
-remote connection based on the passed configuration. The differences are in the
-underlying transport protocol (e.g. SSH) and remote OS (e.g. Linux).
+"""Remote interactive and non-interactive sessions.
+
+This package provides modules for managing remote connections to a remote host (node).
+
+The non-interactive sessions send commands and return their output and exit code.
+
+The interactive sessions open an interactive shell which is continuously open,
+allowing it to send and receive data within that particular shell.
"""
# pylama:ignore=W0611
@@ -26,10 +28,35 @@
def create_remote_session(
node_config: NodeConfiguration, name: str, logger: DTSLOG
) -> RemoteSession:
+ """Factory for non-interactive remote sessions.
+
+ The function returns an SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The SSH remote session.
+ """
return SSHSession(node_config, name, logger)
def create_interactive_session(
node_config: NodeConfiguration, logger: DTSLOG
) -> InteractiveRemoteSession:
+ """Factory for interactive remote sessions.
+
+ The function returns an interactive SSH session, but will be extended if support
+ for other protocols is added.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+
+ Returns:
+ The interactive SSH remote session.
+ """
return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote_session.py b/dts/framework/remote_session/remote_session.py
index 719f7d1ef7..2059f9a981 100644
--- a/dts/framework/remote_session/remote_session.py
+++ b/dts/framework/remote_session/remote_session.py
@@ -3,6 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
+"""Base remote session.
+
+This module contains the abstract base class for remote sessions and defines
+the structure of the result of a command execution.
+"""
+
+
import dataclasses
from abc import ABC, abstractmethod
from pathlib import PurePath
@@ -15,8 +22,14 @@
@dataclasses.dataclass(slots=True, frozen=True)
class CommandResult:
- """
- The result of remote execution of a command.
+ """The result of remote execution of a command.
+
+ Attributes:
+ name: The name of the session that executed the command.
+ command: The executed command.
+ stdout: The standard output the command produced.
+ stderr: The standard error output the command produced.
+ return_code: The return code the command exited with.
"""
name: str
@@ -26,6 +39,7 @@ class CommandResult:
return_code: int
def __str__(self) -> str:
+ """Format the command outputs."""
return (
f"stdout: '{self.stdout}'\n"
f"stderr: '{self.stderr}'\n"
@@ -34,13 +48,24 @@ def __str__(self) -> str:
class RemoteSession(ABC):
- """
- The base class for defining which methods must be implemented in order to connect
- to a remote host (node) and maintain a remote session. The derived classes are
- supposed to implement/use some underlying transport protocol (e.g. SSH) to
- implement the methods. On top of that, it provides some basic services common to
- all derived classes, such as keeping history and logging what's being executed
- on the remote node.
+ """Non-interactive remote session.
+
+ The abstract methods must be implemented in order to connect to a remote host (node)
+ and maintain a remote session.
+ The subclasses must use (or implement) some underlying transport protocol (e.g. SSH)
+ to implement the methods. On top of that, it provides some basic services common to all
+ subclasses, such as keeping history and logging what's being executed on the remote node.
+
+ Attributes:
+ name: The name of the session.
+ hostname: The node's hostname. Could be an IP (possibly with port, separated by a colon)
+ or a domain name.
+ ip: The IP address of the node or a domain name, whichever was used in `hostname`.
+ port: The port of the node, if given in `hostname`.
+ username: The username used in the connection.
+ password: The password used in the connection. Most frequently empty,
+ as the use of passwords is discouraged.
+ history: The executed commands during this session.
"""
name: str
@@ -59,6 +84,16 @@ def __init__(
session_name: str,
logger: DTSLOG,
):
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ session_name: The name of the session.
+ logger: The logger instance this session will use.
+
+ Raises:
+ SSHConnectionError: If the connection to the node was not successful.
+ """
self._node_config = node_config
self.name = session_name
@@ -79,8 +114,13 @@ def __init__(
@abstractmethod
def _connect(self) -> None:
- """
- Create connection to assigned node.
+ """Create a connection to the node.
+
+ The implementation must assign the established session to self.session.
+
+ The implementation must catch all exceptions and convert them to an SSHConnectionError.
+
+ The implementation may optionally implement retry attempts.
"""
def send_command(
@@ -90,11 +130,24 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- Send a command to the connected node using optional env vars
- and return CommandResult.
- If verify is True, check the return code of the executed command
- and raise a RemoteCommandExecutionError if the command failed.
+ """Send `command` to the connected node.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds for `command` execution to complete.
+ verify: If :data:`True`, will check the exit code of `command`.
+ env: A dictionary with environment variables to be used with `command` execution.
+
+ Raises:
+ SSHSessionDeadError: If the session isn't alive when sending `command`.
+ SSHTimeoutError: If `command` execution timed out.
+ RemoteCommandExecutionError: If verify is :data:`True` and `command` execution failed.
+
+ Returns:
+ The output of the command along with the return code.
"""
self._logger.info(f"Sending: '{command}'" + (f" with env vars: '{env}'" if env else ""))
result = self._send_command(command, timeout, env)
@@ -111,29 +164,38 @@ def send_command(
@abstractmethod
def _send_command(self, command: str, timeout: float, env: dict | None) -> CommandResult:
- """
- Use the underlying protocol to execute the command using optional env vars
- and return CommandResult.
+ """Send a command to the connected node.
+
+ The implementation must execute the command remotely with `env` environment variables
+ and return the result.
+
+ The implementation must catch all exceptions and raise:
+
+ * SSHSessionDeadError if the session is not alive,
+ * SSHTimeoutError if the command execution times out.
"""
def close(self, force: bool = False) -> None:
- """
- Close the remote session and free all used resources.
+ """Close the remote session and free all used resources.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
"""
self._logger.logger_exit()
self._close(force)
@abstractmethod
def _close(self, force: bool = False) -> None:
- """
- Execute protocol specific steps needed to close the session properly.
+ """Protocol specific steps needed to close the session properly.
+
+ Args:
+ force: Force the closure of the connection. This may not clean up all resources.
+ This doesn't have to be implemented in the overridden method.
"""
@abstractmethod
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the remote session is still responding."""
@abstractmethod
def copy_from(
@@ -143,12 +205,12 @@ def copy_from(
) -> None:
"""Copy a file from the remote Node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote Node associated with this remote session
+ to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
- destination_file: a file or directory path on the local filesystem.
+ source_file: The file on the remote Node.
+ destination_file: A file or directory path on the local filesystem.
"""
@abstractmethod
@@ -159,10 +221,10 @@ def copy_to(
) -> None:
"""Copy a file from local filesystem to the remote Node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file` on the remote Node
+ associated with this remote session.
Args:
- source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ source_file: The file on the local filesystem.
+ destination_file: A file or directory path on the remote Node.
"""
diff --git a/dts/framework/remote_session/ssh_session.py b/dts/framework/remote_session/ssh_session.py
index a467033a13..782220092c 100644
--- a/dts/framework/remote_session/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -1,6 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""SSH remote session."""
+
import socket
import traceback
from pathlib import PurePath
@@ -26,13 +28,8 @@
class SSHSession(RemoteSession):
"""A persistent SSH connection to a remote Node.
- The connection is implemented with the Fabric Python library.
-
- Args:
- node_config: The configuration of the Node to connect to.
- session_name: The name of the session.
- logger: The logger used for logging.
- This should be passed from the parent OSSession.
+ The connection is implemented with
+ `the Fabric Python library <https://docs.fabfile.org/en/latest/>`_.
Attributes:
session: The underlying Fabric SSH connection.
@@ -78,6 +75,7 @@ def _connect(self) -> None:
raise SSHConnectionError(self.hostname, errors)
def is_alive(self) -> bool:
+ """Overrides :meth:`~.remote_session.RemoteSession.is_alive`."""
return self.session.is_connected
def _send_command(self, command: str, timeout: float, env: dict | None) -> CommandResult:
@@ -85,7 +83,7 @@ def _send_command(self, command: str, timeout: float, env: dict | None) -> Comma
Args:
command: The command to execute.
- timeout: Wait at most this many seconds for the execution to complete.
+ timeout: Wait at most this long in seconds for the command execution to complete.
env: Extra environment variables that will be used in command execution.
Raises:
@@ -110,6 +108,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_from`."""
self.session.get(str(source_file), str(destination_file))
def copy_to(
@@ -117,6 +116,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.remote_session.RemoteSession.copy_to`."""
self.session.put(str(source_file), str(destination_file))
def _close(self, force: bool = False) -> None:
--
2.34.1
* [PATCH v9 12/21] dts: interactive remote session docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (10 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 11/21] dts: remote session " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 13/21] dts: port and virtual device " Juraj Linkeš
` (9 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../interactive_remote_session.py | 36 +++----
.../remote_session/interactive_shell.py | 99 +++++++++++--------
dts/framework/remote_session/python_shell.py | 26 ++++-
dts/framework/remote_session/testpmd_shell.py | 59 +++++++++--
4 files changed, 150 insertions(+), 70 deletions(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 098ded1bb0..1cc82e3377 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -22,27 +22,23 @@
class InteractiveRemoteSession:
"""SSH connection dedicated to interactive applications.
- This connection is created using paramiko and is a persistent connection to the
- host. This class defines methods for connecting to the node and configures this
- connection to send "keep alive" packets every 30 seconds. Because paramiko attempts
- to use SSH keys to establish a connection first, providing a password is optional.
- This session is utilized by InteractiveShells and cannot be interacted with
- directly.
-
- Arguments:
- node_config: Configuration class for the node you are connecting to.
- _logger: Desired logger for this session to use.
+ The connection is created using `paramiko <https://docs.paramiko.org/en/latest/>`_
+ and is a persistent connection to the host. This class defines the methods for connecting
+ to the node and configures the connection to send "keep alive" packets every 30 seconds.
+ Because paramiko attempts to use SSH keys to establish a connection first, providing
+ a password is optional. This session is utilized by InteractiveShells
+ and cannot be interacted with directly.
Attributes:
- hostname: Hostname that will be used to initialize a connection to the node.
- ip: A subsection of hostname that removes the port for the connection if there
+ hostname: The hostname that will be used to initialize a connection to the node.
+ ip: A subsection of `hostname` that removes the port for the connection if there
is one. If there is no port, this will be the same as hostname.
- port: Port to use for the ssh connection. This will be extracted from the
- hostname if there is a port included, otherwise it will default to 22.
+ port: Port to use for the ssh connection. This will be extracted from `hostname`
+ if there is a port included, otherwise it will default to ``22``.
username: User to connect to the node with.
password: Password of the user connecting to the host. This will default to an
empty string if a password is not provided.
- session: Underlying paramiko connection.
+ session: The underlying paramiko connection.
Raises:
SSHConnectionError: There is an error creating the SSH connection.
@@ -58,9 +54,15 @@ class InteractiveRemoteSession:
_node_config: NodeConfiguration
_transport: Transport | None
- def __init__(self, node_config: NodeConfiguration, _logger: DTSLOG) -> None:
+ def __init__(self, node_config: NodeConfiguration, logger: DTSLOG) -> None:
+ """Connect to the node during initialization.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ logger: The logger instance this session will use.
+ """
self._node_config = node_config
- self._logger = _logger
+ self._logger = logger
self.hostname = node_config.hostname
self.username = node_config.user
self.password = node_config.password if node_config.password else ""
diff --git a/dts/framework/remote_session/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
index 4db19fb9b3..b158f963b6 100644
--- a/dts/framework/remote_session/interactive_shell.py
+++ b/dts/framework/remote_session/interactive_shell.py
@@ -3,18 +3,20 @@
"""Common functionality for interactive shell handling.
-This base class, InteractiveShell, is meant to be extended by other classes that
-contain functionality specific to that shell type. These derived classes will often
-modify things like the prompt to expect or the arguments to pass into the application,
-but still utilize the same method for sending a command and collecting output. How
-this output is handled however is often application specific. If an application needs
-elevated privileges to start it is expected that the method for gaining those
-privileges is provided when initializing the class.
+The base class, :class:`InteractiveShell`, is meant to be extended by subclasses that contain
+functionality specific to that shell type. These subclasses will often modify things like
+the prompt to expect or the arguments to pass into the application, but still utilize
+the same method for sending a command and collecting output. How this output is handled however
+is often application specific. If an application needs elevated privileges to start it is expected
+that the method for gaining those privileges is provided when initializing the class.
+
+The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+environment variable configure the timeout of getting the output from command execution.
"""
from abc import ABC
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from paramiko import Channel, SSHClient, channel # type: ignore[import]
@@ -30,28 +32,6 @@ class InteractiveShell(ABC):
and collecting input until reaching a certain prompt. All interactive applications
will use the same SSH connection, but each will create their own channel on that
session.
-
- Arguments:
- interactive_session: The SSH session dedicated to interactive shells.
- logger: Logger used for displaying information in the console.
- get_privileged_command: Method for modifying a command to allow it to use
- elevated privileges. If this is None, the application will not be started
- with elevated privileges.
- app_args: Command line arguments to be passed to the application on startup.
- timeout: Timeout used for the SSH channel that is dedicated to this interactive
- shell. This timeout is for collecting output, so if reading from the buffer
- and no output is gathered within the timeout, an exception is thrown.
-
- Attributes
- _default_prompt: Prompt to expect at the end of output when sending a command.
- This is often overridden by derived classes.
- _command_extra_chars: Extra characters to add to the end of every command
- before sending them. This is often overridden by derived classes and is
- most commonly an additional newline character.
- path: Path to the executable to start the interactive application.
- dpdk_app: Whether this application is a DPDK app. If it is, the build
- directory for DPDK on the node will be prepended to the path to the
- executable.
"""
_interactive_session: SSHClient
@@ -61,10 +41,22 @@ class InteractiveShell(ABC):
_logger: DTSLOG
_timeout: float
_app_args: str
- _default_prompt: str = ""
- _command_extra_chars: str = ""
- path: PurePath
- dpdk_app: bool = False
+
+ #: Prompt to expect at the end of output when sending a command.
+ #: This is often overridden by subclasses.
+ _default_prompt: ClassVar[str] = ""
+
+ #: Extra characters to add to the end of every command
+ #: before sending them. This is often overridden by subclasses and is
+ #: most commonly an additional newline character.
+ _command_extra_chars: ClassVar[str] = ""
+
+ #: Path to the executable to start the interactive application.
+ path: ClassVar[PurePath]
+
+ #: Whether this application is a DPDK app. If it is, the build directory
+ #: for DPDK on the node will be prepended to the path to the executable.
+ dpdk_app: ClassVar[bool] = False
def __init__(
self,
@@ -74,6 +66,19 @@ def __init__(
app_args: str = "",
timeout: float = SETTINGS.timeout,
) -> None:
+ """Create an SSH channel during initialization.
+
+ Args:
+ interactive_session: The SSH session dedicated to interactive shells.
+ logger: The logger instance this session will use.
+ get_privileged_command: A method for modifying a command to allow it to use
+ elevated privileges. If :data:`None`, the application will not be started
+ with elevated privileges.
+ app_args: The command line arguments to be passed to the application on startup.
+ timeout: The timeout used for the SSH channel that is dedicated to this interactive
+ shell. This timeout is for collecting output, so if reading from the buffer
+ and no output is gathered within the timeout, an exception is thrown.
+ """
self._interactive_session = interactive_session
self._ssh_channel = self._interactive_session.invoke_shell()
self._stdin = self._ssh_channel.makefile_stdin("w")
@@ -90,6 +95,10 @@ def _start_application(self, get_privileged_command: Callable[[str], str] | None
This method is often overridden by subclasses as their process for
starting may look different.
+
+ Args:
+ get_privileged_command: A function (but could be any callable) that produces
+ the version of the command with elevated privileges.
"""
start_command = f"{self.path} {self._app_args}"
if get_privileged_command is not None:
@@ -97,16 +106,24 @@ def _start_application(self, get_privileged_command: Callable[[str], str] | None
self.send_command(start_command)
def send_command(self, command: str, prompt: str | None = None) -> str:
- """Send a command and get all output before the expected ending string.
+ """Send `command` and get all output before the expected ending string.
Lines that expect input are not included in the stdout buffer, so they cannot
- be used for expect. For example, if you were prompted to log into something
- with a username and password, you cannot expect "username:" because it won't
- yet be in the stdout buffer. A workaround for this could be consuming an
- extra newline character to force the current prompt into the stdout buffer.
+ be used for expect.
+
+ Example:
+ If you were prompted to log into something with a username and password,
+ you cannot expect ``username:`` because it won't yet be in the stdout buffer.
+ A workaround for this could be consuming an extra newline character to force
+ the current `prompt` into the stdout buffer.
+
+ Args:
+ command: The command to send.
+ prompt: After sending the command, `send_command` will be expecting this string.
+ If :data:`None`, will use the class's default prompt.
Returns:
- All output in the buffer before expected string
+ All output in the buffer before expected string.
"""
self._logger.info(f"Sending: '{command}'")
if prompt is None:
@@ -124,8 +141,10 @@ def send_command(self, command: str, prompt: str | None = None) -> str:
return out
def close(self) -> None:
+ """Properly free all resources."""
self._stdin.close()
self._ssh_channel.close()
def __del__(self) -> None:
+ """Make sure the session is properly closed before deleting the object."""
self.close()
diff --git a/dts/framework/remote_session/python_shell.py b/dts/framework/remote_session/python_shell.py
index cc3ad48a68..ccfd3783e8 100644
--- a/dts/framework/remote_session/python_shell.py
+++ b/dts/framework/remote_session/python_shell.py
@@ -1,12 +1,32 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Python interactive shell.
+
+Typical usage example in a TestSuite::
+
+ from framework.remote_session import PythonShell
+ python_shell = self.tg_node.create_interactive_shell(
+ PythonShell, timeout=5, privileged=True
+ )
+ python_shell.send_command("print('Hello World')")
+ python_shell.close()
+"""
+
from pathlib import PurePath
+from typing import ClassVar
from .interactive_shell import InteractiveShell
class PythonShell(InteractiveShell):
- _default_prompt: str = ">>>"
- _command_extra_chars: str = "\n"
- path: PurePath = PurePath("python3")
+ """Python interactive shell."""
+
+ #: Python's prompt.
+ _default_prompt: ClassVar[str] = ">>>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
+
+ #: The Python executable.
+ path: ClassVar[PurePath] = PurePath("python3")
diff --git a/dts/framework/remote_session/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
index 08ac311016..0184cc2e71 100644
--- a/dts/framework/remote_session/testpmd_shell.py
+++ b/dts/framework/remote_session/testpmd_shell.py
@@ -1,41 +1,80 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+"""Testpmd interactive shell.
+
+Typical usage example in a TestSuite::
+
+ testpmd_shell = self.sut_node.create_interactive_shell(
+ TestPmdShell, privileged=True
+ )
+ devices = testpmd_shell.get_devices()
+ for device in devices:
+ print(device)
+ testpmd_shell.close()
+"""
from pathlib import PurePath
-from typing import Callable
+from typing import Callable, ClassVar
from .interactive_shell import InteractiveShell
class TestPmdDevice(object):
+ """The data of a device that testpmd can recognize.
+
+ Attributes:
+ pci_address: The PCI address of the device.
+ """
+
pci_address: str
def __init__(self, pci_address_line: str):
+ """Initialize the device from the testpmd output line string.
+
+ Args:
+ pci_address_line: A line of testpmd output that contains a device.
+ """
self.pci_address = pci_address_line.strip().split(": ")[1].strip()
def __str__(self) -> str:
+ """The PCI address captures what the device is."""
return self.pci_address
class TestPmdShell(InteractiveShell):
- path: PurePath = PurePath("app", "dpdk-testpmd")
- dpdk_app: bool = True
- _default_prompt: str = "testpmd>"
- _command_extra_chars: str = "\n" # We want to append an extra newline to every command
+ """Testpmd interactive shell.
+
+ The testpmd shell users should never use
+ the :meth:`~.interactive_shell.InteractiveShell.send_command` method directly, but rather
+ call specialized methods. If there isn't one that satisfies a need, it should be added.
+ """
+
+ #: The path to the testpmd executable.
+ path: ClassVar[PurePath] = PurePath("app", "dpdk-testpmd")
+
+ #: Flag this as a DPDK app so that it's clear this is not a system app and
+ #: needs to be looked up in a specific path.
+ dpdk_app: ClassVar[bool] = True
+
+ #: The testpmd's prompt.
+ _default_prompt: ClassVar[str] = "testpmd>"
+
+ #: This forces the prompt to appear after sending a command.
+ _command_extra_chars: ClassVar[str] = "\n"
def _start_application(self, get_privileged_command: Callable[[str], str] | None) -> None:
- """See "_start_application" in InteractiveShell."""
self._app_args += " -- -i"
super()._start_application(get_privileged_command)
def get_devices(self) -> list[TestPmdDevice]:
- """Get a list of device names that are known to testpmd
+ """Get a list of device names that are known to testpmd.
- Uses the device info listed in testpmd and then parses the output to
- return only the names of the devices.
+ Uses the device info listed in testpmd and then parses the output.
Returns:
- A list of strings representing device names (e.g. 0000:14:00.1)
+ A list of devices.
"""
dev_info: str = self.send_command("show device info all")
dev_list: list[TestPmdDevice] = []
--
2.34.1
* [PATCH v9 13/21] dts: port and virtual device docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (11 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 12/21] dts: interactive " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 14/21] dts: cpu " Juraj Linkeš
` (8 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/__init__.py | 17 ++++--
dts/framework/testbed_model/port.py | 53 +++++++++++++++----
dts/framework/testbed_model/virtual_device.py | 17 +++++-
3 files changed, 72 insertions(+), 15 deletions(-)
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 8ced05653b..6086512ca2 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -2,9 +2,20 @@
# Copyright(c) 2022-2023 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
-This package contains the classes used to model the physical traffic generator,
-system under test and any other components that need to be interacted with.
+"""Testbed modelling.
+
+This package defines the testbed elements DTS works with:
+
+ * A system under test node: :class:`~.sut_node.SutNode`,
+ * A traffic generator node: :class:`~.tg_node.TGNode`,
+ * The ports of network interface cards (NICs) present on nodes: :class:`~.port.Port`,
+ * The logical cores of CPUs present on nodes: :class:`~.cpu.LogicalCore`,
+ * The virtual devices that can be created on nodes: :class:`~.virtual_device.VirtualDevice`,
+ * The operating systems running on nodes: :class:`~.linux_session.LinuxSession`
+ and :class:`~.posix_session.PosixSession`.
+
+DTS needs to be able to connect to nodes and understand some of the hardware present on these nodes
+to properly build and test DPDK.
"""
# pylama:ignore=W0611
diff --git a/dts/framework/testbed_model/port.py b/dts/framework/testbed_model/port.py
index 680c29bfe3..817405bea4 100644
--- a/dts/framework/testbed_model/port.py
+++ b/dts/framework/testbed_model/port.py
@@ -2,6 +2,13 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""NIC port model.
+
+Basic port information, such as location (the ports are identified by their PCI addresses on a node),
+drivers and addresses.
+"""
+
+
from dataclasses import dataclass
from framework.config import PortConfig
@@ -9,24 +16,35 @@
@dataclass(slots=True, frozen=True)
class PortIdentifier:
+ """The port identifier.
+
+ Attributes:
+ node: The node where the port resides.
+ pci: The PCI address of the port on `node`.
+ """
+
node: str
pci: str
@dataclass(slots=True)
class Port:
- """
- identifier: The PCI address of the port on a node.
-
- os_driver: The driver used by this port when the OS is controlling it.
- Example: i40e
- os_driver_for_dpdk: The driver the device must be bound to for DPDK to use it,
- Example: vfio-pci.
+ """Physical port on a node.
- Note: os_driver and os_driver_for_dpdk may be the same thing.
- Example: mlx5_core
+ The ports are identified by the node they're on and their PCI addresses. The port on the other
+ side of the connection is also captured here.
+ Each port is serviced by a driver, which may be different for the operating system (`os_driver`)
+ and for DPDK (`os_driver_for_dpdk`). For some devices, they are the same, e.g.: ``mlx5_core``.
- peer: The identifier of a port this port is connected with.
+ Attributes:
+ identifier: The PCI address of the port on a node.
+ os_driver: The operating system driver name when the operating system controls the port,
+ e.g.: ``i40e``.
+ os_driver_for_dpdk: The operating system driver name for use with DPDK, e.g.: ``vfio-pci``.
+ peer: The identifier of a port this port is connected with.
+ The `peer` is on a different node.
+ mac_address: The MAC address of the port.
+ logical_name: The logical name of the port. Must be discovered.
"""
identifier: PortIdentifier
@@ -37,6 +55,12 @@ class Port:
logical_name: str = ""
def __init__(self, node_name: str, config: PortConfig):
+ """Initialize the port from `node_name` and `config`.
+
+ Args:
+ node_name: The name of the port's node.
+ config: The test run configuration of the port.
+ """
self.identifier = PortIdentifier(
node=node_name,
pci=config.pci,
@@ -47,14 +71,23 @@ def __init__(self, node_name: str, config: PortConfig):
@property
def node(self) -> str:
+ """The node where the port resides."""
return self.identifier.node
@property
def pci(self) -> str:
+ """The PCI address of the port."""
return self.identifier.pci
@dataclass(slots=True, frozen=True)
class PortLink:
+ """The physical, cabled connection between the ports.
+
+ Attributes:
+ sut_port: The port on the SUT node connected to `tg_port`.
+ tg_port: The port on the TG node connected to `sut_port`.
+ """
+
sut_port: Port
tg_port: Port
diff --git a/dts/framework/testbed_model/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
index eb664d9f17..e9b5e9c3be 100644
--- a/dts/framework/testbed_model/virtual_device.py
+++ b/dts/framework/testbed_model/virtual_device.py
@@ -1,16 +1,29 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""Virtual devices model.
+
+Alongside support for physical hardware, DPDK can create various virtual devices.
+"""
+
class VirtualDevice(object):
- """
- Base class for virtual devices used by DPDK.
+ """Base class for virtual devices used by DPDK.
+
+ Attributes:
+ name: The name of the virtual device.
"""
name: str
def __init__(self, name: str):
+ """Initialize the virtual device.
+
+ Args:
+ name: The name of the virtual device.
+ """
self.name = name
def __str__(self) -> str:
+ """This corresponds to the name used for DPDK devices."""
return self.name
--
2.34.1
* [PATCH v9 14/21] dts: cpu docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (12 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 13/21] dts: port and virtual device " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 15/21] dts: os session " Juraj Linkeš
` (7 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/cpu.py | 196 +++++++++++++++++++++--------
1 file changed, 144 insertions(+), 52 deletions(-)
diff --git a/dts/framework/testbed_model/cpu.py b/dts/framework/testbed_model/cpu.py
index 1b392689f5..9e33b2825d 100644
--- a/dts/framework/testbed_model/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -1,6 +1,22 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""CPU core representation and filtering.
+
+This module provides a unified representation of logical CPU cores along
+with filtering capabilities.
+
+When symmetric multiprocessing (SMP or multithreading) is enabled on a server,
+the physical CPU cores are split into logical CPU cores with different IDs.
+
+:class:`LogicalCoreCountFilter` filters by the number of logical cores. It's possible to specify
+the socket from which to filter the number of logical cores. It's also possible to not use all
+logical CPU cores from each physical core (e.g. only the first logical core of each physical core).
+
+:class:`LogicalCoreListFilter` filters by logical core IDs. This mostly checks that
+the logical cores are actually present on the server.
+"""
+
import dataclasses
from abc import ABC, abstractmethod
from collections.abc import Iterable, ValuesView
@@ -11,9 +27,17 @@
@dataclass(slots=True, frozen=True)
class LogicalCore(object):
- """
- Representation of a CPU core. A physical core is represented in OS
- by multiple logical cores (lcores) if CPU multithreading is enabled.
+ """Representation of a logical CPU core.
+
+ A physical core is represented in OS by multiple logical cores (lcores)
+ if CPU multithreading is enabled. When multithreading is disabled, their IDs are the same.
+
+ Attributes:
+ lcore: The logical core ID of a CPU core. It's the same as `core` with
+ disabled multithreading.
+ core: The physical core ID of a CPU core.
+ socket: The physical socket ID where the CPU resides.
+ node: The NUMA node ID where the CPU resides.
"""
lcore: int
@@ -22,27 +46,36 @@ class LogicalCore(object):
node: int
def __int__(self) -> int:
+ """The CPU is best represented by the logical core, as that's what we configure in EAL."""
return self.lcore
class LogicalCoreList(object):
- """
- Convert these options into a list of logical core ids.
- lcore_list=[LogicalCore1, LogicalCore2] - a list of LogicalCores
- lcore_list=[0,1,2,3] - a list of int indices
- lcore_list=['0','1','2-3'] - a list of str indices; ranges are supported
- lcore_list='0,1,2-3' - a comma delimited str of indices; ranges are supported
-
- The class creates a unified format used across the framework and allows
- the user to use either a str representation (using str(instance) or directly
- in f-strings) or a list representation (by accessing instance.lcore_list).
- Empty lcore_list is allowed.
+ r"""A unified way to store :class:`LogicalCore`\s.
+
+ Create a unified format used across the framework and allow the user to use
+ either a :class:`str` representation (using ``str(instance)`` or directly in f-strings)
+ or a :class:`list` representation (by accessing the `lcore_list` property,
+ which stores logical core IDs).
"""
_lcore_list: list[int]
_lcore_str: str
def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
+ """Process `lcore_list`, then sort.
+
+ There are four supported logical core list formats::
+
+ lcore_list=[LogicalCore1, LogicalCore2] # a list of LogicalCores
+ lcore_list=[0,1,2,3] # a list of int indices
+ lcore_list=['0','1','2-3'] # a list of str indices; ranges are supported
+ lcore_list='0,1,2-3' # a comma delimited str of indices; ranges are supported
+
+ Args:
+ lcore_list: Various ways to represent multiple logical cores.
+ Empty `lcore_list` is allowed.
+ """
self._lcore_list = []
if isinstance(lcore_list, str):
lcore_list = lcore_list.split(",")
@@ -58,6 +91,7 @@ def __init__(self, lcore_list: list[int] | list[str] | list[LogicalCore] | str):
@property
def lcore_list(self) -> list[int]:
+ """The logical core IDs."""
return self._lcore_list
def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
@@ -83,28 +117,30 @@ def _get_consecutive_lcores_range(self, lcore_ids_list: list[int]) -> list[str]:
return formatted_core_list
def __str__(self) -> str:
+ """The consecutive ranges of logical core IDs."""
return self._lcore_str
@dataclasses.dataclass(slots=True, frozen=True)
class LogicalCoreCount(object):
- """
- Define the number of logical cores to use.
- If sockets is not None, socket_count is ignored.
- """
+ """Define the number of logical cores per physical cores per sockets."""
+ #: Use this many logical cores per each physical core.
lcores_per_core: int = 1
+ #: Use this many physical cores per each socket.
cores_per_socket: int = 2
+ #: Use this many sockets.
socket_count: int = 1
+ #: Use exactly these sockets. This takes precedence over `socket_count`,
+ #: so when `sockets` is not :data:`None`, `socket_count` is ignored.
sockets: list[int] | None = None
class LogicalCoreFilter(ABC):
- """
- Filter according to the input filter specifier. Each filter needs to be
- implemented in a derived class.
- This class only implements operations common to all filters, such as sorting
- the list to be filtered beforehand.
+ """Common filtering class.
+
+ Each filter needs to be implemented in a subclass. This base class sorts the list of cores
+ and defines the filtering method, which must be implemented by subclasses.
"""
_filter_specifier: LogicalCoreCount | LogicalCoreList
@@ -116,6 +152,17 @@ def __init__(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
):
+ """Filter according to the input filter specifier.
+
+ The input `lcore_list` is copied and sorted by physical core before filtering.
+ The list is copied so that the original is left intact.
+
+ Args:
+ lcore_list: The logical CPU cores to filter.
+ filter_specifier: Filter cores from `lcore_list` according to this filter.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+ """
self._filter_specifier = filter_specifier
# sorting by core is needed in case hyperthreading is enabled
@@ -124,31 +171,45 @@ def __init__(
@abstractmethod
def filter(self) -> list[LogicalCore]:
- """
- Use self._filter_specifier to filter self._lcores_to_filter
- and return the list of filtered LogicalCores.
- self._lcores_to_filter is a sorted copy of the original list,
- so it may be modified.
+ r"""Filter the cores.
+
+ Use `self._filter_specifier` to filter `self._lcores_to_filter` and return
+ the filtered :class:`LogicalCore`\s.
+ `self._lcores_to_filter` is a sorted copy of the original list, so it may be modified.
+
+ Returns:
+ The filtered cores.
"""
class LogicalCoreCountFilter(LogicalCoreFilter):
- """
+ """Filter cores by specified counts.
+
Filter the input list of LogicalCores according to specified rules:
- Use cores from the specified number of sockets or from the specified socket ids.
- If sockets is specified, it takes precedence over socket_count.
- From each of those sockets, use only cores_per_socket of cores.
- And for each core, use lcores_per_core of logical cores. Hypertheading
- must be enabled for this to take effect.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+
+ * The input `filter_specifier` is :class:`LogicalCoreCount`,
+ * Use cores from the specified number of sockets or from the specified socket ids,
+ * If `sockets` is specified, it takes precedence over `socket_count`,
+ * From each of those sockets, use only `cores_per_socket` of cores,
+ * And for each core, use `lcores_per_core` of logical cores. Hyperthreading
+ must be enabled for this to take effect.
"""
_filter_specifier: LogicalCoreCount
def filter(self) -> list[LogicalCore]:
+ """Filter the cores according to :class:`LogicalCoreCount`.
+
+ Start by filtering the allowed sockets. The cores matching the allowed sockets are returned.
+ The cores of each socket are stored in separate lists.
+
+ Then filter the allowed physical cores from those lists of cores per socket. When filtering
+ physical cores, store the desired number of logical cores per physical core which then
+ together constitute the final filtered list.
+
+ Returns:
+ The filtered cores.
+ """
sockets_to_filter = self._filter_sockets(self._lcores_to_filter)
filtered_lcores = []
for socket_to_filter in sockets_to_filter:
@@ -158,24 +219,37 @@ def filter(self) -> list[LogicalCore]:
def _filter_sockets(
self, lcores_to_filter: Iterable[LogicalCore]
) -> ValuesView[list[LogicalCore]]:
- """
- Remove all lcores that don't match the specified socket(s).
- If self._filter_specifier.sockets is not None, keep lcores from those sockets,
- otherwise keep lcores from the first
- self._filter_specifier.socket_count sockets.
+ """Filter a list of cores per each allowed socket.
+
+ The sockets may be specified in two ways, either a number or a specific list of sockets.
+ In case of a specific list, we just need to return the cores from those sockets.
+ If filtering by a number of sockets, we need to go through all cores and note which sockets
+ appear and only filter from the first n that appear.
+
+ Args:
+ lcores_to_filter: The cores to filter. These must be sorted by the physical core.
+
+ Returns:
+ A list of lists of logical CPU cores. Each list contains cores from one socket.
"""
allowed_sockets: set[int] = set()
socket_count = self._filter_specifier.socket_count
if self._filter_specifier.sockets:
+ # when sockets in filter is specified, the sockets are already set
socket_count = len(self._filter_specifier.sockets)
allowed_sockets = set(self._filter_specifier.sockets)
+ # filter socket_count sockets from all sockets by checking the socket of each CPU
filtered_lcores: dict[int, list[LogicalCore]] = {}
for lcore in lcores_to_filter:
if not self._filter_specifier.sockets:
+ # this is when sockets is not set, so we do the actual filtering
+ # when it is set, allowed_sockets is already defined and can't be changed
if len(allowed_sockets) < socket_count:
+ # allowed_sockets is a set, so adding an existing socket won't re-add it
allowed_sockets.add(lcore.socket)
if lcore.socket in allowed_sockets:
+ # separate lcores into sockets; this makes it easier in further processing
if lcore.socket in filtered_lcores:
filtered_lcores[lcore.socket].append(lcore)
else:
@@ -192,12 +266,13 @@ def _filter_sockets(
def _filter_cores_from_socket(
self, lcores_to_filter: Iterable[LogicalCore]
) -> list[LogicalCore]:
- """
- Keep only the first self._filter_specifier.cores_per_socket cores.
- In multithreaded environments, keep only
- the first self._filter_specifier.lcores_per_core lcores of those cores.
- """
+ """Filter a list of cores from the given socket.
+
+ Go through the cores and note how many logical cores per physical core have been filtered.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
# no need to use ordered dict, from Python3.7 the dict
# insertion order is preserved (LIFO).
lcore_count_per_core_map: dict[int, int] = {}
@@ -238,15 +313,21 @@ def _filter_cores_from_socket(
class LogicalCoreListFilter(LogicalCoreFilter):
- """
- Filter the input list of Logical Cores according to the input list of
- lcore indices.
- An empty LogicalCoreList won't filter anything.
+ """Filter the logical CPU cores by logical CPU core IDs.
+
+ This is a simple filter that looks at logical CPU IDs and only keeps those that match.
+
+ The input filter is :class:`LogicalCoreList`. An empty list won't filter anything.
"""
_filter_specifier: LogicalCoreList
def filter(self) -> list[LogicalCore]:
+ """Filter based on logical CPU core ID.
+
+ Returns:
+ The filtered logical CPU cores.
+ """
if not len(self._filter_specifier.lcore_list):
return self._lcores_to_filter
@@ -269,6 +350,17 @@ def lcore_filter(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool,
) -> LogicalCoreFilter:
+ """Factory for providing the filter that corresponds to `filter_specifier`.
+
+ Args:
+ core_list: The logical CPU cores to filter.
+ filter_specifier: The filter to use.
+ ascending: Sort cores in ascending order (lowest to highest IDs). If :data:`False`,
+ sort in descending order.
+
+ Returns:
+ The filter that corresponds to `filter_specifier`.
+ """
if isinstance(filter_specifier, LogicalCoreList):
return LogicalCoreListFilter(core_list, filter_specifier, ascending)
elif isinstance(filter_specifier, LogicalCoreCount):
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 15/21] dts: os session docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (13 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 14/21] dts: cpu " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 16/21] dts: posix and linux sessions " Juraj Linkeš
` (6 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/os_session.py | 272 ++++++++++++++++------
1 file changed, 205 insertions(+), 67 deletions(-)
diff --git a/dts/framework/testbed_model/os_session.py b/dts/framework/testbed_model/os_session.py
index 76e595a518..ac6bb5e112 100644
--- a/dts/framework/testbed_model/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -2,6 +2,26 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""OS-aware remote session.
+
+DPDK supports multiple different operating systems, meaning it can run on these different operating
+systems. This module defines the common API that OS-unaware layers use and translates the API into
+OS-aware calls/utility usage.
+
+Note:
+ Running commands with administrative privileges requires OS awareness. This is the only layer
+ that's aware of OS differences, so this is where non-privileged commands get converted
+ to privileged commands.
+
+Example:
+ A user wishes to remove a directory on a remote :class:`~.sut_node.SutNode`.
+ The :class:`~.sut_node.SutNode` object isn't aware what OS the node is running - it delegates
+ the OS translation logic to :attr:`~.node.Node.main_session`. The SUT node calls
+ :meth:`~OSSession.remove_remote_dir` with a generic, OS-unaware path and
+ the :attr:`~.node.Node.main_session` translates that to ``rm -rf`` if the node's OS is Linux
+ and other commands for other OSs. It also translates the path to match the underlying OS.
+"""
+
from abc import ABC, abstractmethod
from collections.abc import Iterable
from ipaddress import IPv4Interface, IPv6Interface
@@ -28,10 +48,16 @@
class OSSession(ABC):
- """
- The OS classes create a DTS node remote session and implement OS specific
+ """OS-unaware to OS-aware translation API definition.
+
+ The OSSession classes create a remote session to a DTS node and implement OS specific
behavior. There are a few control methods implemented by the base class, the rest need
- to be implemented by derived classes.
+ to be implemented by subclasses.
+
+ Attributes:
+ name: The name of the session.
+ remote_session: The remote session maintaining the connection to the node.
+ interactive_session: The interactive remote session maintaining the connection to the node.
"""
_config: NodeConfiguration
@@ -46,6 +72,15 @@ def __init__(
name: str,
logger: DTSLOG,
):
+ """Initialize the OS-aware session.
+
+ Connect to the node right away and also create an interactive remote session.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
self._config = node_config
self.name = name
self._logger = logger
@@ -53,15 +88,15 @@ def __init__(
self.interactive_session = create_interactive_session(node_config, logger)
def close(self, force: bool = False) -> None:
- """
- Close the remote session.
+ """Close the underlying remote session.
+
+ Args:
+ force: Force the closure of the connection.
"""
self.remote_session.close(force)
def is_alive(self) -> bool:
- """
- Check whether the remote session is still responding.
- """
+ """Check whether the underlying remote session is still responding."""
return self.remote_session.is_alive()
def send_command(
@@ -72,10 +107,23 @@ def send_command(
verify: bool = False,
env: dict | None = None,
) -> CommandResult:
- """
- An all-purpose API in case the command to be executed is already
- OS-agnostic, such as when the path to the executed command has been
- constructed beforehand.
+ """An all-purpose API for OS-agnostic commands.
+
+ This can be used to execute a portable command that runs the same way
+ on all operating systems, such as Python.
+
+ The :option:`--timeout` command line argument and the :envvar:`DTS_TIMEOUT`
+ environment variable configure the timeout of command execution.
+
+ Args:
+ command: The command to execute.
+ timeout: Wait at most this long in seconds for `command` execution to complete.
+ privileged: Whether to run the command with administrative privileges.
+ verify: If :data:`True`, will check the exit code of the command.
+ env: A dictionary with environment variables to be used with the command execution.
+
+ Raises:
+ RemoteCommandExecutionError: If `verify` is :data:`True` and the command failed.
"""
if privileged:
command = self._get_privileged_command(command)
@@ -89,8 +137,20 @@ def create_interactive_shell(
privileged: bool,
app_args: str,
) -> InteractiveShellType:
- """
- See "create_interactive_shell" in SutNode
+ """Factory for interactive session handlers.
+
+ Instantiate `shell_cls` according to the remote OS specifics.
+
+ Args:
+ shell_cls: The class of the shell.
+ timeout: Timeout for reading output from the SSH channel. If no data is
+ received from the buffer within the timeout,
+ an exception is raised.
+ privileged: Whether to run the shell with administrative privileges.
+ app_args: The arguments to be passed to the application.
+
+ Returns:
+ An instance of the desired interactive application shell.
"""
return shell_cls(
self.interactive_session.session,
@@ -114,27 +174,42 @@ def _get_privileged_command(command: str) -> str:
@abstractmethod
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
- """
- Try to find DPDK remote dir in remote_dir.
+ """Try to find DPDK directory in `remote_dir`.
+
+ The directory is the one which is created after the extraction of the tarball. The files
+ are usually extracted into a directory starting with ``dpdk-``.
+
+ Returns:
+ The absolute path of the DPDK remote directory, empty path if not found.
"""
@abstractmethod
def get_remote_tmp_dir(self) -> PurePath:
- """
- Get the path of the temporary directory of the remote OS.
+ """Get the path of the temporary directory of the remote OS.
+
+ Returns:
+ The absolute path of the temporary directory.
"""
@abstractmethod
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for the target architecture. Get
- information from the node if needed.
+ """Create extra environment variables needed for the target architecture.
+
+ Different architectures may require different configuration, such as setting 32-bit CFLAGS.
+
+ Returns:
+ A dictionary mapping environment variable names to their values.
"""
@abstractmethod
def join_remote_path(self, *args: str | PurePath) -> PurePath:
- """
- Join path parts using the path separator that fits the remote OS.
+ """Join path parts using the path separator that fits the remote OS.
+
+ Args:
+ args: Any number of paths to join.
+
+ Returns:
+ The resulting joined path.
"""
@abstractmethod
@@ -143,13 +218,13 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from the remote Node to the local filesystem.
+ """Copy a file from the remote node to the local filesystem.
- Copy source_file from the remote Node associated with this remote
- session to destination_file on the local filesystem.
+ Copy `source_file` from the remote node associated with this remote
+ session to `destination_file` on the local filesystem.
Args:
- source_file: the file on the remote Node.
+ source_file: the file on the remote node.
destination_file: a file or directory path on the local filesystem.
"""
@@ -159,14 +234,14 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
- """Copy a file from local filesystem to the remote Node.
+ """Copy a file from local filesystem to the remote node.
- Copy source_file from local filesystem to destination_file
- on the remote Node associated with this remote session.
+ Copy `source_file` from local filesystem to `destination_file`
+ on the remote node associated with this remote session.
Args:
source_file: the file on the local filesystem.
- destination_file: a file or directory path on the remote Node.
+ destination_file: a file or directory path on the remote node.
"""
@abstractmethod
@@ -176,8 +251,12 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
- """
- Remove remote directory, by default remove recursively and forcefully.
+ """Remove remote directory, by default remove recursively and forcefully.
+
+ Args:
+ remote_dir_path: The path of the directory to remove.
+ recursive: If :data:`True`, also remove all contents inside the directory.
+ force: If :data:`True`, ignore all warnings and try to remove at all costs.
"""
@abstractmethod
@@ -186,9 +265,12 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
- """
- Extract remote tarball in place. If expected_dir is a non-empty string, check
- whether the dir exists after extracting the archive.
+ """Extract remote tarball in its remote directory.
+
+ Args:
+ remote_tarball_path: The path of the tarball on the remote node.
+ expected_dir: If non-empty, check whether `expected_dir` exists after extracting
+ the archive.
"""
@abstractmethod
@@ -201,69 +283,119 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
- """
- Build DPDK in the input dir with specified environment variables and meson
- arguments.
+ """Build DPDK on the remote node.
+
+ An extracted DPDK tarball must be present on the node. The build consists of two steps::
+
+ meson setup <meson args> remote_dpdk_dir remote_dpdk_build_dir
+ ninja -C remote_dpdk_build_dir
+
+ The :option:`--compile-timeout` command line argument and the :envvar:`DTS_COMPILE_TIMEOUT`
+ environment variable configure the timeout of DPDK build.
+
+ Args:
+ env_vars: Use these environment variables when building DPDK.
+ meson_args: Use these meson arguments when building DPDK.
+ remote_dpdk_dir: The directory on the remote node where DPDK will be built.
+ remote_dpdk_build_dir: The target build directory on the remote node.
+ rebuild: If :data:`True`, do a subsequent build with ``meson configure`` instead
+ of ``meson setup``.
+ timeout: Wait at most this long in seconds for the build execution to complete.
"""
@abstractmethod
def get_dpdk_version(self, version_path: str | PurePath) -> str:
- """
- Inspect DPDK version on the remote node from version_path.
+ """Inspect the DPDK version on the remote node.
+
+ Args:
+ version_path: The path to the VERSION file containing the DPDK version.
+
+ Returns:
+ The DPDK version.
"""
@abstractmethod
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
- """
- Compose a list of LogicalCores present on the remote node.
- If use_first_core is False, the first physical core won't be used.
+ r"""Get the list of :class:`~.cpu.LogicalCore`\s on the remote node.
+
+ Args:
+ use_first_core: If :data:`False`, the first physical core won't be used.
+
+ Returns:
+ The logical cores present on the node.
"""
@abstractmethod
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
- """
- Kill and cleanup all DPDK apps identified by dpdk_prefix_list. If
- dpdk_prefix_list is empty, attempt to find running DPDK apps to kill and clean.
+ """Kill and cleanup all DPDK apps.
+
+ Args:
+ dpdk_prefix_list: Kill all apps identified by `dpdk_prefix_list`.
+ If `dpdk_prefix_list` is empty, attempt to find running DPDK apps to kill and clean.
"""
@abstractmethod
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
- """
- Get the DPDK file prefix that will be used when running DPDK apps.
+ """Make OS-specific modification to the DPDK file prefix.
+
+ Args:
+ dpdk_prefix: The OS-unaware file prefix.
+
+ Returns:
+ The OS-specific file prefix.
"""
@abstractmethod
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
- """
- Get the node's Hugepage Size, configure the specified amount of hugepages
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Configure hugepages on the node.
+
+ Get the node's Hugepage Size, configure the specified count of hugepages
if needed and mount the hugepages if needed.
- If force_first_numa is True, configure hugepages just on the first socket.
+
+ Args:
+ hugepage_count: Configure this many hugepages.
+ force_first_numa: If :data:`True`, configure hugepages just on the first NUMA node.
"""
@abstractmethod
def get_compiler_version(self, compiler_name: str) -> str:
- """
- Get installed version of compiler used for DPDK
+ """Get installed version of compiler used for DPDK.
+
+ Args:
+ compiler_name: The name of the compiler executable.
+
+ Returns:
+ The compiler's version.
"""
@abstractmethod
def get_node_info(self) -> NodeInfo:
- """
- Collect information about the node
+ """Collect additional information about the node.
+
+ Returns:
+ Node information.
"""
@abstractmethod
def update_ports(self, ports: list[Port]) -> None:
- """
- Get additional information about ports:
- Logical name (e.g. enp7s0) if applicable
- Mac address
+ """Get additional information about ports from the operating system and update them.
+
+ The additional information is:
+
+ * Logical name (e.g. ``enp7s0``) if applicable,
+ * Mac address.
+
+ Args:
+ ports: The ports to update.
"""
@abstractmethod
def configure_port_state(self, port: Port, enable: bool) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port` in the operating system.
+
+ Args:
+ port: The port to configure.
+ enable: If :data:`True`, enable the port, otherwise shut it down.
"""
@abstractmethod
@@ -273,12 +405,18 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
- """
- Configure (add or delete) an IP address of the input port.
+ """Configure an IP address on `port` in the operating system.
+
+ Args:
+ address: The address to configure.
+ port: The port to configure.
+ delete: If :data:`True`, remove the IP address, otherwise configure it.
"""
@abstractmethod
def configure_ipv4_forwarding(self, enable: bool) -> None:
- """
- Enable IPv4 forwarding in the underlying OS.
+ """Enable IPv4 forwarding in the operating system.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
"""
--
2.34.1
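The delegation pattern described in the module docstring above (an OS-unaware node handing translation to its `main_session`, which turns a generic directory removal into ``rm -rf`` on Linux) can be sketched in a few lines. Class names here are hypothetical stand-ins for the framework's `Node`/`OSSession`/`LinuxSession`, and the sketch returns the translated command instead of executing it remotely:

```python
from abc import ABC, abstractmethod


class OSSessionSketch(ABC):
    """OS-unaware API; subclasses translate calls into OS-specific commands."""

    @abstractmethod
    def remove_remote_dir(self, remote_dir_path: str) -> str:
        """Return the OS-specific command that removes the directory."""


class LinuxSessionSketch(OSSessionSketch):
    def remove_remote_dir(self, remote_dir_path: str) -> str:
        # on Linux, a recursive forced removal translates to rm -rf
        return f"rm -rf {remote_dir_path}"


class NodeSketch:
    """A node is OS-unaware; it delegates translation to its main session."""

    def __init__(self, main_session: OSSessionSketch):
        self.main_session = main_session

    def remove_dir(self, path: str) -> str:
        return self.main_session.remove_remote_dir(path)


node = NodeSketch(LinuxSessionSketch())
print(node.remove_dir("/tmp/dpdk"))  # rm -rf /tmp/dpdk
```

Supporting another OS then means adding one more `OSSessionSketch` subclass; no OS-unaware layer changes.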
* [PATCH v9 16/21] dts: posix and linux sessions docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (14 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 15/21] dts: os session " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 17/21] dts: node " Juraj Linkeš
` (5 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/linux_session.py | 64 +++++++++++-----
dts/framework/testbed_model/posix_session.py | 81 +++++++++++++++++---
2 files changed, 114 insertions(+), 31 deletions(-)
diff --git a/dts/framework/testbed_model/linux_session.py b/dts/framework/testbed_model/linux_session.py
index 055765ba2d..0ab59cef85 100644
--- a/dts/framework/testbed_model/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -2,6 +2,13 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""Linux OS translator.
+
+Translate OS-unaware calls into Linux calls/utilities. Most Linux distributions are largely
+compliant with POSIX standards, so this module only implements the parts that aren't;
+the common POSIX-compliant parts are implemented in the intermediate POSIX module.
+"""
+
import json
from ipaddress import IPv4Interface, IPv6Interface
from typing import TypedDict, Union
@@ -17,43 +24,52 @@
class LshwConfigurationOutput(TypedDict):
+ """The relevant parts of ``lshw``'s ``configuration`` section."""
+
+ #:
link: str
class LshwOutput(TypedDict):
- """
- A model of the relevant information from json lshw output, e.g.:
- {
- ...
- "businfo" : "pci@0000:08:00.0",
- "logicalname" : "enp8s0",
- "version" : "00",
- "serial" : "52:54:00:59:e1:ac",
- ...
- "configuration" : {
- ...
- "link" : "yes",
- ...
- },
- ...
+ """A model of the relevant information from ``lshw``'s json output.
+
+ Example:
+ ::
+
+ {
+ ...
+ "businfo" : "pci@0000:08:00.0",
+ "logicalname" : "enp8s0",
+ "version" : "00",
+ "serial" : "52:54:00:59:e1:ac",
+ ...
+ "configuration" : {
+ ...
+ "link" : "yes",
+ ...
+ },
+ ...
"""
+ #:
businfo: str
+ #:
logicalname: NotRequired[str]
+ #:
serial: NotRequired[str]
+ #:
configuration: LshwConfigurationOutput
class LinuxSession(PosixSession):
- """
- The implementation of non-Posix compliant parts of Linux remote sessions.
- """
+ """The implementation of non-Posix compliant parts of Linux."""
@staticmethod
def _get_privileged_command(command: str) -> str:
return f"sudo -- sh -c '{command}'"
def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_cpus`."""
cpu_info = self.send_command("lscpu -p=CPU,CORE,SOCKET,NODE|grep -v \\#").stdout
lcores = []
for cpu_line in cpu_info.splitlines():
@@ -65,18 +81,20 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
return lcores
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return dpdk_prefix
- def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
+ def setup_hugepages(self, hugepage_count: int, force_first_numa: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.setup_hugepages`."""
self._logger.info("Getting Hugepage information.")
hugepage_size = self._get_hugepage_size()
hugepages_total = self._get_hugepages_total()
self._numa_nodes = self._get_numa_nodes()
- if force_first_numa or hugepages_total != hugepage_amount:
+ if force_first_numa or hugepages_total != hugepage_count:
# when forcing numa, we need to clear existing hugepages regardless
# of size, so they can be moved to the first numa node
- self._configure_huge_pages(hugepage_amount, hugepage_size, force_first_numa)
+ self._configure_huge_pages(hugepage_count, hugepage_size, force_first_numa)
else:
self._logger.info("Hugepages already configured.")
self._mount_huge_pages()
@@ -132,6 +150,7 @@ def _configure_huge_pages(self, amount: int, size: int, force_first_numa: bool)
self.send_command(f"echo {amount} | tee {hugepage_config_path}", privileged=True)
def update_ports(self, ports: list[Port]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.update_ports`."""
self._logger.debug("Gathering port info.")
for port in ports:
assert port.node == self.name, "Attempted to gather port info on the wrong node"
@@ -161,6 +180,7 @@ def _update_port_attr(self, port: Port, attr_value: str | None, attr_name: str)
)
def configure_port_state(self, port: Port, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_state`."""
state = "up" if enable else "down"
self.send_command(f"ip link set dev {port.logical_name} {state}", privileged=True)
@@ -170,6 +190,7 @@ def configure_port_ip_address(
port: Port,
delete: bool,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_port_ip_address`."""
command = "del" if delete else "add"
self.send_command(
f"ip address {command} {address} dev {port.logical_name}",
@@ -178,5 +199,6 @@ def configure_port_ip_address(
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Overrides :meth:`~.os_session.OSSession.configure_ipv4_forwarding`."""
state = 1 if enable else 0
self.send_command(f"sysctl -w net.ipv4.ip_forward={state}", privileged=True)
diff --git a/dts/framework/testbed_model/posix_session.py b/dts/framework/testbed_model/posix_session.py
index 5657cc0bc9..d279bb8b53 100644
--- a/dts/framework/testbed_model/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -2,6 +2,15 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""POSIX compliant OS translator.
+
+Translates OS-unaware calls into POSIX compliant calls/utilities. POSIX is a set of standards
+for portability between Unix operating systems which not all Linux distributions
+(or the tools most frequently bundled with said distributions) adhere to. Most of Linux
+distributions are mostly compliant though.
+This intermediate module implements the common parts of mostly POSIX compliant distributions.
+"""
+
import re
from collections.abc import Iterable
from pathlib import PurePath, PurePosixPath
@@ -15,13 +24,21 @@
class PosixSession(OSSession):
- """
- An intermediary class implementing the Posix compliant parts of
- Linux and other OS remote sessions.
- """
+ """An intermediary class implementing the POSIX standard."""
@staticmethod
def combine_short_options(**opts: bool) -> str:
+ """Combine shell options into one argument.
+
+ These are options such as ``-x``, ``-v``, ``-f`` which are combined into ``-xvf``.
+
+ Args:
+ opts: The keys are option names (usually one letter) and the bool values indicate
+ whether to include the option in the resulting argument.
+
+ Returns:
+ The options combined into one argument.
+ """
ret_opts = ""
for opt, include in opts.items():
if include:
@@ -33,17 +50,19 @@ def combine_short_options(**opts: bool) -> str:
return ret_opts
def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.guess_dpdk_remote_dir`."""
remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
result = self.send_command(f"ls -d {remote_guess} | tail -1")
return PurePosixPath(result.stdout)
def get_remote_tmp_dir(self) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.get_remote_tmp_dir`."""
return PurePosixPath("/tmp")
def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
- """
- Create extra environment variables needed for i686 arch build. Get information
- from the node if needed.
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_build_env_vars`.
+
+ Supported architecture: ``i686``.
"""
env_vars = {}
if arch == Architecture.i686:
@@ -63,6 +82,7 @@ def get_dpdk_build_env_vars(self, arch: Architecture) -> dict:
return env_vars
def join_remote_path(self, *args: str | PurePath) -> PurePosixPath:
+ """Overrides :meth:`~.os_session.OSSession.join_remote_path`."""
return PurePosixPath(*args)
def copy_from(
@@ -70,6 +90,7 @@ def copy_from(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_from`."""
self.remote_session.copy_from(source_file, destination_file)
def copy_to(
@@ -77,6 +98,7 @@ def copy_to(
source_file: str | PurePath,
destination_file: str | PurePath,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.copy_to`."""
self.remote_session.copy_to(source_file, destination_file)
def remove_remote_dir(
@@ -85,6 +107,7 @@ def remove_remote_dir(
recursive: bool = True,
force: bool = True,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.remove_remote_dir`."""
opts = PosixSession.combine_short_options(r=recursive, f=force)
self.send_command(f"rm{opts} {remote_dir_path}")
@@ -93,6 +116,7 @@ def extract_remote_tarball(
remote_tarball_path: str | PurePath,
expected_dir: str | PurePath | None = None,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.extract_remote_tarball`."""
self.send_command(
f"tar xfm {remote_tarball_path} -C {PurePosixPath(remote_tarball_path).parent}",
60,
@@ -109,6 +133,7 @@ def build_dpdk(
rebuild: bool = False,
timeout: float = SETTINGS.compile_timeout,
) -> None:
+ """Overrides :meth:`~.os_session.OSSession.build_dpdk`."""
try:
if rebuild:
# reconfigure, then build
@@ -138,10 +163,12 @@ def build_dpdk(
raise DPDKBuildError(f"DPDK build failed when doing '{e.command}'.")
def get_dpdk_version(self, build_dir: str | PurePath) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_version`."""
out = self.send_command(f"cat {self.join_remote_path(build_dir, 'VERSION')}", verify=True)
return out.stdout
def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
+ """Overrides :meth:`~.os_session.OSSession.kill_cleanup_dpdk_apps`."""
self._logger.info("Cleaning up DPDK apps.")
dpdk_runtime_dirs = self._get_dpdk_runtime_dirs(dpdk_prefix_list)
if dpdk_runtime_dirs:
@@ -153,6 +180,14 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
self._remove_dpdk_runtime_dirs(dpdk_runtime_dirs)
def _get_dpdk_runtime_dirs(self, dpdk_prefix_list: Iterable[str]) -> list[PurePosixPath]:
+ """Find runtime directories DPDK apps are currently using.
+
+ Args:
+ dpdk_prefix_list: The prefixes DPDK apps were started with.
+
+ Returns:
+ The paths of DPDK apps' runtime dirs.
+ """
prefix = PurePosixPath("/var", "run", "dpdk")
if not dpdk_prefix_list:
remote_prefixes = self._list_remote_dirs(prefix)
@@ -164,9 +199,13 @@ def _get_dpdk_runtime_dirs(self, dpdk_prefix_list: Iterable[str]) -> list[PurePo
return [PurePosixPath(prefix, dpdk_prefix) for dpdk_prefix in dpdk_prefix_list]
def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
- """
- Return a list of directories of the remote_dir.
- If remote_path doesn't exist, return None.
+ """List the directories in remote_path.
+
+ Args:
+ remote_path: List the contents of this path.
+
+ Returns:
+ The directories in remote_path. If remote_path doesn't exist, return None.
"""
out = self.send_command(f"ls -l {remote_path} | awk '/^d/ {{print $NF}}'").stdout
if "No such file or directory" in out:
@@ -175,6 +214,17 @@ def _list_remote_dirs(self, remote_path: str | PurePath) -> list[str] | None:
return out.splitlines()
def _get_dpdk_pids(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> list[int]:
+ """Find PIDs of running DPDK apps.
+
+ Look at each "config" file found in dpdk_runtime_dirs and find the PIDs of processes
+ that opened those files.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+
+ Returns:
+ The PIDs of running DPDK apps.
+ """
pids = []
pid_regex = r"p(\d+)"
for dpdk_runtime_dir in dpdk_runtime_dirs:
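The PID extraction above relies on the ``p(\d+)`` pattern, which matches tokens like ``p1234`` in the output of the tool that reports which processes opened each "config" file (an lsof ``-F``-style format is assumed here). A sketch of that step:

```python
import re

def extract_pids(tool_output: str) -> list[int]:
    """Pull PIDs out of tokens such as 'p1234'.

    Assumed output format; illustration only, not the DTS code.
    """
    return [int(match.group(1)) for match in re.finditer(r"p(\d+)", tool_output)]
```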
@@ -193,6 +243,14 @@ def _remote_files_exists(self, remote_path: PurePath) -> bool:
return not result.return_code
def _check_dpdk_hugepages(self, dpdk_runtime_dirs: Iterable[str | PurePath]) -> None:
+ """Check there aren't any leftover hugepages.
+
+ If any hugepages are found, emit a warning. Leftover hugepages are detected by inspecting
+ the "hugepage_info" file in each of dpdk_runtime_dirs.
+
+ Args:
+ dpdk_runtime_dirs: The paths of DPDK apps' runtime dirs.
+ """
for dpdk_runtime_dir in dpdk_runtime_dirs:
hugepage_info = PurePosixPath(dpdk_runtime_dir, "hugepage_info")
if self._remote_files_exists(hugepage_info):
@@ -208,9 +266,11 @@ def _remove_dpdk_runtime_dirs(self, dpdk_runtime_dirs: Iterable[str | PurePath])
self.remove_remote_dir(dpdk_runtime_dir)
def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_dpdk_file_prefix`."""
return ""
def get_compiler_version(self, compiler_name: str) -> str:
+ """Overrides :meth:`~.os_session.OSSession.get_compiler_version`."""
match compiler_name:
case "gcc":
return self.send_command(
@@ -228,6 +288,7 @@ def get_compiler_version(self, compiler_name: str) -> str:
raise ValueError(f"Unknown compiler {compiler_name}")
def get_node_info(self) -> NodeInfo:
+ """Overrides :meth:`~.os_session.OSSession.get_node_info`."""
os_release_info = self.send_command(
"awk -F= '$1 ~ /^NAME$|^VERSION$/ {print $2}' /etc/os-release",
SETTINGS.timeout,
--
2.34.1
* [PATCH v9 17/21] dts: node docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (15 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 16/21] dts: posix and linux sessions " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 18/21] dts: sut and tg nodes " Juraj Linkeš
` (4 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/node.py | 191 +++++++++++++++++++---------
1 file changed, 131 insertions(+), 60 deletions(-)
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index b313b5ad54..1a55fadf78 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
# Copyright(c) 2022-2023 University of New Hampshire
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+A node is any host/server DTS connects to.
+
+The base class, :class:`Node`, provides features common to all nodes and is supposed
+to be extended by subclasses with features specific to each node type.
+The :func:`~Node.skip_setup` decorator can be used without subclassing.
"""
from abc import ABC
@@ -35,10 +40,22 @@
class Node(ABC):
- """
- Basic class for node management. This class implements methods that
- manage a node, such as information gathering (of CPU/PCI/NIC) and
- environment setup.
+ """The base class for node management.
+
+ It shouldn't be instantiated, but rather subclassed.
+ It implements common methods to manage any node:
+
+ * Connection to the node,
+ * Hugepages setup.
+
+ Attributes:
+ main_session: The primary OS-aware remote session used to communicate with the node.
+ config: The node configuration.
+ name: The name of the node.
+ lcores: The list of logical cores that DTS can use on the node.
+ It's derived from logical cores present on the node and the test run configuration.
+ ports: The ports of this node specified in the test run configuration.
+ virtual_devices: The virtual devices used on the node.
"""
main_session: OSSession
@@ -52,6 +69,17 @@ class Node(ABC):
virtual_devices: list[VirtualDevice]
def __init__(self, node_config: NodeConfiguration):
+ """Connect to the node and gather info during initialization.
+
+ Extra gathered information:
+
+ * The list of available logical CPUs. This is then filtered by
+ the ``lcores`` configuration in the YAML test run configuration file,
+ * Information about ports from the YAML test run configuration file.
+
+ Args:
+ node_config: The node's test run configuration.
+ """
self.config = node_config
self.name = node_config.name
self._logger = getLogger(self.name)
@@ -60,7 +88,7 @@ def __init__(self, node_config: NodeConfiguration):
self._logger.info(f"Connected to node: {self.name}")
self._get_remote_cpus()
- # filter the node lcores according to user config
+ # filter the node lcores according to the test run configuration
self.lcores = LogicalCoreListFilter(
self.lcores, LogicalCoreList(self.config.lcores)
).filter()
@@ -76,9 +104,14 @@ def _init_ports(self) -> None:
self.configure_port_state(port)
def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- Perform the execution setup that will be done for each execution
- this node is part of.
+ """Execution setup steps.
+
+ Configure hugepages and call :meth:`_set_up_execution` where
+ the rest of the configuration steps (if any) are implemented.
+
+ Args:
+ execution_config: The execution test run configuration according to which
+ the setup steps will be taken.
"""
self._setup_hugepages()
self._set_up_execution(execution_config)
@@ -87,54 +120,70 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
self.virtual_devices.append(VirtualDevice(vdev))
def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution setup steps.
"""
def tear_down_execution(self) -> None:
- """
- Perform the execution teardown that will be done after each execution
- this node is part of concludes.
+ """Execution teardown steps.
+
+ There are currently no execution teardown steps common to all DTS node types.
"""
self.virtual_devices = []
self._tear_down_execution()
def _tear_down_execution(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional execution teardown steps for subclasses.
+
+ Subclasses should override this if they need to add additional execution teardown steps.
"""
def set_up_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- Perform the build target setup that will be done for each build target
- tested on this node.
+ """Build target setup steps.
+
+ There are currently no build target setup steps common to all DTS node types.
+
+ Args:
+ build_target_config: The build target test run configuration according to which
+ the setup steps will be taken.
"""
self._set_up_build_target(build_target_config)
def _set_up_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target setup steps for subclasses.
+
+ Subclasses should override this if they need to add additional build target setup steps.
"""
def tear_down_build_target(self) -> None:
- """
- Perform the build target teardown that will be done after each build target
- tested on this node.
+ """Build target teardown steps.
+
+ There are currently no build target teardown steps common to all DTS node types.
"""
self._tear_down_build_target()
def _tear_down_build_target(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Optional additional build target teardown steps for subclasses.
+
+ Subclasses should override this if they need to add additional build target teardown steps.
"""
def create_session(self, name: str) -> OSSession:
- """
- Create and return a new OSSession tailored to the remote OS.
+ """Create and return a new OS-aware remote session.
+
+ The returned session won't be used by the node creating it. The session must be used by
+ the caller. The session will be maintained for the entire lifecycle of the node object,
+ at the end of which the session will be cleaned up automatically.
+
+ Note:
+ Any number of these supplementary sessions may be created.
+
+ Args:
+ name: The name of the session.
+
+ Returns:
+ A new OS-aware remote session.
"""
session_name = f"{self.name} {name}"
connection = create_session(
@@ -152,19 +201,19 @@ def create_interactive_shell(
privileged: bool = False,
app_args: str = "",
) -> InteractiveShellType:
- """Create a handler for an interactive session.
+ """Factory for interactive session handlers.
- Instantiate shell_cls according to the remote OS specifics.
+ Instantiate `shell_cls` according to the remote OS specifics.
Args:
shell_cls: The class of the shell.
- timeout: Timeout for reading output from the SSH channel. If you are
- reading from the buffer and don't receive any data within the timeout
- it will throw an error.
+ timeout: Timeout for reading output from the SSH channel. If you are reading from
+ the buffer and don't receive any data within the timeout it will throw an error.
privileged: Whether to run the shell with administrative privileges.
app_args: The arguments to be passed to the application.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not shell_cls.dpdk_app:
shell_cls.path = self.main_session.join_remote_path(shell_cls.path)
@@ -181,14 +230,22 @@ def filter_lcores(
filter_specifier: LogicalCoreCount | LogicalCoreList,
ascending: bool = True,
) -> list[LogicalCore]:
- """
- Filter the LogicalCores found on the Node according to
- a LogicalCoreCount or a LogicalCoreList.
+ """Filter the node's logical cores that DTS can use.
+
+ Logical cores that DTS can use are the ones that are present on the node, but filtered
+ according to the test run configuration. The `filter_specifier` will filter cores from
+ those logical cores.
+
+ Args:
+ filter_specifier: Two different filters can be used, one that specifies the number
+ of logical cores per core, cores per socket and the number of sockets,
+ and another one that specifies a logical core list.
+ ascending: If :data:`True`, use cores with the lowest numerical id first and continue
+ in ascending order. If :data:`False`, start with the highest id and continue
+ in descending order. This ordering affects which sockets to consider first as well.
- If ascending is True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the highest
- id and continue in descending order. This ordering affects which
- sockets to consider first as well.
+ Returns:
+ The filtered logical cores.
"""
self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
return lcore_filter(
@@ -198,17 +255,14 @@ def filter_lcores(
).filter()
def _get_remote_cpus(self) -> None:
- """
- Scan CPUs in the remote OS and store a list of LogicalCores.
- """
+ """Scan CPUs in the remote OS and store a list of LogicalCores."""
self._logger.info("Getting CPU information.")
self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
def _setup_hugepages(self) -> None:
- """
- Setup hugepages on the Node. Different architectures can supply different
- amounts of memory for hugepages and numa-based hugepage allocation may need
- to be considered.
+ """Setup hugepages on the node.
+
+ Configure the hugepages only if they're specified in the node's test run configuration.
"""
if self.config.hugepages:
self.main_session.setup_hugepages(
@@ -216,8 +270,11 @@ def _setup_hugepages(self) -> None:
)
def configure_port_state(self, port: Port, enable: bool = True) -> None:
- """
- Enable/disable port.
+ """Enable/disable `port`.
+
+ Args:
+ port: The port to enable/disable.
+ enable: :data:`True` to enable, :data:`False` to disable.
"""
self.main_session.configure_port_state(port, enable)
@@ -227,15 +284,17 @@ def configure_port_ip_address(
port: Port,
delete: bool = False,
) -> None:
- """
- Configure the IP address of a port on this node.
+ """Add an IP address to `port` on this node.
+
+ Args:
+ address: The IP address with mask in CIDR format. Can be either IPv4 or IPv6.
+ port: The port to which to add the address.
+ delete: If :data:`True`, will delete the address from the port instead of adding it.
"""
self.main_session.configure_port_ip_address(address, port, delete)
def close(self) -> None:
- """
- Close all connections and free other resources.
- """
+ """Close all connections and free other resources."""
if self.main_session:
self.main_session.close()
for session in self._other_sessions:
@@ -244,6 +303,11 @@ def close(self) -> None:
@staticmethod
def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+ """Skip the decorated function.
+
+ The :option:`--skip-setup` command line argument and the :envvar:`DTS_SKIP_SETUP`
+ environment variable enable the decorator.
+ """
if SETTINGS.skip_setup:
return lambda *args: None
else:
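The decorator decides at decoration time: when the setting is enabled, the wrapped function is replaced with a no-op. A self-contained sketch of the pattern (the ``SETTINGS`` stand-in is illustrative, not the DTS settings object):

```python
class _Settings:
    """Stand-in for the DTS SETTINGS object (assumption)."""
    skip_setup = True

SETTINGS = _Settings()

def skip_setup(func):
    """Replace func with a no-op when setup is skipped.

    The check happens once, at decoration time, as in the patch.
    """
    if SETTINGS.skip_setup:
        return lambda *args: None
    return func

@skip_setup
def copy_dpdk_tarball(path):
    raise RuntimeError("setup should have been skipped")
```

Calling ``copy_dpdk_tarball("/tmp/dpdk.tar.xz")`` then returns None without ever running the body.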
@@ -251,6 +315,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
def create_session(node_config: NodeConfiguration, name: str, logger: DTSLOG) -> OSSession:
+ """Factory for OS-aware sessions.
+
+ Args:
+ node_config: The test run configuration of the node to connect to.
+ name: The name of the session.
+ logger: The logger instance this session will use.
+ """
match node_config.os:
case OS.linux:
return LinuxSession(node_config, name, logger)
--
2.34.1
* [PATCH v9 18/21] dts: sut and tg nodes docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (16 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 17/21] dts: node " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 19/21] dts: base traffic generators " Juraj Linkeš
` (3 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/testbed_model/sut_node.py | 230 ++++++++++++++++--------
dts/framework/testbed_model/tg_node.py | 42 +++--
2 files changed, 176 insertions(+), 96 deletions(-)
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 5ce9446dba..c4acea38d1 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -3,6 +3,14 @@
# Copyright(c) 2023 PANTHEON.tech s.r.o.
# Copyright(c) 2023 University of New Hampshire
+"""System under test (DPDK + hardware) node.
+
+A system under test (SUT) is the combination of DPDK
+and the hardware we're testing with DPDK (NICs, crypto and other devices).
+An SUT node is where this SUT runs.
+"""
+
+
import os
import tarfile
import time
@@ -26,6 +34,11 @@
class EalParameters(object):
+ """The environment abstraction layer parameters.
+
+ The string representation can be created by converting the instance to a string.
+ """
+
def __init__(
self,
lcore_list: LogicalCoreList,
@@ -35,21 +48,23 @@ def __init__(
vdevs: list[VirtualDevice],
other_eal_param: str,
):
- """
- Generate eal parameters character string;
- :param lcore_list: the list of logical cores to use.
- :param memory_channels: the number of memory channels to use.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
+ """Initialize the parameters according to inputs.
+
+ Process the parameters into the format used on the command line.
+
+ Args:
+ lcore_list: The list of logical cores to use.
+ memory_channels: The number of memory channels to use.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``
"""
self._lcore_list = f"-l {lcore_list}"
self._memory_channels = f"-n {memory_channels}"
@@ -61,6 +76,7 @@ def __init__(
self._other_eal_param = other_eal_param
def __str__(self) -> str:
+ """Create the EAL string."""
return (
f"{self._lcore_list} "
f"{self._memory_channels} "
@@ -72,11 +88,21 @@ def __str__(self) -> str:
class SutNode(Node):
- """
- A class for managing connections to the System under Test, providing
- methods that retrieve the necessary information about the node (such as
- CPU, memory and NIC details) and configuration capabilities.
- Another key capability is building DPDK according to given build target.
+ """The system under test node.
+
+ The SUT node extends :class:`Node` with DPDK specific features:
+
+ * DPDK build,
+ * Gathering of DPDK build info,
+ * The running of DPDK apps, interactively or one-time execution,
+ * DPDK apps cleanup.
+
+ The :option:`--tarball` command line argument and the :envvar:`DTS_DPDK_TARBALL`
+ environment variable configure the path to the DPDK tarball
+ or the git commit ID, tag ID or tree ID to test.
+
+ Attributes:
+ config: The SUT node configuration.
"""
config: SutNodeConfiguration
@@ -94,6 +120,11 @@ class SutNode(Node):
_path_to_devbind_script: PurePath | None
def __init__(self, node_config: SutNodeConfiguration):
+ """Extend the constructor with SUT node specifics.
+
+ Args:
+ node_config: The SUT node's test run configuration.
+ """
super(SutNode, self).__init__(node_config)
self._dpdk_prefix_list = []
self._build_target_config = None
@@ -113,6 +144,12 @@ def __init__(self, node_config: SutNodeConfiguration):
@property
def _remote_dpdk_dir(self) -> PurePath:
+ """The remote DPDK dir.
+
+ This internal property should be set after extracting the DPDK tarball. If it's not set,
+ that implies the DPDK setup step has been skipped, in which case we can guess where
+ a previous build was located.
+ """
if self.__remote_dpdk_dir is None:
self.__remote_dpdk_dir = self._guess_dpdk_remote_dir()
return self.__remote_dpdk_dir
@@ -123,6 +160,11 @@ def _remote_dpdk_dir(self, value: PurePath) -> None:
@property
def remote_dpdk_build_dir(self) -> PurePath:
+ """The remote DPDK build directory.
+
+ This is the directory where DPDK was built.
+ We assume it was built in a subdirectory of the extracted tarball.
+ """
if self._build_target_config:
return self.main_session.join_remote_path(
self._remote_dpdk_dir, self._build_target_config.name
@@ -132,18 +174,21 @@ def remote_dpdk_build_dir(self) -> PurePath:
@property
def dpdk_version(self) -> str:
+ """Last built DPDK version."""
if self._dpdk_version is None:
self._dpdk_version = self.main_session.get_dpdk_version(self._remote_dpdk_dir)
return self._dpdk_version
@property
def node_info(self) -> NodeInfo:
+ """Additional node information."""
if self._node_info is None:
self._node_info = self.main_session.get_node_info()
return self._node_info
@property
def compiler_version(self) -> str:
+ """The node's compiler version."""
if self._compiler_version is None:
if self._build_target_config is not None:
self._compiler_version = self.main_session.get_compiler_version(
@@ -158,6 +203,7 @@ def compiler_version(self) -> str:
@property
def path_to_devbind_script(self) -> PurePath:
+ """The path to the dpdk-devbind.py script on the node."""
if self._path_to_devbind_script is None:
self._path_to_devbind_script = self.main_session.join_remote_path(
self._remote_dpdk_dir, "usertools", "dpdk-devbind.py"
@@ -165,6 +211,11 @@ def path_to_devbind_script(self) -> PurePath:
return self._path_to_devbind_script
def get_build_target_info(self) -> BuildTargetInfo:
+ """Get additional build target information.
+
+ Returns:
+ The build target information,
+ """
return BuildTargetInfo(
dpdk_version=self.dpdk_version, compiler_version=self.compiler_version
)
@@ -173,8 +224,9 @@ def _guess_dpdk_remote_dir(self) -> PurePath:
return self.main_session.guess_dpdk_remote_dir(self._remote_tmp_dir)
def _set_up_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- Setup DPDK on the SUT node.
+ """Setup DPDK on the SUT node.
+
+ Additional build target setup steps on top of those in :class:`Node`.
"""
# we want to ensure that dpdk_version and compiler_version is reset for new
# build targets
@@ -186,16 +238,14 @@ def _set_up_build_target(self, build_target_config: BuildTargetConfiguration) ->
self.bind_ports_to_driver()
def _tear_down_build_target(self) -> None:
- """
- This method exists to be optionally overwritten by derived classes and
- is not decorated so that the derived class doesn't have to use the decorator.
+ """Bind ports to the operating system drivers.
+
+ Additional build target teardown steps on top of those in :class:`Node`.
"""
self.bind_ports_to_driver(for_dpdk=False)
def _configure_build_target(self, build_target_config: BuildTargetConfiguration) -> None:
- """
- Populate common environment variables and set build target config.
- """
+ """Populate common environment variables and set build target config."""
self._env_vars = {}
self._build_target_config = build_target_config
self._env_vars.update(self.main_session.get_dpdk_build_env_vars(build_target_config.arch))
@@ -207,9 +257,7 @@ def _configure_build_target(self, build_target_config: BuildTargetConfiguration)
@Node.skip_setup
def _copy_dpdk_tarball(self) -> None:
- """
- Copy to and extract DPDK tarball on the SUT node.
- """
+ """Copy to and extract DPDK tarball on the SUT node."""
self._logger.info("Copying DPDK tarball to SUT.")
self.main_session.copy_to(SETTINGS.dpdk_tarball_path, self._remote_tmp_dir)
@@ -238,8 +286,9 @@ def _copy_dpdk_tarball(self) -> None:
@Node.skip_setup
def _build_dpdk(self) -> None:
- """
- Build DPDK. Uses the already configured target. Assumes that the tarball has
+ """Build DPDK.
+
+ Uses the already configured target. Assumes that the tarball has
already been copied to and extracted on the SUT node.
"""
self.main_session.build_dpdk(
@@ -250,15 +299,19 @@ def _build_dpdk(self) -> None:
)
def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePath:
- """
- Build one or all DPDK apps. Requires DPDK to be already built on the SUT node.
- When app_name is 'all', build all example apps.
- When app_name is any other string, tries to build that example app.
- Return the directory path of the built app. If building all apps, return
- the path to the examples directory (where all apps reside).
- The meson_dpdk_args are keyword arguments
- found in meson_option.txt in root DPDK directory. Do not use -D with them,
- for example: enable_kmods=True.
+ """Build one or all DPDK apps.
+
+ Requires DPDK to be already built on the SUT node.
+
+ Args:
+ app_name: The name of the DPDK app to build.
+ When `app_name` is ``all``, build all example apps.
+ meson_dpdk_args: The arguments found in ``meson_options.txt`` in the root DPDK directory.
+ Do not use ``-D`` with them.
+
+ Returns:
+ The directory path of the built app. If building all apps, return
+ the path to the examples directory (where all apps reside).
"""
self.main_session.build_dpdk(
self._env_vars,
@@ -277,9 +330,7 @@ def build_dpdk_app(self, app_name: str, **meson_dpdk_args: str | bool) -> PurePa
)
def kill_cleanup_dpdk_apps(self) -> None:
- """
- Kill all dpdk applications on the SUT. Cleanup hugepages.
- """
+ """Kill all dpdk applications on the SUT, then clean up hugepages."""
if self._dpdk_kill_session and self._dpdk_kill_session.is_alive():
# we can use the session if it exists and responds
self._dpdk_kill_session.kill_cleanup_dpdk_apps(self._dpdk_prefix_list)
@@ -298,33 +349,34 @@ def create_eal_parameters(
vdevs: list[VirtualDevice] | None = None,
other_eal_param: str = "",
) -> "EalParameters":
- """
- Generate eal parameters character string;
- :param lcore_filter_specifier: a number of lcores/cores/sockets to use
- or a list of lcore ids to use.
- The default will select one lcore for each of two cores
- on one socket, in ascending order of core ids.
- :param ascending_cores: True, use cores with the lowest numerical id first
- and continue in ascending order. If False, start with the
- highest id and continue in descending order. This ordering
- affects which sockets to consider first as well.
- :param prefix: set file prefix string, eg:
- prefix='vf'
- :param append_prefix_timestamp: if True, will append a timestamp to
- DPDK file prefix.
- :param no_pci: switch of disable PCI bus eg:
- no_pci=True
- :param vdevs: virtual device list, eg:
- vdevs=[
- VirtualDevice('net_ring0'),
- VirtualDevice('net_ring1')
- ]
- :param other_eal_param: user defined DPDK eal parameters, eg:
- other_eal_param='--single-file-segments'
- :return: eal param string, eg:
- '-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420';
- """
+ """Compose the EAL parameters.
+
+ Process the list of cores and the DPDK prefix and pass that along with
+ the rest of the arguments.
+
+ Args:
+ lcore_filter_specifier: A number of lcores/cores/sockets to use
+ or a list of lcore ids to use.
+ The default will select one lcore for each of two cores
+ on one socket, in ascending order of core ids.
+ ascending_cores: Sort cores in ascending order (lowest to highest IDs).
+ If :data:`False`, sort in descending order.
+ prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix='vf'``.
+ append_prefix_timestamp: If :data:`True`, will append a timestamp to DPDK file prefix.
+ no_pci: Switch to disable PCI bus e.g.: ``no_pci=True``.
+ vdevs: Virtual devices, e.g.::
+
+ vdevs=[
+ VirtualDevice('net_ring0'),
+ VirtualDevice('net_ring1')
+ ]
+ other_eal_param: User-defined DPDK EAL parameters, e.g.:
+ ``other_eal_param='--single-file-segments'``.
+
+ Returns:
+ An EAL param string, such as
+ ``-c 0xf -a 0000:88:00.0 --file-prefix=dpdk_1112_20190809143420``.
+ """
lcore_list = LogicalCoreList(self.filter_lcores(lcore_filter_specifier, ascending_cores))
if append_prefix_timestamp:
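For illustration, composing an EAL parameter string like the example return value in the docstring above could look as follows; the function name and defaults are hypothetical, not the `EalParameters` API:

```python
def compose_eal_string(lcore_list: str, memory_channels: int,
                       file_prefix: str = "", no_pci: bool = False,
                       other: str = "") -> str:
    """Join EAL argument fragments into one command-line string (sketch)."""
    parts = [f"-l {lcore_list}", f"-n {memory_channels}"]
    if file_prefix:
        parts.append(f"--file-prefix={file_prefix}")
    if no_pci:
        parts.append("--no-pci")
    if other:
        parts.append(other)
    return " ".join(parts)
```

With a core list of ``0-3``, four memory channels and a timestamped prefix, this produces a string in the same shape as the documented example.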
@@ -348,14 +400,29 @@ def create_eal_parameters(
def run_dpdk_app(
self, app_path: PurePath, eal_args: "EalParameters", timeout: float = 30
) -> CommandResult:
- """
- Run DPDK application on the remote node.
+ """Run DPDK application on the remote node.
+
+ The application is not run interactively - the command that starts the application
+ is executed and then the call waits for it to finish execution.
+
+ Args:
+ app_path: The remote path to the DPDK application.
+ eal_args: EAL parameters to run the DPDK application with.
+ timeout: Wait at most this long in seconds for the app's execution to complete.
+
+ Returns:
+ The result of the DPDK app execution.
"""
return self.main_session.send_command(
f"{app_path} {eal_args}", timeout, privileged=True, verify=True
)
def configure_ipv4_forwarding(self, enable: bool) -> None:
+ """Enable/disable IPv4 forwarding on the node.
+
+ Args:
+ enable: If :data:`True`, enable the forwarding, otherwise disable it.
+ """
self.main_session.configure_ipv4_forwarding(enable)
def create_interactive_shell(
@@ -365,9 +432,13 @@ def create_interactive_shell(
privileged: bool = False,
eal_parameters: EalParameters | str | None = None,
) -> InteractiveShellType:
- """Factory method for creating a handler for an interactive session.
+ """Extend the factory for interactive session handlers.
+
+ The extensions are SUT node specific:
- Instantiate shell_cls according to the remote OS specifics.
+ * The default for `eal_parameters`,
+ * The interactive shell path `shell_cls.path` is prepended with path to the remote
+ DPDK build directory for DPDK apps.
Args:
shell_cls: The class of the shell.
@@ -377,9 +448,10 @@ def create_interactive_shell(
privileged: Whether to run the shell with administrative privileges.
eal_parameters: List of EAL parameters to use to launch the app. If this
isn't provided or an empty string is passed, it will default to calling
- create_eal_parameters().
+ :meth:`create_eal_parameters`.
+
Returns:
- Instance of the desired interactive application.
+ An instance of the desired interactive application shell.
"""
if not eal_parameters:
eal_parameters = self.create_eal_parameters()
@@ -396,8 +468,8 @@ def bind_ports_to_driver(self, for_dpdk: bool = True) -> None:
"""Bind all ports on the SUT to a driver.
Args:
- for_dpdk: Boolean that, when True, binds ports to os_driver_for_dpdk
- or, when False, binds to os_driver. Defaults to True.
+ for_dpdk: If :data:`True`, binds ports to os_driver_for_dpdk.
+ If :data:`False`, binds to os_driver.
"""
for port in self.ports:
driver = port.os_driver_for_dpdk if for_dpdk else port.os_driver
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 8a8f0019f3..f269d4c585 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -5,13 +5,8 @@
"""Traffic generator node.
-This is the node where the traffic generator resides.
-The distinction between a node and a traffic generator is as follows:
-A node is a host that DTS connects to. It could be a baremetal server,
-a VM or a container.
-A traffic generator is software running on the node.
-A traffic generator node is a node running a traffic generator.
-A node can be a traffic generator node as well as system under test node.
+A traffic generator (TG) generates traffic that's sent towards the SUT node.
+A TG node is where the TG runs.
"""
from scapy.packet import Packet # type: ignore[import]
@@ -24,13 +19,16 @@
class TGNode(Node):
- """Manage connections to a node with a traffic generator.
+ """The traffic generator node.
- Apart from basic node management capabilities, the Traffic Generator node has
- specialized methods for handling the traffic generator running on it.
+ The TG node extends :class:`Node` with TG specific features:
- Arguments:
- node_config: The user configuration of the traffic generator node.
+ * Traffic generator initialization,
+ * Sending traffic and receiving packets,
+ * Sending traffic without receiving packets.
+
+ Not all traffic generators are capable of capturing traffic, which is why there
+ must be a way to send traffic without capturing it.
Attributes:
traffic_generator: The traffic generator running on the node.
@@ -39,6 +37,13 @@ class TGNode(Node):
traffic_generator: CapturingTrafficGenerator
def __init__(self, node_config: TGNodeConfiguration):
+ """Extend the constructor with TG node specifics.
+
+ Initialize the traffic generator on the TG node.
+
+ Args:
+ node_config: The TG node's test run configuration.
+ """
super(TGNode, self).__init__(node_config)
self.traffic_generator = create_traffic_generator(self, node_config.traffic_generator)
self._logger.info(f"Created node: {self.name}")
@@ -50,17 +55,17 @@ def send_packet_and_capture(
receive_port: Port,
duration: float = 1,
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet`, return received traffic.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given duration. Also record the captured traffic
in a pcap file.
Args:
packet: The packet to send.
send_port: The egress port on the TG node.
receive_port: The ingress port in the TG node.
- duration: Capture traffic for this amount of time after sending the packet.
+ duration: Capture traffic for this amount of time after sending `packet`.
Returns:
A list of received packets. May be empty if no packets are captured.
@@ -70,6 +75,9 @@ def send_packet_and_capture(
)
def close(self) -> None:
- """Free all resources used by the node"""
+ """Free all resources used by the node.
+
+ This extends the superclass method with TG cleanup.
+ """
self.traffic_generator.close()
super(TGNode, self).close()
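The `close` override above tears down the TG-specific resource first and then delegates to the superclass. A hedged sketch of that extend-then-delegate chain (stand-in classes, recording the order instead of freeing real resources):

```python
class Node:
    """Stand-in base node; records cleanup order instead of freeing resources."""

    def __init__(self) -> None:
        self.closed: list[str] = []

    def close(self) -> None:
        """Free all resources used by the node."""
        self.closed.append("node")


class TGNode(Node):
    def close(self) -> None:
        """Extend the superclass method with TG cleanup."""
        self.closed.append("traffic_generator")
        super().close()
```

The subclass cleanup runs before the base cleanup, mirroring the order in the patch.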
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 19/21] dts: base traffic generators docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (17 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 18/21] dts: sut and tg nodes " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 20/21] dts: scapy tg " Juraj Linkeš
` (2 subsequent siblings)
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../traffic_generator/__init__.py | 22 ++++++++-
.../capturing_traffic_generator.py | 45 +++++++++++--------
.../traffic_generator/traffic_generator.py | 33 ++++++++------
3 files changed, 67 insertions(+), 33 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
index 52888d03fa..11e2bd7d97 100644
--- a/dts/framework/testbed_model/traffic_generator/__init__.py
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -1,6 +1,19 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
+"""DTS traffic generators.
+
+A traffic generator is capable of generating traffic and then monitoring the returning traffic.
+All traffic generators must count the number of received packets. Some may additionally capture
+individual packets.
+
+A traffic generator may be software running on generic hardware or it could be specialized hardware.
+
+The traffic generators that only count the number of received packets are suitable only for
+performance testing. In functional testing, we need to be able to dissect each arriving packet,
+so a capturing traffic generator is required.
+"""
+
from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
from framework.exception import ConfigurationError
from framework.testbed_model.node import Node
@@ -12,8 +25,15 @@
def create_traffic_generator(
tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
) -> CapturingTrafficGenerator:
- """A factory function for creating traffic generator object from user config."""
+ """The factory function for creating traffic generator objects from the test run configuration.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ traffic_generator_config: The traffic generator config.
+ Returns:
+ A traffic generator capable of capturing received packets.
+ """
match traffic_generator_config.traffic_generator_type:
case TrafficGeneratorType.SCAPY:
return ScapyTrafficGenerator(tg_node, traffic_generator_config)
diff --git a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index 1fc7f98c05..0246590333 100644
--- a/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -23,19 +23,21 @@
def _get_default_capture_name() -> str:
- """
- This is the function used for the default implementation of capture names.
- """
return str(uuid.uuid4())
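One Python-semantics aside on the `capture_name: str = _get_default_capture_name()` signatures below: a default argument expression is evaluated once, when the `def` statement executes, so every call that omits `capture_name` reuses the same UUID string. A minimal demonstration (function names here are stand-ins):

```python
import uuid


def _get_default_capture_name() -> str:
    return str(uuid.uuid4())


def send_and_capture(capture_name: str = _get_default_capture_name()) -> str:
    # The default was computed once, when the `def` statement executed.
    return capture_name


first = send_and_capture()
second = send_and_capture()
# A fresh name requires evaluating the helper at call time.
fresh = send_and_capture(_get_default_capture_name())
```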
class CapturingTrafficGenerator(TrafficGenerator):
"""Capture packets after sending traffic.
- A mixin interface which enables a packet generator to declare that it can capture
+ The intermediary interface which enables a packet generator to declare that it can capture
packets and return them to the user.
+ Similarly to :class:`~.traffic_generator.TrafficGenerator`, this class exposes
+ the public methods specific to capturing traffic generators and defines a private method
+ that subclasses must implement with the traffic generation and capturing logic.
+
The methods of capturing traffic generators obey the following workflow:
+
1. send packets
2. capture packets
3. write the capture to a .pcap file
@@ -44,6 +46,7 @@ class CapturingTrafficGenerator(TrafficGenerator):
@property
def is_capturing(self) -> bool:
+ """This traffic generator can capture traffic."""
return True
def send_packet_and_capture(
@@ -54,11 +57,12 @@ def send_packet_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send a packet, return received traffic.
+ """Send `packet` and capture received traffic.
+
+ Send `packet` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
- Send a packet on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ The captured traffic is recorded in the `capture_name`.pcap file.
Args:
packet: The packet to send.
@@ -68,7 +72,7 @@ def send_packet_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
return self.send_packets_and_capture(
[packet], send_port, receive_port, duration, capture_name
@@ -82,11 +86,14 @@ def send_packets_and_capture(
duration: float,
capture_name: str = _get_default_capture_name(),
) -> list[Packet]:
- """Send packets, return received traffic.
+ """Send `packets` and capture received traffic.
- Send packets on the send_port and then return all traffic captured
- on the receive_port for the given duration. Also record the captured traffic
- in a pcap file.
+ Send `packets` on `send_port` and then return all traffic captured
+ on `receive_port` for the given `duration`.
+
+ The captured traffic is recorded in the `capture_name`.pcap file. The target directory
+ can be configured with the :option:`--output-dir` command line argument or
+ the :envvar:`DTS_OUTPUT_DIR` environment variable.
Args:
packets: The packets to send.
@@ -96,7 +103,7 @@ def send_packets_and_capture(
capture_name: The name of the .pcap file where to store the capture.
Returns:
- A list of received packets. May be empty if no packets are captured.
+ The received packets. May be empty if no packets are captured.
"""
self._logger.debug(get_packet_summaries(packets))
self._logger.debug(
@@ -121,10 +128,12 @@ def _send_packets_and_capture(
receive_port: Port,
duration: float,
) -> list[Packet]:
- """
- The extended classes must implement this method which
- sends packets on send_port and receives packets on the receive_port
- for the specified duration. It must be able to handle no received packets.
+ """The implementation of :meth:`send_packets_and_capture`.
+
+ The subclasses must implement this method which sends `packets` on `send_port`
+ and receives packets on `receive_port` for the specified `duration`.
+
+ It must be able to handle receiving no packets.
"""
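The split described in these docstrings, public methods that do common bookkeeping and delegate to a private abstract hook, is the template-method pattern. A hedged sketch with simplified names and string "packets" instead of Scapy objects:

```python
from abc import ABC, abstractmethod


class CapturingTrafficGenerator(ABC):
    """Public API delegates; subclasses implement the private hook."""

    def send_packets_and_capture(self, packets: list[str], duration: float) -> list[str]:
        # Common bookkeeping would live here (the real class also logs
        # and writes the capture to a .pcap file).
        return self._send_packets_and_capture(packets, duration)

    @abstractmethod
    def _send_packets_and_capture(self, packets: list[str], duration: float) -> list[str]:
        """Subclasses must handle the case of receiving no packets."""


class EchoTrafficGenerator(CapturingTrafficGenerator):
    """Toy subclass: pretends every sent packet is captured back."""

    def _send_packets_and_capture(self, packets: list[str], duration: float) -> list[str]:
        return list(packets)
```

Callers only ever use the public method; the ABC guarantees the hook exists in every concrete subclass.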
def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]) -> None:
diff --git a/dts/framework/testbed_model/traffic_generator/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 0d9902ddb7..c49fbff488 100644
--- a/dts/framework/testbed_model/traffic_generator/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -22,7 +22,8 @@
class TrafficGenerator(ABC):
"""The base traffic generator.
- Defines the few basic methods that each traffic generator must implement.
+ Exposes the common public methods of all traffic generators and defines the private
+ methods that subclasses must implement with the traffic generation logic.
"""
_config: TrafficGeneratorConfig
@@ -30,14 +31,20 @@ class TrafficGenerator(ABC):
_logger: DTSLOG
def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+ """Initialize the traffic generator.
+
+ Args:
+ tg_node: The traffic generator node where the created traffic generator will be running.
+ config: The traffic generator's test run configuration.
+ """
self._config = config
self._tg_node = tg_node
self._logger = getLogger(f"{self._tg_node.name} {self._config.traffic_generator_type}")
def send_packet(self, packet: Packet, port: Port) -> None:
- """Send a packet and block until it is fully sent.
+ """Send `packet` and block until it is fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packet` on `port`, then wait until `packet` is fully sent.
Args:
packet: The packet to send.
@@ -46,9 +53,9 @@ def send_packet(self, packet: Packet, port: Port) -> None:
self.send_packets([packet], port)
def send_packets(self, packets: list[Packet], port: Port) -> None:
- """Send packets and block until they are fully sent.
+ """Send `packets` and block until they are fully sent.
- What fully sent means is defined by the traffic generator.
+ Send `packets` on `port`, then wait until `packets` are fully sent.
Args:
packets: The packets to send.
@@ -60,19 +67,17 @@ def send_packets(self, packets: list[Packet], port: Port) -> None:
@abstractmethod
def _send_packets(self, packets: list[Packet], port: Port) -> None:
- """
- The extended classes must implement this method which
- sends packets on send_port. The method should block until all packets
- are fully sent.
+ """The implementation of :meth:`send_packets`.
+
+ The subclasses must implement this method which sends `packets` on `port`.
+ The method should block until all `packets` are fully sent.
+
+ What fully sent means is defined by the traffic generator.
"""
@property
def is_capturing(self) -> bool:
- """Whether this traffic generator can capture traffic.
-
- Returns:
- True if the traffic generator can capture traffic, False otherwise.
- """
+ """This traffic generator can't capture traffic."""
return False
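The `is_capturing` flip between the base and the capturing subclass is a plain property override. A minimal stand-in sketch:

```python
class TrafficGenerator:
    @property
    def is_capturing(self) -> bool:
        """This traffic generator can't capture traffic."""
        return False


class CapturingTrafficGenerator(TrafficGenerator):
    @property
    def is_capturing(self) -> bool:
        """This traffic generator can capture traffic."""
        return True
```

Callers can branch on `tg.is_capturing` to decide whether capture-based functional verification is possible, or fall back to packet counting.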
@abstractmethod
--
2.34.1
* [PATCH v9 20/21] dts: scapy tg docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (18 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 19/21] dts: base traffic generators " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-04 10:24 ` [PATCH v9 21/21] dts: test suites " Juraj Linkeš
2023-12-21 11:48 ` [PATCH v9 00/21] dts: docstrings update Thomas Monjalon
21 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
.../testbed_model/traffic_generator/scapy.py | 91 +++++++++++--------
1 file changed, 54 insertions(+), 37 deletions(-)
diff --git a/dts/framework/testbed_model/traffic_generator/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
index c88cf28369..5b60f66237 100644
--- a/dts/framework/testbed_model/traffic_generator/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -2,14 +2,15 @@
# Copyright(c) 2022 University of New Hampshire
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""Scapy traffic generator.
+"""The Scapy traffic generator.
-Traffic generator used for functional testing, implemented using the Scapy library.
+A traffic generator used for functional testing, implemented with
+`the Scapy library <https://scapy.readthedocs.io/en/latest/>`_.
The traffic generator uses an XML-RPC server to run Scapy on the remote TG node.
-The XML-RPC server runs in an interactive remote SSH session running Python console,
-where we start the server. The communication with the server is facilitated with
-a local server proxy.
+The traffic generator uses the :mod:`xmlrpc.server` module to run an XML-RPC server
+in an interactive remote Python SSH session. The communication with the server is facilitated
+with a local server proxy from the :mod:`xmlrpc.client` module.
"""
import inspect
@@ -69,20 +70,20 @@ def scapy_send_packets_and_capture(
recv_iface: str,
duration: float,
) -> list[bytes]:
- """RPC function to send and capture packets.
+ """The RPC function to send and capture packets.
- The function is meant to be executed on the remote TG node.
+ This function is meant to be executed on the remote TG node via the server proxy.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
recv_iface: The logical name of the ingress interface.
duration: Capture for this amount of time, in seconds.
Returns:
A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
+ to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
sniffer = scapy.all.AsyncSniffer(
@@ -96,19 +97,15 @@ def scapy_send_packets_and_capture(
def scapy_send_packets(xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: str) -> None:
- """RPC function to send packets.
+ """The RPC function to send packets.
- The function is meant to be executed on the remote TG node.
- It doesn't return anything, only sends packets.
+ This function is meant to be executed on the remote TG node via the server proxy.
+ It only sends `xmlrpc_packets`, without capturing them.
Args:
xmlrpc_packets: The packets to send. These need to be converted to
- xmlrpc.client.Binary before sending to the remote server.
+ :class:`~xmlrpc.client.Binary` objects before sending to the remote server.
send_iface: The logical name of the egress interface.
-
- Returns:
- A list of bytes. Each item in the list represents one packet, which needs
- to be converted back upon transfer from the remote node.
"""
scapy_packets = [scapy.all.Packet(packet.data) for packet in xmlrpc_packets]
scapy.all.sendp(scapy_packets, iface=send_iface, realtime=True, verbose=True)
@@ -128,11 +125,19 @@ def scapy_send_packets(xmlrpc_packets: list[xmlrpc.client.Binary], send_iface: s
class QuittableXMLRPCServer(SimpleXMLRPCServer):
- """Basic XML-RPC server that may be extended
- by functions serializable by the marshal module.
+ """Basic XML-RPC server.
+
+ The server may be augmented by functions serializable by the :mod:`marshal` module.
"""
def __init__(self, *args, **kwargs):
+ """Extend the XML-RPC server initialization.
+
+ Args:
+ args: The positional arguments that will be passed to the superclass's constructor.
+ kwargs: The keyword arguments that will be passed to the superclass's constructor.
+ The `allow_none` argument will be set to :data:`True`.
+ """
kwargs["allow_none"] = True
super().__init__(*args, **kwargs)
self.register_introspection_functions()
@@ -140,13 +145,12 @@ def __init__(self, *args, **kwargs):
self.register_function(self.add_rpc_function)
def quit(self) -> None:
+ """Quit the server."""
self._BaseServer__shutdown_request = True
return None
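The `quit` method relies on Python name mangling: `_BaseServer__shutdown_request` reaches the `__shutdown_request` attribute that `socketserver.BaseServer` polls in its `serve_forever` loop. A minimal demonstration with stand-in classes (no sockets; the stand-in only checks the flag once where the real loop polls):

```python
class BaseServer:
    """Minimal stand-in for socketserver.BaseServer."""

    def __init__(self) -> None:
        self.__shutdown_request = False  # mangled to _BaseServer__shutdown_request

    def serve_forever(self) -> str:
        # The real loop polls for requests between flag checks.
        if self.__shutdown_request:
            return "stopped"
        return "serving"


class QuittableServer(BaseServer):
    def quit(self) -> None:
        # Name-mangled access from the subclass reaches the base class's
        # private __shutdown_request attribute.
        self._BaseServer__shutdown_request = True
```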
def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
- """Add a function to the server.
-
- This is meant to be executed remotely.
+ """Add a function to the server from the local server proxy.
Args:
name: The name of the function.
@@ -157,6 +161,11 @@ def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> N
self.register_function(function)
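The mechanism `add_rpc_function` relies on, shipping a function's code object through `marshal` and rebuilding it with `types.FunctionType`, can be sketched with the stdlib alone. Hedged: the real method also unwraps an `xmlrpc.client.Binary`, which this sketch omits:

```python
import marshal
import types


def _square(x: int) -> int:
    return x * x


# Client side: serialize the function's code object to bytes.
function_bytes = marshal.dumps(_square.__code__)

# Server side: rebuild a callable from the received bytes.
code = marshal.loads(function_bytes)
rebuilt = types.FunctionType(code, globals(), "_square")
```

Only the code object travels, so the rebuilt function must be self-contained (no closures over client-side state).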
def serve_forever(self, poll_interval: float = 0.5) -> None:
+ """Extend the superclass method with an additional print.
+
+ Once executed in the local server proxy, the print gives us a clear string to expect
+ when starting the server. The print means this function was executed on the XML-RPC server.
+ """
print("XMLRPC OK")
super().serve_forever(poll_interval)
@@ -164,19 +173,12 @@ def serve_forever(self, poll_interval: float = 0.5) -> None:
class ScapyTrafficGenerator(CapturingTrafficGenerator):
"""Provides access to scapy functions via an RPC interface.
- The traffic generator first starts an XML-RPC on the remote TG node.
- Then it populates the server with functions which use the Scapy library
- to send/receive traffic.
-
- Any packets sent to the remote server are first converted to bytes.
- They are received as xmlrpc.client.Binary objects on the server side.
- When the server sends the packets back, they are also received as
- xmlrpc.client.Binary object on the client side, are converted back to Scapy
- packets and only then returned from the methods.
+ This class extends the base with remote execution of scapy functions.
- Arguments:
- tg_node: The node where the traffic generator resides.
- config: The user configuration of the traffic generator.
+ Any packets sent to the remote server are first converted to bytes. They are received as
+ :class:`~xmlrpc.client.Binary` objects on the server side. When the server sends the packets
+ back, they are also received as :class:`~xmlrpc.client.Binary` objects on the client side, are
+ converted back to :class:`~scapy.packet.Packet` objects and only then returned from the methods.
Attributes:
session: The exclusive interactive remote session created by the Scapy
@@ -190,6 +192,22 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
_config: ScapyTrafficGeneratorConfig
def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+ """Extend the constructor with Scapy TG specifics.
+
+ The traffic generator first starts an XML-RPC server on the remote `tg_node`.
+ Then it populates the server with functions which use the Scapy library
+ to send/receive traffic:
+
+ * :func:`scapy_send_packets_and_capture`
+ * :func:`scapy_send_packets`
+
+ To enable verbose logging from the xmlrpc client, use the :option:`--verbose`
+ command line argument or the :envvar:`DTS_VERBOSE` environment variable.
+
+ Args:
+ tg_node: The node where the traffic generator resides.
+ config: The traffic generator's test run configuration.
+ """
super().__init__(tg_node, config)
assert (
@@ -231,10 +249,8 @@ def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
# or class, so strip all lines containing only whitespace
src = "\n".join([line for line in src.splitlines() if not line.isspace() and line != ""])
- spacing = "\n" * 4
-
# execute it in the python terminal
- self.session.send_command(spacing + src + spacing)
+ self.session.send_command(src + "\n")
self.session.send_command(
f"server = QuittableXMLRPCServer(('0.0.0.0', {listen_port}));server.serve_forever()",
"XMLRPC OK",
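The whitespace-stripping step referenced above exists because a blank line pasted into an interactive Python console ends the current block early; removing blank lines keeps the source compilable. A stdlib-only sketch (the `src` string simulates `inspect.getsource()` output):

```python
# Source as it might come from inspect.getsource() in the real code.
src = (
    "def example(a: int, b: int) -> int:\n"
    "    total = a + b\n"
    "\n"
    "    return total\n"
)

# Drop empty and whitespace-only lines, as the DTS code does, so an
# interactive console doesn't treat them as end-of-block.
stripped = "\n".join(
    line for line in src.splitlines() if not line.isspace() and line != ""
)

# The stripped source is still valid Python.
compile(stripped, "<remote>", "exec")
```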
@@ -267,6 +283,7 @@ def _send_packets_and_capture(
return scapy_packets
def close(self) -> None:
+ """Close the traffic generator."""
try:
self.rpc_server_proxy.quit()
except ConnectionRefusedError:
--
2.34.1
* [PATCH v9 21/21] dts: test suites docstring update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (19 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 20/21] dts: scapy tg " Juraj Linkeš
@ 2023-12-04 10:24 ` Juraj Linkeš
2023-12-05 18:39 ` Jeremy Spewock
2023-12-21 11:48 ` [PATCH v9 00/21] dts: docstrings update Thomas Monjalon
21 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-12-04 10:24 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Format according to the Google format and PEP257, with slight
deviations.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/tests/TestSuite_hello_world.py | 16 +++++---
dts/tests/TestSuite_os_udp.py | 20 ++++++----
dts/tests/TestSuite_smoke_tests.py | 61 ++++++++++++++++++++++++------
3 files changed, 72 insertions(+), 25 deletions(-)
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 768ba1cfa8..fd7ff1534d 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2010-2014 Intel Corporation
-"""
+"""The DPDK hello world app test suite.
+
Run the helloworld example app and verify it prints a message for each used core.
No other EAL parameters apart from cores are used.
"""
@@ -15,22 +16,25 @@
class TestHelloWorld(TestSuite):
+ """DPDK hello world app test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
Build the app we're about to test - helloworld.
"""
self.app_helloworld_path = self.sut_node.build_dpdk_app("helloworld")
def test_hello_world_single_core(self) -> None:
- """
+ """Single core test case.
+
Steps:
Run the helloworld app on the first usable logical core.
Verify:
The app prints a message from the used core:
"hello from core <core_id>"
"""
-
# get the first usable core
lcore_amount = LogicalCoreCount(1, 1, 1)
lcores = LogicalCoreCountFilter(self.sut_node.lcores, lcore_amount).filter()
@@ -42,14 +46,14 @@ def test_hello_world_single_core(self) -> None:
)
def test_hello_world_all_cores(self) -> None:
- """
+ """All cores test case.
+
Steps:
Run the helloworld app on all usable logical cores.
Verify:
The app prints a message from all used cores:
"hello from core <core_id>"
"""
-
# get the maximum logical core number
eal_para = self.sut_node.create_eal_parameters(
lcore_filter_specifier=LogicalCoreList(self.sut_node.lcores)
diff --git a/dts/tests/TestSuite_os_udp.py b/dts/tests/TestSuite_os_udp.py
index bf6b93deb5..2cf29d37bb 100644
--- a/dts/tests/TestSuite_os_udp.py
+++ b/dts/tests/TestSuite_os_udp.py
@@ -1,7 +1,8 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 PANTHEON.tech s.r.o.
-"""
+"""Basic IPv4 OS routing test suite.
+
Configure SUT node to route traffic from if1 to if2.
Send a packet to the SUT node, verify it comes back on the second port on the TG node.
"""
@@ -13,24 +14,26 @@
class TestOSUdp(TestSuite):
+ """IPv4 UDP OS routing test suite."""
+
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
- Configure SUT ports and SUT to route traffic from if1 to if2.
+ Bind the SUT ports to the OS driver, configure the ports and configure the SUT
+ to route traffic from if1 to if2.
"""
-
- # This test uses kernel drivers
self.sut_node.bind_ports_to_driver(for_dpdk=False)
self.configure_testbed_ipv4()
def test_os_udp(self) -> None:
- """
+ """Basic UDP IPv4 traffic test case.
+
Steps:
Send a UDP packet.
Verify:
The packet with proper addresses arrives at the other TG port.
"""
-
packet = Ether() / IP() / UDP()
received_packets = self.send_packet_and_capture(packet)
@@ -40,7 +43,8 @@ def test_os_udp(self) -> None:
self.verify_packets(expected_packet, received_packets)
def tear_down_suite(self) -> None:
- """
+ """Tear down the test suite.
+
Teardown:
Remove the SUT port configuration configured in setup.
"""
diff --git a/dts/tests/TestSuite_smoke_tests.py b/dts/tests/TestSuite_smoke_tests.py
index 8958f58dac..5e2bac14bd 100644
--- a/dts/tests/TestSuite_smoke_tests.py
+++ b/dts/tests/TestSuite_smoke_tests.py
@@ -1,6 +1,17 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2023 University of New Hampshire
+"""Smoke test suite.
+
+Smoke tests are a class of tests which are used for validating a minimal set of important features.
+These are the most important features without which (or when they're faulty) the software wouldn't
+work properly. Thus, if any failure occurs while testing these features,
+there isn't that much of a reason to continue testing, as the software is fundamentally broken.
+
+These tests don't have to include only DPDK tests, as the reason for failures could be
+in the infrastructure (a faulty link between NICs or a misconfiguration).
+"""
+
import re
from framework.config import PortConfig
@@ -11,23 +22,39 @@
class SmokeTests(TestSuite):
+ """DPDK and infrastructure smoke test suite.
+
+ The test cases validate the most basic DPDK functionality needed for all other test suites.
+ The infrastructure also needs to be tested, as that is also used by all other test suites.
+
+ Attributes:
+ is_blocking: This test suite will block the execution of all other test suites
+ in the build target after it.
+ nics_in_node: The NICs present on the SUT node.
+ """
+
is_blocking = True
# dicts in this list are expected to have two keys:
# "pci_address" and "current_driver"
nics_in_node: list[PortConfig] = []
def set_up_suite(self) -> None:
- """
+ """Set up the test suite.
+
Setup:
- Set the build directory path and generate a list of NICs in the SUT node.
+ Set the build directory path and a list of NICs in the SUT node.
"""
self.dpdk_build_dir_path = self.sut_node.remote_dpdk_build_dir
self.nics_in_node = self.sut_node.config.ports
def test_unit_tests(self) -> None:
- """
+ """DPDK meson ``fast-tests`` unit tests.
+
+ Test that all unit tests from the ``fast-tests`` suite pass.
+ The suite is a subset with only the most basic tests.
+
Test:
- Run the fast-test unit-test suite through meson.
+ Run the ``fast-tests`` unit test suite through meson.
"""
self.sut_node.main_session.send_command(
f"meson test -C {self.dpdk_build_dir_path} --suite fast-tests -t 60",
@@ -37,9 +64,14 @@ def test_unit_tests(self) -> None:
)
def test_driver_tests(self) -> None:
- """
+ """DPDK meson ``driver-tests`` unit tests.
+
+ Test that all unit tests from the ``driver-tests`` suite pass.
+ The suite is a subset with driver tests. This suite may be run with virtual devices
+ configured in the test run configuration.
+
Test:
- Run the driver-test unit-test suite through meson.
+ Run the ``driver-tests`` unit test suite through meson.
"""
vdev_args = ""
for dev in self.sut_node.virtual_devices:
@@ -60,9 +92,12 @@ def test_driver_tests(self) -> None:
)
def test_devices_listed_in_testpmd(self) -> None:
- """
+ """Testpmd device discovery.
+
+ Test that the devices configured in the test run configuration are found in testpmd.
+
Test:
- Uses testpmd driver to verify that devices have been found by testpmd.
+ List all devices found in testpmd and verify the configured devices are among them.
"""
testpmd_driver = self.sut_node.create_interactive_shell(TestPmdShell, privileged=True)
dev_list = [str(x) for x in testpmd_driver.get_devices()]
@@ -74,10 +109,14 @@ def test_devices_listed_in_testpmd(self) -> None:
)
def test_device_bound_to_driver(self) -> None:
- """
+ """Device driver in OS.
+
+ Test that the devices configured in the test run configuration are bound to
+ the proper driver.
+
Test:
- Ensure that all drivers listed in the config are bound to the correct
- driver.
+ List all devices with the ``dpdk-devbind.py`` script and verify that
+ the configured devices are bound to the proper driver.
"""
path_to_devbind = self.sut_node.path_to_devbind_script
--
2.34.1
* Re: [PATCH v9 21/21] dts: test suites docstring update
2023-12-04 10:24 ` [PATCH v9 21/21] dts: test suites " Juraj Linkeš
@ 2023-12-05 18:39 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2023-12-05 18:39 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
> @@ -1,6 +1,17 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2023 University of New Hampshire
>
> +"""Smoke test suite.
> +
> +Smoke tests are a class of tests which are used for validating a minimal
> set of important features.
> +These are the most important features without which (or when they're
> faulty) the software wouldn't
> +work properly. Thus, if any failure occurs while testing these features,
> +there isn't that much of a reason to continue testing, as the software is
> fundamentally broken.
> +
> +These tests don't have to include only DPDK tests, as the reason for
> failures could be
> +in the infrastructure (a faulty link between NICs or a misconfiguration).
> +"""
> +
> import re
>
> from framework.config import PortConfig
> @@ -11,23 +22,39 @@
>
>
> class SmokeTests(TestSuite):
> + """DPDK and infrastructure smoke test suite.
> +
> + The test cases validate the most basic DPDK functionality needed for
> all other test suites.
> + The infrastructure also needs to be tested, as that is also used by
> all other test suites.
> +
> + Attributes:
> + is_blocking: This test suite will block the execution of all
> other test suites
> + in the build target after it.
> + nics_in_node: The NICs present on the SUT node.
> + """
> +
> is_blocking = True
> # dicts in this list are expected to have two keys:
> # "pci_address" and "current_driver"
> nics_in_node: list[PortConfig] = []
>
> def set_up_suite(self) -> None:
> - """
> + """Set up the test suite.
> +
> Setup:
> - Set the build directory path and generate a list of NICs in
> the SUT node.
> + Set the build directory path and a list of NICs in the SUT
> node.
> """
> self.dpdk_build_dir_path = self.sut_node.remote_dpdk_build_dir
> self.nics_in_node = self.sut_node.config.ports
>
> def test_unit_tests(self) -> None:
> - """
> + """DPDK meson ``fast-tests`` unit tests.
> +
> + Test that all unit test from the ``fast-tests`` suite pass.
> + The suite is a subset with only the most basic tests.
> +
> Test:
> - Run the fast-test unit-test suite through meson.
> + Run the ``fast-tests`` unit test suite through meson.
> """
> self.sut_node.main_session.send_command(
> f"meson test -C {self.dpdk_build_dir_path} --suite fast-tests
> -t 60",
> @@ -37,9 +64,14 @@ def test_unit_tests(self) -> None:
> )
>
> def test_driver_tests(self) -> None:
> - """
> + """DPDK meson ``driver-tests`` unit tests.
> +
> + Test that all unit test from the ``driver-tests`` suite pass.
> + The suite is a subset with driver tests. This suite may be run
> with virtual devices
> + configured in the test run configuration.
> +
> Test:
> - Run the driver-test unit-test suite through meson.
> + Run the ``driver-tests`` unit test suite through meson.
> """
> vdev_args = ""
> for dev in self.sut_node.virtual_devices:
> @@ -60,9 +92,12 @@ def test_driver_tests(self) -> None:
> )
>
> def test_devices_listed_in_testpmd(self) -> None:
> - """
> + """Testpmd device discovery.
> +
> + Test that the devices configured in the test run configuration
> are found in testpmd.
> +
> Test:
> - Uses testpmd driver to verify that devices have been found by
> testpmd.
> + List all devices found in testpmd and verify the configured
> devices are among them.
> """
> testpmd_driver =
> self.sut_node.create_interactive_shell(TestPmdShell, privileged=True)
> dev_list = [str(x) for x in testpmd_driver.get_devices()]
> @@ -74,10 +109,14 @@ def test_devices_listed_in_testpmd(self) -> None:
> )
>
> def test_device_bound_to_driver(self) -> None:
> - """
> + """Device driver in OS.
> +
> + Test that the devices configured in the test run configuration
> are bound to
> + the proper driver.
> +
> Test:
> - Ensure that all drivers listed in the config are bound to the
> correct
> - driver.
> + List all devices with the ``dpdk-devbind.py`` script and
> verify that
> + the configured devices are bound to the proper driver.
> """
> path_to_devbind = self.sut_node.path_to_devbind_script
>
> --
> 2.34.1
>
>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v9 00/21] dts: docstrings update
2023-12-04 10:24 ` [PATCH v9 " Juraj Linkeš
` (20 preceding siblings ...)
2023-12-04 10:24 ` [PATCH v9 21/21] dts: test suites " Juraj Linkeš
@ 2023-12-21 11:48 ` Thomas Monjalon
21 siblings, 0 replies; 393+ messages in thread
From: Thomas Monjalon @ 2023-12-21 11:48 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
yoan.picchi, Luca.Vizzarro, dev
04/12/2023 11:24, Juraj Linkeš:
> The first commit makes changes to the code. These code changes mainly
> change the structure of the code so that the actual API docs generation
> works. There are also some code changes which get reflected in the
> documentation, such as making functions/methods/attributes private or
> public.
>
> The rest of the commits deal with the actual docstring documentation
> (from which the API docs are generated). The format of the docstrings
> is the Google format [0] with PEP257 [1] and some guidelines captured
> in the last commit of this group covering what the Google format
> doesn't.
> The docstring updates are split into many commits to make review
> possible. When accepted, they may be squashed.
> The docstrings have been composed in anticipation of [2], adhering to a
> maximum line length of 100. We don't have a tool for automatic docstring
> formatting, hence the usage of 100 right away to save time.
Applied and squashed with a few minor improvements.
As discussed with Juraj privately, the copyright date bumps are removed,
as they are unnecessary.
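For reference, a docstring in the Google format with PEP257 conventions, as adopted by the squashed series, looks roughly like this (a hypothetical function for illustration, not taken from the patches):

```python
def filter_lcores(lcores: list[int], count: int) -> list[int]:
    """Return the first ``count`` logical cores.

    A hypothetical example of the Google docstring format: a one-line
    summary, a blank line, then sections such as Args and Returns.

    Args:
        lcores: The logical cores to filter.
        count: How many cores to keep.

    Returns:
        The first ``count`` elements of ``lcores``.
    """
    return lcores[:count]
```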
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v1 0/2] dts: api docs generation
2023-11-08 12:53 ` [PATCH v6 23/23] dts: add doc generation Juraj Linkeš
2023-11-15 13:09 ` [PATCH v7 00/21] dts: docstrings update Juraj Linkeš
@ 2023-11-15 13:36 ` Juraj Linkeš
2023-11-15 13:36 ` [PATCH v1 1/2] dts: add doc generation dependencies Juraj Linkeš
` (19 more replies)
1 sibling, 20 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:36 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified configuration (the sidebar: unlimited depth and better
collapsing - I'd welcome comments on this).
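The kind of sidebar tweak mentioned above might look like this in Sphinx's conf.py (a hypothetical fragment; the option names are the standard sphinx-rtd-theme html_theme_options keys, assumed here for illustration):

```python
# Hypothetical conf.py fragment; these html_theme_options keys come from
# sphinx-rtd-theme and are assumed here, not taken from the patch.
html_theme_options = {
    "navigation_depth": -1,        # -1 means unlimited sidebar depth
    "collapse_navigation": False,  # keep sibling entries expanded
}
```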
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
This patchset depends on the series which updates the DTS docstrings. The
technical reason for this dependency is the hash value of the
poetry.lock file, which wouldn't match the contents were the patches to
be applied individually (the hash value would need to be recomputed
after applying the second patch).
The logical reason is that there's basically no point in generating the
documentation with the configuration in this patch series, which is
tailored to the Google format introduced in the depended-on series.
Without that series, the generation would produce confusing errors and
incomplete, poor-looking docs.
This patch series is much less important than the one updating the
docstrings. The docstrings series must be finished and applied as soon
as possible, as it has a dramatic impact on future development, while
this series doesn't hamper other development in any way.
Depends-on: series-30302 ("dts: docstrings update")
Juraj Linkeš (2):
dts: add doc generation dependencies
dts: add doc generation
buildtools/call-sphinx-build.py | 29 +-
doc/api/meson.build | 1 +
doc/guides/conf.py | 34 ++-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 32 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/index.rst | 17 ++
dts/doc/meson.build | 60 ++++
dts/meson.build | 16 +
dts/poetry.lock | 499 +++++++++++++++++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
12 files changed, 681 insertions(+), 17 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v1 1/2] dts: add doc generation dependencies
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
@ 2023-11-15 13:36 ` Juraj Linkeš
2023-11-15 13:36 ` [PATCH v1 2/2] dts: add doc generation Juraj Linkeš
` (18 subsequent siblings)
19 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:36 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
Sphinx imports every Python module when generating documentation from
docstrings, meaning all dts dependencies, including the Python version,
must be satisfied.
By adding Sphinx to the dts dependencies, we make sure that the proper
Python version and dependencies are used when Sphinx is executed.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 499 ++++++++++++++++++++++++++++++++++++++++++++-
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 1 deletion(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index a734fa71f0..8b27b0d751 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -404,6 +651,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -515,6 +773,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -632,6 +904,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -753,6 +1046,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -775,6 +1084,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "6.2.1"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
+ {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.2"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+ {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
+ {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -819,6 +1299,23 @@ files = [
{file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -837,4 +1334,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
+content-hash = "44c0fd4ebd7a1630ad104d55a25ef85d361abc968157254e757d44e340bca06f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 3943c87c87..98df431b3b 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<7"
+sphinx-rtd-theme = "^1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
* [PATCH v1 2/2] dts: add doc generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
2023-11-15 13:36 ` [PATCH v1 1/2] dts: add doc generation dependencies Juraj Linkeš
@ 2023-11-15 13:36 ` Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 0/3] dts: API docs generation Juraj Linkeš
` (17 subsequent siblings)
19 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-15 13:36 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi
Cc: dev, Juraj Linkeš
The tool used to generate developer docs is Sphinx, which is already
used in DPDK. The same configuration is used to preserve style, but it's
been augmented with doc-generating configuration and a change to how the
sidebar displays the content hierarchy.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* The same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
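For illustration, a minimal Google-format docstring that sphinx.ext.napoleon can render might look like the sketch below. The function and all names in it are hypothetical; only the docstring layout matters.

```python
def bind_driver(port_id: int, driver: str = "vfio-pci") -> bool:
    """Bind a port to a kernel driver.

    A hypothetical function used only to show the Google docstring
    sections (Args, Returns) that sphinx.ext.napoleon parses.

    Args:
        port_id: Numerical ID of the port to bind.
        driver: Name of the driver to bind the port to.

    Returns:
        :data:`True` if the bind succeeded, :data:`False` otherwise.
    """
    # Illustrative body only; a real implementation would shell out
    # to dpdk-devbind.py or write to sysfs.
    return port_id >= 0


print(bind_driver(0))  # True
```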
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 29 ++++++++++------
doc/api/meson.build | 1 +
doc/guides/conf.py | 34 ++++++++++++++++---
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 32 +++++++++++++++++-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/index.rst | 17 ++++++++++
dts/doc/meson.build | 60 +++++++++++++++++++++++++++++++++
dts/meson.build | 16 +++++++++
meson.build | 1 +
10 files changed, 176 insertions(+), 16 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..c2f3acfb1d 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,46 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run, PIPE, STDOUT
from packaging.version import Version
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default='.')
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+os.environ['DTS_ROOT'] = args.dts_root
# for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
if Version(ver) >= Version('1.7'):
sphinx_cmd += ['-j', 'auto']
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
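The parse_known_args() pattern introduced in the hunk above can be sketched as follows (the argument values are made up): the options the script knows about are consumed, while anything unrecognized is collected and forwarded to sphinx-build untouched.

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('sphinx')
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
parser.add_argument('--dts-root', default='.')

# Known arguments are parsed into the namespace; unrecognized ones
# (here -W) come back in a separate list for pass-through.
args, extra_args = parser.parse_known_args(
    ['sphinx-build', '23.11', 'doc/guides', 'build/doc',
     '--dts-root', 'dts', '-W'])
print(args.dts_root)   # dts
print(extra_args)      # ['-W']
```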
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..92fe10d9e7 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..169b1d24bc 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,31 @@
file=stderr)
pass
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = False
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+# Sidebar config
+html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+}
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +59,8 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+path.append(environ.get('DTS_ROOT'))
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index cd771a428c..77d9434c1c 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -283,7 +283,10 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
warnings when some of the basics are not met.
-The code must be properly documented with docstrings. The style must conform to
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the `API docs build steps <building_api_docs>`_.
+
+Speaking of which, the code must be properly documented with docstrings. The style must conform to
the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style
`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
@@ -408,3 +411,30 @@ There are three tools used in DTS to help with code checking, style and formatti
These three tools are all used in ``devtools/dts-check-format.sh``,
the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..f5dcd553f2
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,17 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's documentation!
+===========================================
+
+.. toctree::
+ :titlesonly:
+ :caption: Contents:
+
+ framework
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..e11ab83843
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,60 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_api_framework_dir = join_paths(dts_dir, 'framework')
+dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+if meson.version().version_compare('>=0.57.0')
+ dts_api_src = custom_target('dts_api_src',
+ output: 'modules.rst',
+ env: {'SPHINX_APIDOC_OPTIONS': 'members,show-inheritance'},
+ command: [sphinx_apidoc, '--append-syspath', '--force',
+ '--module-first', '--separate', '-V', meson.project_version(),
+ '--output-dir', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
+ dts_api_framework_dir],
+ build_by_default: false)
+else
+ dts_api_src = custom_target('dts_api_src',
+ output: 'modules.rst',
+ command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
+ sphinx_apidoc, '--append-syspath', '--force',
+ '--module-first', '--separate', '-V', meson.project_version(),
+ '--output-dir', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
+ dts_api_framework_dir],
+ build_by_default: false)
+endif
+doc_targets += dts_api_src
+doc_target_names += 'DTS_API_sphinx_sources'
+
+cp = find_program('cp')
+cp_index = custom_target('cp_index',
+ input: ['index.rst', 'conf_yaml_schema.json'],
+ output: 'index.rst',
+ depends: dts_api_src,
+ command: [cp, '--dereference', '@INPUT@', dts_api_build_dir],
+ build_by_default: false)
+doc_targets += cp_index
+doc_target_names += 'DTS_API_sphinx_index'
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ depends: cp_index,
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ dts_api_build_dir, dts_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: false,
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 5e161f43e5..001fdcbbbf 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v2 0/3] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
2023-11-15 13:36 ` [PATCH v1 1/2] dts: add doc generation dependencies Juraj Linkeš
2023-11-15 13:36 ` [PATCH v1 2/2] dts: add doc generation Juraj Linkeš
@ 2024-01-22 12:00 ` Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 1/3] dts: add doc generation dependencies Juraj Linkeš
` (2 more replies)
2024-01-22 16:35 ` [PATCH v3 0/3] dts: API docs generation Juraj Linkeš
` (16 subsequent siblings)
19 siblings, 3 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 12:00 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified sidebar configuration, guarded by an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
The patchset contains the .rst sources from which Sphinx generates the
html pages. These were first generated with the sphinx-apidoc utility
and then modified to provide a better look. The documentation just
doesn't look as good without the modifications, and there aren't enough
configuration options to achieve that without manual changes to the
.rst files. This introduces some extra maintenance: new .rst files must
be added when a new Python module is added, and the .rst structure must
be updated when the Python directory/file structure changes (moved or
renamed files). This small maintenance burden is outweighed by the
flexibility afforded by the ability to make manual changes to the .rst
files.
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
Juraj Linkeš (3):
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 33 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.dts.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 17 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.rst | 30 ++
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 41 ++
dts/doc/meson.build | 27 +
dts/meson.build | 16 +
dts/poetry.lock | 499 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
45 files changed, 950 insertions(+), 20 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.dts.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v2 1/3] dts: add doc generation dependencies
2024-01-22 12:00 ` [PATCH v2 0/3] dts: API docs generation Juraj Linkeš
@ 2024-01-22 12:00 ` Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 2/3] dts: add API doc sources Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 3/3] dts: add API doc generation Juraj Linkeš
2 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 12:00 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including the required Python
version, must be satisfied.
By adding Sphinx to the DTS dependencies, we make sure that the proper
Python version and dependencies are used when Sphinx is executed.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 499 ++++++++++++++++++++++++++++++++++++++++++++-
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 1 deletion(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index a734fa71f0..8b27b0d751 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -404,6 +651,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -515,6 +773,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -632,6 +904,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -753,6 +1046,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -775,6 +1084,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "6.2.1"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
+ {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.2"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+ {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
+ {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -819,6 +1299,23 @@ files = [
{file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -837,4 +1334,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
+content-hash = "44c0fd4ebd7a1630ad104d55a25ef85d361abc968157254e757d44e340bca06f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 37a692d655..28bd970ae4 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<7"
+sphinx-rtd-theme = "^1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v2 2/3] dts: add API doc sources
2024-01-22 12:00 ` [PATCH v2 0/3] dts: API docs generation Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 1/3] dts: add doc generation dependencies Juraj Linkeš
@ 2024-01-22 12:00 ` Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 3/3] dts: add API doc generation Juraj Linkeš
2 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 12:00 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility.
The sources included in this patch were in fact generated by that
utility, then modified by hand to improve the look of the documentation.
The improvements are mainly in the toctree definitions and in the titles
of the modules/packages, and were made with specific Sphinx config
options in mind.
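For reference, a baseline set of per-module .rst sources like the ones in this patch could be produced with an invocation along these lines (a sketch only; the output directory and option choices here are assumptions, not part of this patch):

```shell
# Generate one .rst stub per module under dts/doc/ from the framework package.
# --separate puts each module on its own page; --no-toc skips the
# autogenerated modules.rst, since hand-tuned toctrees are shipped instead.
poetry run sphinx-apidoc --separate --no-toc -o dts/doc dts/framework
```

The generated stubs would then still need the manual title and toctree adjustments described above.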
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.dts.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 17 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.rst | 30 ++++++++++++++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 ++++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 +++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 41 +++++++++++++++++++
33 files changed, 297 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.dts.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..5af915b681
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+types - Configuration Types
+===========================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.dts.rst b/dts/doc/framework.dts.rst
new file mode 100644
index 0000000000..b1de438887
--- /dev/null
+++ b/dts/doc/framework.dts.rst
@@ -0,0 +1,6 @@
+dts - Testbed Setup and Test Suite Runner
+=========================================
+
+.. automodule:: framework.dts
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..74f83f0307
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,17 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.rst b/dts/doc/framework.rst
new file mode 100644
index 0000000000..978d5b5e38
--- /dev/null
+++ b/dts/doc/framework.rst
@@ -0,0 +1,30 @@
+framework - DTS Libraries
+=========================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :maxdepth: 3
+
+ framework.config
+ framework.remote_session
+ framework.testbed_model
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.dts
+ framework.exception
+ framework.logger
+ framework.settings
+ framework.test_result
+ framework.test_suite
+ framework.utils
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - POSIX Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..41206c000b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..fc3b6d78b9
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,41 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.config
+ framework.remote_session
+ framework.testbed_model
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.dts
+ framework.exception
+ framework.logger
+ framework.settings
+ framework.test_result
+ framework.test_suite
+ framework.utils
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v2 3/3] dts: add API doc generation
2024-01-22 12:00 ` [PATCH v2 0/3] dts: API docs generation Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 1/3] dts: add doc generation dependencies Juraj Linkeš
2024-01-22 12:00 ` [PATCH v2 2/3] dts: add API doc sources Juraj Linkeš
@ 2024-01-22 12:00 ` Juraj Linkeš
2 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 12:00 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The tool used to generate developer docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, augmented
with doc-generating configuration. A change that modifies how the
sidebar displays the content hierarchy has been put into an if block so
that it doesn't interfere with the regular docs.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0], which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* The same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
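To illustrate what intersphinx enables (a generic Sphinx usage sketch, not part of the patch), a docstring or .rst source can cross-reference external documentation once the mapping is configured:

```rst
Interactive shells are similar to :class:`python:subprocess.Popen`.

.. The ``python:`` prefix selects the inventory from intersphinx_mapping
   in conf.py; Sphinx turns the role into a link to docs.python.org.
```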
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 33 +++++++++++++++++++---------
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 11 +++++++---
doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
dts/doc/meson.build | 27 +++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++
meson.build | 1 +
10 files changed, 148 insertions(+), 19 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..aea771a64e 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,50 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run, PIPE, STDOUT
from packaging.version import Version
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
# for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
if Version(ver) >= Version('1.7'):
sphinx_cmd += ['-j', 'auto']
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
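The switch from tuple unpacking of sys.argv to argparse hinges on parse_known_args(), which consumes the options the wrapper knows about and returns the rest for forwarding to sphinx-build. A minimal sketch (the argv values below are made up for illustration):

```python
import argparse

# Same options as the patched buildtools/call-sphinx-build.py.
parser = argparse.ArgumentParser()
parser.add_argument('sphinx')
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
parser.add_argument('--dts-root', default=None)

# Hypothetical invocation: '-W' and '-E' are not declared above,
# so parse_known_args() leaves them in extra_args untouched.
args, extra_args = parser.parse_known_args(
    ['sphinx-build', '24.03', 'dts/doc', 'build/doc/api/dts',
     '--dts-root', 'dts', '-W', '-E'])

# Unknown flags are forwarded verbatim, mirroring the patch.
sphinx_cmd = [args.sphinx] + extra_args
```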
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index a6a768bd7c..b49b24acce 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -241,3 +241,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index e94c9e4e46..d53edeba57 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -121,6 +121,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,14 +33,18 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +65,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index b6a3e1e791..49be2dbfa1 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -284,7 +284,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure, the corresponding changes must
+be made to DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -413,3 +418,30 @@ There are three tools used in DTS to help with code checking, style and formatti
These three tools are all used in ``devtools/dts-check-format.sh``,
the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
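As a minimal sketch of the required Google docstring style (the function and all its names are invented for illustration, not taken from DTS):

```python
def filter_by_driver(ports: list[str], driver: str) -> list[str]:
    """Return the ports whose name ends with the given driver.

    A Google-style docstring: sphinx.ext.napoleon turns the sections
    below into the field lists rendered in the generated API docs.

    Args:
        ports: Port names, each suffixed with its driver name.
        driver: The driver name to filter by.

    Returns:
        The subset of ``ports`` ending with ``driver``.
    """
    return [port for port in ports if port.endswith(driver)]
```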
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 5e161f43e5..001fdcbbbf 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v3 0/3] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-01-22 12:00 ` [PATCH v2 0/3] dts: API docs generation Juraj Linkeš
@ 2024-01-22 16:35 ` Juraj Linkeš
2024-01-22 16:35 ` [PATCH v3 1/3] dts: add doc generation dependencies Juraj Linkeš
` (2 more replies)
2024-04-12 10:14 ` [PATCH v4 0/3] dts: API docs generation Juraj Linkeš
` (15 subsequent siblings)
19 siblings, 3 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 16:35 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and then modified to improve their look. The documentation just doesn't
look that good without the modifications, and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance: new .rst files must be added
when a new Python module is added, and the .rst structure must be
changed if the Python directory/file structure changes (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
Juraj Linkeš (3):
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 33 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.dts.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 17 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.rst | 30 ++
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 41 ++
dts/doc/meson.build | 27 +
dts/meson.build | 16 +
dts/poetry.lock | 499 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
45 files changed, 950 insertions(+), 20 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.dts.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v3 1/3] dts: add doc generation dependencies
2024-01-22 16:35 ` [PATCH v3 0/3] dts: API docs generation Juraj Linkeš
@ 2024-01-22 16:35 ` Juraj Linkeš
2024-01-22 16:35 ` [PATCH v3 2/3] dts: add API doc sources Juraj Linkeš
2024-01-22 16:35 ` [PATCH v3 3/3] dts: add API doc generation Juraj Linkeš
2 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 16:35 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including the Python version,
must be satisfied.
By adding Sphinx to the DTS dependencies we make sure that the proper
Python version and dependencies are used when Sphinx is executed.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 499 ++++++++++++++++++++++++++++++++++++++++++++-
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 1 deletion(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index a734fa71f0..8b27b0d751 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -404,6 +651,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -515,6 +773,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -632,6 +904,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -753,6 +1046,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -775,6 +1084,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "6.2.1"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
+ {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.2"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+ {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
+ {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -819,6 +1299,23 @@ files = [
{file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -837,4 +1334,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
+content-hash = "44c0fd4ebd7a1630ad104d55a25ef85d361abc968157254e757d44e340bca06f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 37a692d655..28bd970ae4 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<7"
+sphinx-rtd-theme = "^1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
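
The docs dependency group added above is marked optional, so its packages are only pulled in on request. A sketch of how that plays out with Poetry's documented CLI (command names from Poetry's own interface, not from this patch):

```shell
# From the dts/ directory: install DTS dependencies plus the optional
# docs group ("docs" matches the [tool.poetry.group.docs] table above).
poetry install --with docs

# A plain install skips the docs group entirely, because the group is
# declared with "optional = true" -- Sphinx is not needed for test runs.
poetry install
```

This keeps the default DTS environment lean while letting doc builders opt in.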
--
2.34.1

* [PATCH v3 2/3] dts: add API doc sources
2024-01-22 16:35 ` [PATCH v3 0/3] dts: API docs generation Juraj Linkeš
2024-01-22 16:35 ` [PATCH v3 1/3] dts: add doc generation dependencies Juraj Linkeš
@ 2024-01-22 16:35 ` Juraj Linkeš
2024-01-22 16:35 ` [PATCH v3 3/3] dts: add API doc generation Juraj Linkeš
2 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 16:35 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific config options in mind.
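
For reference, a bootstrap along these lines could have produced the initial stubs (flags taken from sphinx-apidoc's documented interface; the exact invocation used for this patch is not stated, and the results were hand-edited afterwards as described above):

```shell
# Generate one .rst stub per module under dts/framework into dts/doc:
#   --separate   one page per module instead of one page per package
#   --no-toc     skip the auto-generated modules.rst table of contents
#   -o           output directory for the generated .rst files
sphinx-apidoc --separate --no-toc -o dts/doc dts/framework
```

The hand edits then replace the generic "framework.config module" style titles and tune the toctree directives, which sphinx-apidoc offers no options for.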
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.dts.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 17 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.rst | 30 ++++++++++++++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 ++++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 +++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 41 +++++++++++++++++++
33 files changed, 297 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.dts.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..5af915b681
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+types - Configuration Types
+===========================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.dts.rst b/dts/doc/framework.dts.rst
new file mode 100644
index 0000000000..b1de438887
--- /dev/null
+++ b/dts/doc/framework.dts.rst
@@ -0,0 +1,6 @@
+dts - Testbed Setup and Test Suite Runner
+=========================================
+
+.. automodule:: framework.dts
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..74f83f0307
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,17 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.rst b/dts/doc/framework.rst
new file mode 100644
index 0000000000..978d5b5e38
--- /dev/null
+++ b/dts/doc/framework.rst
@@ -0,0 +1,30 @@
+framework - DTS Libraries
+=========================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :maxdepth: 3
+
+ framework.config
+ framework.remote_session
+ framework.testbed_model
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.dts
+ framework.exception
+ framework.logger
+ framework.settings
+ framework.test_result
+ framework.test_suite
+ framework.utils
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..41206c000b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..fc3b6d78b9
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,41 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.config
+ framework.remote_session
+ framework.testbed_model
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.dts
+ framework.exception
+ framework.logger
+ framework.settings
+ framework.test_result
+ framework.test_suite
+ framework.utils
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v3 3/3] dts: add API doc generation
2024-01-22 16:35 ` [PATCH v3 0/3] dts: API docs generation Juraj Linkeš
2024-01-22 16:35 ` [PATCH v3 1/3] dts: add doc generation dependencies Juraj Linkeš
2024-01-22 16:35 ` [PATCH v3 2/3] dts: add API doc sources Juraj Linkeš
@ 2024-01-22 16:35 ` Juraj Linkeš
2024-01-29 17:09 ` Jeremy Spewock
[not found] ` <CAJvnSUCNjo0p-yhROF1MNLKhjiAw2QTyTHO2hpOaVVUn0xnJ0A@mail.gmail.com>
2 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-01-22 16:35 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro
Cc: dev, Juraj Linkeš
The tool used to generate developer docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, but it's
been augmented with doc-generating configuration. One change, which
modifies how the sidebar displays the content hierarchy, is wrapped
in an if block so that it doesn't interfere with the regular docs.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0], which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
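For reference, a docstring in the Google format that napoleon parses could
look like the following sketch; the function itself is a hypothetical
illustration, not DTS code:

```python
def parse_version(output: str) -> str:
    """Extract the version string from a tool's ``--version`` output.

    This is a minimal example of the Google docstring format
    (Args/Returns sections) that sphinx.ext.napoleon converts into
    reStructuredText for the generated API docs.

    Args:
        output: The raw ``--version`` output of a tool.

    Returns:
        The last whitespace-separated token, assumed to be the version.
    """
    return output.split()[-1]
```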
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* Also the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 33 +++++++++++++++++++---------
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 11 +++++++---
doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
dts/doc/meson.build | 27 +++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++
meson.build | 1 +
10 files changed, 148 insertions(+), 19 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..aea771a64e 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,50 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run, PIPE, STDOUT
from packaging.version import Version
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
# for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
if Version(ver) >= Version('1.7'):
sphinx_cmd += ['-j', 'auto']
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index a6a768bd7c..b49b24acce 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -241,3 +241,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index e94c9e4e46..d53edeba57 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -121,6 +121,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,14 +33,18 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python doscstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +65,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 846696e14e..21d3d89fc2 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -278,7 +278,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure, the corresponding changes must
+be made to the DTS API doc sources in ``dts/doc``.
+
+To this end, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -413,6 +418,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 5e161f43e5..001fdcbbbf 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v3 3/3] dts: add API doc generation
2024-01-22 16:35 ` [PATCH v3 3/3] dts: add API doc generation Juraj Linkeš
@ 2024-01-29 17:09 ` Jeremy Spewock
[not found] ` <CAJvnSUCNjo0p-yhROF1MNLKhjiAw2QTyTHO2hpOaVVUn0xnJ0A@mail.gmail.com>
1 sibling, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-01-29 17:09 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, yoan.picchi, Luca.Vizzarro, dev
On Mon, Jan 22, 2024 at 11:35 AM Juraj Linkeš
<juraj.linkes@pantheon.tech> wrote:
>
> The tool used to generate developer docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style, but it's
> been augmented with doc-generating configuration. There's a change that
> modifies how the sidebar displays the content hierarchy that's been put
> into an if block to not interfere with regular docs.
>
> Sphinx generates the documentation from Python docstrings. The docstring
> format is the Google format [0] which requires the sphinx.ext.napoleon
> extension. The other extension, sphinx.ext.intersphinx, enables linking
> to object in external documentations, such as the Python documentation.
>
> There are two requirements for building DTS docs:
> * The same Python version as DTS or higher, because Sphinx imports the
> code.
> * Also the same Python packages as DTS, for the same reason.
>
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
> buildtools/call-sphinx-build.py | 33 +++++++++++++++++++---------
> doc/api/doxy-api-index.md | 3 +++
> doc/api/doxy-api.conf.in | 2 ++
> doc/api/meson.build | 11 +++++++---
> doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
> doc/guides/meson.build | 1 +
> doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
> dts/doc/meson.build | 27 +++++++++++++++++++++++
> dts/meson.build | 16 ++++++++++++++
> meson.build | 1 +
> 10 files changed, 148 insertions(+), 19 deletions(-)
> create mode 100644 dts/doc/meson.build
> create mode 100644 dts/meson.build
>
> diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
> index 39a60d09fa..aea771a64e 100755
> --- a/buildtools/call-sphinx-build.py
> +++ b/buildtools/call-sphinx-build.py
> @@ -3,37 +3,50 @@
> # Copyright(c) 2019 Intel Corporation
> #
>
> +import argparse
> import sys
> import os
> from os.path import join
> from subprocess import run, PIPE, STDOUT
> from packaging.version import Version
>
> -# assign parameters to variables
> -(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
> +parser = argparse.ArgumentParser()
> +parser.add_argument('sphinx')
> +parser.add_argument('version')
> +parser.add_argument('src')
> +parser.add_argument('dst')
> +parser.add_argument('--dts-root', default=None)
> +args, extra_args = parser.parse_known_args()
>
> # set the version in environment for sphinx to pick up
> -os.environ['DPDK_VERSION'] = version
> +os.environ['DPDK_VERSION'] = args.version
> +if args.dts_root:
> + os.environ['DTS_ROOT'] = args.dts_root
>
> # for sphinx version >= 1.7 add parallelism using "-j auto"
> -ver = run([sphinx, '--version'], stdout=PIPE,
> +ver = run([args.sphinx, '--version'], stdout=PIPE,
> stderr=STDOUT).stdout.decode().split()[-1]
> -sphinx_cmd = [sphinx] + extra_args
> +sphinx_cmd = [args.sphinx] + extra_args
> if Version(ver) >= Version('1.7'):
> sphinx_cmd += ['-j', 'auto']
>
> # find all the files sphinx will process so we can write them as dependencies
> srcfiles = []
> -for root, dirs, files in os.walk(src):
> +for root, dirs, files in os.walk(args.src):
> srcfiles.extend([join(root, f) for f in files])
>
> +if not os.path.exists(args.dst):
> + os.makedirs(args.dst)
> +
> # run sphinx, putting the html output in a "html" directory
> -with open(join(dst, 'sphinx_html.out'), 'w') as out:
> - process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
> - stdout=out)
> +with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
> + process = run(
> + sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
> + stdout=out
> + )
>
> # create a gcc format .d file giving all the dependencies of this doc build
> -with open(join(dst, '.html.d'), 'w') as d:
> +with open(join(args.dst, '.html.d'), 'w') as d:
> d.write('html: ' + ' '.join(srcfiles) + '\n')
>
> sys.exit(process.returncode)
> diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
> index a6a768bd7c..b49b24acce 100644
> --- a/doc/api/doxy-api-index.md
> +++ b/doc/api/doxy-api-index.md
> @@ -241,3 +241,6 @@ The public API headers are grouped by topics:
> [experimental APIs](@ref rte_compat.h),
> [ABI versioning](@ref rte_function_versioning.h),
> [version](@ref rte_version.h)
> +
> +- **tests**:
> + [**DTS**](@dts_api_main_page)
> diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
> index e94c9e4e46..d53edeba57 100644
> --- a/doc/api/doxy-api.conf.in
> +++ b/doc/api/doxy-api.conf.in
> @@ -121,6 +121,8 @@ SEARCHENGINE = YES
> SORT_MEMBER_DOCS = NO
> SOURCE_BROWSER = YES
>
> +ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
> +
> EXAMPLE_PATH = @TOPDIR@/examples
> EXAMPLE_PATTERNS = *.c
> EXAMPLE_RECURSIVE = YES
> diff --git a/doc/api/meson.build b/doc/api/meson.build
> index 5b50692df9..ffc75d7b5a 100644
> --- a/doc/api/meson.build
> +++ b/doc/api/meson.build
> @@ -1,6 +1,7 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
>
> +doc_api_build_dir = meson.current_build_dir()
> doxygen = find_program('doxygen', required: get_option('enable_docs'))
>
> if not doxygen.found()
> @@ -32,14 +33,18 @@ example = custom_target('examples.dox',
> # set up common Doxygen configuration
> cdata = configuration_data()
> cdata.set('VERSION', meson.project_version())
> -cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
> -cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
> +cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
> +cdata.set('OUTPUT', doc_api_build_dir)
> cdata.set('TOPDIR', dpdk_source_root)
> -cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
> +cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
> cdata.set('WARN_AS_ERROR', 'NO')
> if get_option('werror')
> cdata.set('WARN_AS_ERROR', 'YES')
> endif
> +# A local reference must be relative to the main index.html page
> +# The path below can't be taken from the DTS meson file as that would
> +# require recursive subdir traversal (doc, dts, then doc again)
> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
>
> # configure HTML Doxygen run
> html_cdata = configuration_data()
> diff --git a/doc/guides/conf.py b/doc/guides/conf.py
> index 0f7ff5282d..b442a1f76c 100644
> --- a/doc/guides/conf.py
> +++ b/doc/guides/conf.py
> @@ -7,10 +7,9 @@
> from sphinx import __version__ as sphinx_version
> from os import listdir
> from os import environ
> -from os.path import basename
> -from os.path import dirname
> +from os.path import basename, dirname
> from os.path import join as path_join
> -from sys import argv, stderr
> +from sys import argv, stderr, path
>
> import configparser
>
> @@ -24,6 +23,37 @@
> file=stderr)
> pass
>
> +# Napoleon enables the Google format of Python docstrings, used in DTS
> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
> +
> +# DTS Python docstring options
> +autodoc_default_options = {
> + 'members': True,
> + 'member-order': 'bysource',
> + 'show-inheritance': True,
> +}
> +autodoc_class_signature = 'separated'
> +autodoc_typehints = 'both'
> +autodoc_typehints_format = 'short'
> +autodoc_typehints_description_target = 'documented'
> +napoleon_numpy_docstring = False
> +napoleon_attr_annotations = True
> +napoleon_preprocess_types = True
> +add_module_names = False
> +toc_object_entries = True
> +toc_object_entries_show_parents = 'hide'
> +intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
> +
> +dts_root = environ.get('DTS_ROOT')
> +if dts_root:
> + path.append(dts_root)
> + # DTS Sidebar config
> + html_theme_options = {
> + 'collapse_navigation': False,
> + 'navigation_depth': -1,
> + }
> +
> stop_on_error = ('-W' in argv)
>
> project = 'Data Plane Development Kit'
> @@ -35,8 +65,7 @@
> html_show_copyright = False
> highlight_language = 'none'
>
> -release = environ.setdefault('DPDK_VERSION', "None")
> -version = release
> +version = environ.setdefault('DPDK_VERSION', "None")
>
> master_doc = 'index'
>
> diff --git a/doc/guides/meson.build b/doc/guides/meson.build
> index 51f81da2e3..8933d75f6b 100644
> --- a/doc/guides/meson.build
> +++ b/doc/guides/meson.build
> @@ -1,6 +1,7 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2018 Intel Corporation
>
> +doc_guides_source_dir = meson.current_source_dir()
> sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
>
> if not sphinx.found()
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> index 846696e14e..21d3d89fc2 100644
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -278,7 +278,12 @@ and try not to divert much from it.
> The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
> when some of the basics are not met.
>
> -The code must be properly documented with docstrings.
> +The API documentation, which is a helpful reference when developing, may be accessed
> +in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
> +When adding new files or modifying the directory structure, the corresponding changes must
> +be made to DTS API doc sources in ``dts/doc``.
> +
> +Speaking of which, the code must be properly documented with docstrings.
> The style must conform to the `Google style
> <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
> See an example of the style `here
> @@ -413,6 +418,33 @@ the DTS code check and format script.
> Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
>
>
> +.. _building_api_docs:
> +
> +Building DTS API docs
> +---------------------
> +
> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
> +
> +.. code-block:: console
> +
> + poetry install --with docs
> + poetry shell
> +
The only thing to note here is that with newer versions of Poetry this will
start to throw warnings because of the way we use Poetry without
a root package. It is just a warning message, so it shouldn't
cause any real problems, but I believe the way we should be handling
it is passing --no-root into poetry install so that Poetry knows not to
install the root package.
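To illustrate, the suggested invocation would be something like the following (assuming a Poetry version that supports dependency groups and the --no-root flag):

```shell
# Install DTS dependencies (including the docs group) without installing
# the dts directory itself as a root package, which avoids the warning:
poetry install --no-root --with docs

# Then enter the virtualenv as before:
poetry shell
```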
>
> +The documentation is built using the standard DPDK build system. After executing the meson command
> +and entering Poetry's shell, build the documentation with:
> +
> +.. code-block:: console
> +
> + ninja -C build dts-doc
> +
> +The output is generated in ``build/doc/api/dts/html``.
> +
> +.. Note::
> +
> + Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
> + the ``devtools/dts-check-format.sh`` script and address any issues it finds.
> +
> +
> Configuration Schema
> --------------------
>
> diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> new file mode 100644
> index 0000000000..01b7b51034
> --- /dev/null
> +++ b/dts/doc/meson.build
> @@ -0,0 +1,27 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +sphinx = find_program('sphinx-build', required: false)
> +sphinx_apidoc = find_program('sphinx-apidoc', required: false)
> +
> +if not sphinx.found() or not sphinx_apidoc.found()
> + subdir_done()
> +endif
> +
> +dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> +
> +extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
> +if get_option('werror')
> + extra_sphinx_args += '-W'
> +endif
> +
> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
> +dts_api_html = custom_target('dts_api_html',
> + output: 'html',
> + command: [sphinx_wrapper, sphinx, meson.project_version(),
> + meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
> + build_by_default: false,
> + install: get_option('enable_docs'),
> + install_dir: htmldir)
> +doc_targets += dts_api_html
> +doc_target_names += 'DTS_API_HTML'
> diff --git a/dts/meson.build b/dts/meson.build
> new file mode 100644
> index 0000000000..e8ce0f06ac
> --- /dev/null
> +++ b/dts/meson.build
> @@ -0,0 +1,16 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +doc_targets = []
> +doc_target_names = []
> +dts_dir = meson.current_source_dir()
> +
> +subdir('doc')
> +
> +if doc_targets.length() == 0
> + message = 'No docs targets found'
> +else
> + message = 'Built docs:'
> +endif
> +run_target('dts-doc', command: [echo, message, doc_target_names],
> + depends: doc_targets)
> diff --git a/meson.build b/meson.build
> index 5e161f43e5..001fdcbbbf 100644
> --- a/meson.build
> +++ b/meson.build
> @@ -87,6 +87,7 @@ subdir('app')
>
> # build docs
> subdir('doc')
> +subdir('dts')
>
> # build any examples explicitly requested - useful for developers - and
> # install any example code into the appropriate install path
> --
> 2.34.1
>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
^ permalink raw reply [flat|nested] 393+ messages in thread
[parent not found: <CAJvnSUCNjo0p-yhROF1MNLKhjiAw2QTyTHO2hpOaVVUn0xnJ0A@mail.gmail.com>]
* Re: [PATCH v3 3/3] dts: add API doc generation
[not found] ` <CAJvnSUCNjo0p-yhROF1MNLKhjiAw2QTyTHO2hpOaVVUn0xnJ0A@mail.gmail.com>
@ 2024-02-29 18:12 ` Nicholas Pratte
0 siblings, 0 replies; 393+ messages in thread
From: Nicholas Pratte @ 2024-02-29 18:12 UTC (permalink / raw)
To: dev, Jeremy Spewock
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
----
The tool used to generate developer docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, but it's
been augmented with doc-generating configuration. There's a change that
modifies how the sidebar displays the content hierarchy that's been put
into an if block to not interfere with regular docs.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* Also the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
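For reviewers less familiar with the Google format, a minimal docstring of the kind Napoleon renders might look like this (an illustrative example, not code from the patch; the function and its parameters are hypothetical):

```python
def wait_for_link(port: int, timeout: float = 5.0) -> bool:
    """Poll a port until its link comes up.

    Args:
        port: The port ID to poll.
        timeout: How long to wait, in seconds.

    Returns:
        True if the link came up within the timeout, False otherwise.
    """
    # Placeholder body for illustration only; real polling logic omitted.
    return timeout > 0
```

Napoleon turns the Args and Returns sections into the same structured field lists that Sphinx produces for reStructuredText docstrings.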
---
buildtools/call-sphinx-build.py | 33 +++++++++++++++++++---------
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 11 +++++++---
doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
dts/doc/meson.build | 27 +++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++
meson.build | 1 +
10 files changed, 148 insertions(+), 19 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v4 0/3] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-01-22 16:35 ` [PATCH v3 0/3] dts: API docs generation Juraj Linkeš
@ 2024-04-12 10:14 ` Juraj Linkeš
2024-04-12 10:14 ` [PATCH v4 1/3] dts: add doc generation dependencies Juraj Linkeš
` (3 more replies)
2024-06-24 13:26 ` [PATCH v5 0/4] " Juraj Linkeš
` (14 subsequent siblings)
19 siblings, 4 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-04-12 10:14 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified sidebar configuration placed in an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and then modified to provide a better look. The documentation just
doesn't look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance, which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (files moved or
renamed). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
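For illustration, each of these per-module sources is essentially a short title plus an autodoc directive; a hypothetical stub along these lines (the exact contents live in the "dts: add API doc sources" patch) would be:

```rst
framework.settings
==================

.. automodule:: framework.settings
   :members:
   :show-inheritance:
```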
We can merge this patch when:
1. We agree on the approach of manually modifying the files. This
approach is, in my opinion, much better than just generating the
.rst files every time.
2. Bruce sends his ack on the meson modifications. I believe we had a
positive reaction to the previous version, but not to this one yet.
3. The link to DTS API docs that was added to doxy-api-index.md is
satisfactory. I think Thomas could check this?
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
Juraj Linkeš (3):
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 33 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 17 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 41 ++
dts/doc/meson.build | 27 +
dts/meson.build | 16 +
dts/poetry.lock | 499 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
44 files changed, 920 insertions(+), 20 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v4 1/3] dts: add doc generation dependencies
2024-04-12 10:14 ` [PATCH v4 0/3] dts: API docs generation Juraj Linkeš
@ 2024-04-12 10:14 ` Juraj Linkeš
2024-05-31 10:42 ` Luca Vizzarro
2024-06-14 14:32 ` Jeremy Spewock
2024-04-12 10:14 ` [PATCH v4 2/3] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
3 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-04-12 10:14 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including the Python version,
must be satisfied.
By adding Sphinx to the DTS dependencies, we provide a convenient way to
generate the DTS API docs that satisfies all of these dependencies.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
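For context, the pyproject.toml change boils down to declaring an optional docs dependency group; a sketch of that shape (illustrative only — the exact group contents and version pins are whatever this patch defines) would be:

```toml
# Illustrative sketch, not the literal hunk from this patch.
[tool.poetry.group.docs]
optional = true

[tool.poetry.group.docs.dependencies]
sphinx = "*"
```

Declaring the group as optional is what makes the `poetry install --with docs` invocation opt-in rather than part of every DTS install.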
---
dts/poetry.lock | 499 ++++++++++++++++++++++++++++++++++++++++++++-
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 1 deletion(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index a734fa71f0..8b27b0d751 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -404,6 +651,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -515,6 +773,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -632,6 +904,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -753,6 +1046,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -775,6 +1084,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "6.2.1"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
+ {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.2"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+ {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
+ {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -819,6 +1299,23 @@ files = [
{file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -837,4 +1334,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
+content-hash = "44c0fd4ebd7a1630ad104d55a25ef85d361abc968157254e757d44e340bca06f"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index a81e46fc07..8eb92b4f11 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<7"
+sphinx-rtd-theme = "^1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v4 1/3] dts: add doc generation dependencies
2024-04-12 10:14 ` [PATCH v4 1/3] dts: add doc generation dependencies Juraj Linkeš
@ 2024-05-31 10:42 ` Luca Vizzarro
2024-06-14 14:32 ` Jeremy Spewock
1 sibling, 0 replies; 393+ messages in thread
From: Luca Vizzarro @ 2024-05-31 10:42 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, npratte
Cc: dev
All looks good to me.
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
* Re: [PATCH v4 1/3] dts: add doc generation dependencies
2024-04-12 10:14 ` [PATCH v4 1/3] dts: add doc generation dependencies Juraj Linkeš
2024-05-31 10:42 ` Luca Vizzarro
@ 2024-06-14 14:32 ` Jeremy Spewock
1 sibling, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-06-14 14:32 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Fri, Apr 12, 2024 at 6:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>
> Sphinx imports every Python module when generating documentation from
> docstrings, meaning all DTS dependencies, including Python version,
> must be satisfied.
> By adding Sphinx to DTS dependencies we provide a convenient way to
> generate the DTS API docs which satisfies all dependencies.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
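As the quoted message says, Sphinx reads docstrings by importing each module, so every import-time dependency must be installed. A rough, illustrative sketch of that mechanism (not DTS code; the helper name is ours):

```python
import importlib
import inspect


def collect_docstrings(module_name: str) -> dict[str, str]:
    # Importing executes the module, so all of its dependencies must be
    # installed -- this mirrors what Sphinx autodoc does under the hood.
    module = importlib.import_module(module_name)
    docs = {}
    for name, obj in inspect.getmembers(module):
        if (inspect.isfunction(obj) or inspect.isclass(obj)) and obj.__doc__:
            docs[name] = inspect.getdoc(obj)
    return docs
```

A missing dependency surfaces here as an ImportError, which is why the docs group pins the same packages as DTS itself.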
* [PATCH v4 2/3] dts: add API doc sources
2024-04-12 10:14 ` [PATCH v4 0/3] dts: API docs generation Juraj Linkeš
2024-04-12 10:14 ` [PATCH v4 1/3] dts: add doc generation dependencies Juraj Linkeš
@ 2024-04-12 10:14 ` Juraj Linkeš
2024-05-31 10:43 ` Luca Vizzarro
2024-06-14 14:32 ` Jeremy Spewock
2024-04-12 10:14 ` [PATCH v4 3/3] dts: add API doc generation Juraj Linkeš
2024-04-29 13:49 ` [PATCH v4 0/3] dts: API docs generation Jeremy Spewock
3 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-04-12 10:14 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as sorting the order of
modules or changing the headers of the modules.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
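Each hand-tuned page in this patch keeps the three-part shape sphinx-apidoc emits: a custom title, a matching underline, and an automodule directive. A hypothetical helper showing that shape (the function is ours, for illustration only):

```python
def automodule_stub(module: str, title: str) -> str:
    # Title, underline of equal length, then the automodule directive
    # with the same options used throughout dts/doc.
    underline = "=" * len(title)
    return (
        f"{title}\n"
        f"{underline}\n"
        "\n"
        f".. automodule:: {module}\n"
        "   :members:\n"
        "   :show-inheritance:\n"
    )
```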
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 17 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 ++++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 +++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 41 +++++++++++++++++++
32 files changed, 267 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..5af915b681
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+types - Configuration Types
+===========================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..74f83f0307
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,17 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..41206c000b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffig Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..501e7204a7
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,41 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* Re: [PATCH v4 2/3] dts: add API doc sources
2024-04-12 10:14 ` [PATCH v4 2/3] dts: add API doc sources Juraj Linkeš
@ 2024-05-31 10:43 ` Luca Vizzarro
2024-06-14 14:32 ` Jeremy Spewock
1 sibling, 0 replies; 393+ messages in thread
From: Luca Vizzarro @ 2024-05-31 10:43 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, npratte
Cc: dev
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
* Re: [PATCH v4 2/3] dts: add API doc sources
2024-04-12 10:14 ` [PATCH v4 2/3] dts: add API doc sources Juraj Linkeš
2024-05-31 10:43 ` Luca Vizzarro
@ 2024-06-14 14:32 ` Jeremy Spewock
1 sibling, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-06-14 14:32 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
I noticed one small typo but otherwise:
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
On Fri, Apr 12, 2024 at 6:14 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>
> These sources could be generated with the sphinx-apidoc utility, but
> that doesn't give us enough flexibility, such as sorting the order of
> modules or changing the headers of the modules.
>
> The sources included in this patch were in fact generated by said
> utility, but modified to improve the look of the documentation. The
> improvements are mainly in toctree definitions and the titles of the
> modules/packages. These were made with specific Sphinx config options in
> mind.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
<snip>
> diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
> new file mode 100644
> index 0000000000..41206c000b
> --- /dev/null
> +++ b/dts/doc/framework.testbed_model.tg_node.rst
> @@ -0,0 +1,6 @@
> +tg\_node - Traffig Generator Node
Typo: Traffic Generator Node
> +=================================
<snip>
> 2.34.1
>
* [PATCH v4 3/3] dts: add API doc generation
2024-04-12 10:14 ` [PATCH v4 0/3] dts: API docs generation Juraj Linkeš
2024-04-12 10:14 ` [PATCH v4 1/3] dts: add doc generation dependencies Juraj Linkeš
2024-04-12 10:14 ` [PATCH v4 2/3] dts: add API doc sources Juraj Linkeš
@ 2024-04-12 10:14 ` Juraj Linkeš
2024-05-31 10:43 ` Luca Vizzarro
2024-04-29 13:49 ` [PATCH v4 0/3] dts: API docs generation Jeremy Spewock
3 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-04-12 10:14 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, with one
DTS-specific option (which leaves the DPDK docs unchanged) that modifies
how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* Also the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
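For reference, a minimal Google-style docstring as napoleon parses it (the function below is illustrative, not taken from DTS):

```python
def connect(host: str, retries: int = 3) -> bool:
    """Open a session to the remote node.

    Args:
        host: The hostname or IP address to connect to.
        retries: How many attempts to make before giving up.

    Returns:
        True if a session was established.
    """
    # Stand-in body so the example is runnable.
    return bool(host) and retries > 0
```

napoleon turns the ``Args:`` and ``Returns:`` sections into the field lists Sphinx renders in the generated API pages.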
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
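The main change in call-sphinx-build.py replaces positional sys.argv unpacking with argparse, so flags not known to the wrapper pass through to sphinx-build. The pattern, simplified:

```python
import argparse


def split_args(argv: list[str]):
    # parse_known_args() separates the wrapper's own arguments from
    # anything meant for sphinx-build itself (e.g. -W, -a).
    parser = argparse.ArgumentParser()
    parser.add_argument('sphinx')
    parser.add_argument('version')
    parser.add_argument('src')
    parser.add_argument('dst')
    parser.add_argument('--dts-root', default=None)
    return parser.parse_known_args(argv)
```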
---
buildtools/call-sphinx-build.py | 33 +++++++++++++++++++---------
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 11 +++++++---
doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
dts/doc/meson.build | 27 +++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++
meson.build | 1 +
10 files changed, 148 insertions(+), 19 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..aea771a64e 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,50 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run, PIPE, STDOUT
from packaging.version import Version
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
# for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
if Version(ver) >= Version('1.7'):
sphinx_cmd += ['-j', 'auto']
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
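The switch from positional `sys.argv` unpacking to `argparse` in the diff above hinges on `parse_known_args()`, which splits recognized options from the remainder so unrecognized flags can be forwarded verbatim to sphinx-build. A minimal standalone sketch of that behavior (the sample argv is illustrative, not a real invocation):

```python
import argparse

# Mirror of the option handling in call-sphinx-build.py above
parser = argparse.ArgumentParser()
parser.add_argument('sphinx')
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
parser.add_argument('--dts-root', default=None)

# parse_known_args() returns (namespace, leftovers); the leftovers
# are exactly the arguments the parser did not recognize
args, extra_args = parser.parse_known_args(
    ['sphinx-build', '24.07', 'dts/doc', 'build/doc/api/dts',
     '--dts-root', 'dts', '-W', '-E'])
```

Here `-W` and `-E` end up in `extra_args` untouched, which is what lets the wrapper pass arbitrary Sphinx flags through without declaring them.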
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index 8c1eb8fafa..d5f823b7f0 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -243,3 +243,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index 27afec8b3b..2e08c6a452 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -123,6 +123,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,14 +33,18 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +65,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
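The `DTS_ROOT` handling added to conf.py above boils down to a small pattern: read an environment variable and, when set, extend `sys.path` so autodoc can import the DTS package. A minimal sketch under that assumption (the function name is ours, not conf.py's):

```python
import os
import sys

def add_dts_to_path(environ=os.environ):
    """Mirror of the conf.py logic: extend sys.path when DTS_ROOT is set."""
    dts_root = environ.get('DTS_ROOT')
    if dts_root and dts_root not in sys.path:
        # Sphinx autodoc imports modules, so the package root must be importable
        sys.path.append(dts_root)
    return dts_root

# Example: simulate the environment call-sphinx-build.py sets up
root = add_dts_to_path({'DTS_ROOT': '/tmp/example-dts'})
```

This is also why the wrapper script exports `DTS_ROOT` only for the DTS build: the DPDK guides build runs the same conf.py, and the sidebar overrides are skipped when the variable is absent.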
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 47b218b2c6..d1c3c2af7a 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -280,7 +280,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure, the corresponding changes must
+be made to DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -415,6 +420,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --no-root --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v4 0/3] dts: API docs generation
2024-04-12 10:14 ` [PATCH v4 0/3] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-04-12 10:14 ` [PATCH v4 3/3] dts: add API doc generation Juraj Linkeš
@ 2024-04-29 13:49 ` Jeremy Spewock
2024-04-29 14:12 ` Patrick Robb
3 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2024-04-29 13:49 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
<snip>
> The patchset contains the .rst sources which Sphinx uses to generate the
> html pages. These were first generated with the sphinx-apidoc utility
> and modified to provide a better look. The documentation just doesn't
> look that good without the modifications and there aren't enough
> configuration options to achieve that without manual changes to the .rst
> files. This introduces extra maintenance which involves adding new .rst
> files when a new Python module is added or changing the .rst structure
> if the Python directory/file structure is changed (moved, renamed
> files). This small maintenance burden is outweighed by the flexibility
> afforded by the ability to make manual changes to the .rst files.
>
> We can merge this patch when:
> 1. We agree on the approach with manually modifying the files.
> This approach is, in my opinion, much better than just generating the
> .rst files every time,
+1 for manually modifying .rst files. The .rst files are very simple,
and I think the added flexibility to change headers or tweak things as
needed is a big benefit over just auto-generating and not having as
much control. Additionally, if it just so happens that the
auto-generated file looks fine and the developer doesn't want to make
changes, they can still just generate it themselves and it fits right
in, so this approach remains compatible with auto-generation regardless.
> 2. Bruce sends his ack on the meson modifications. I believe we had a
> positive reaction on the previous version, but not this one.
> 3. The link to DTS API docs that was added to doxy-api-index.md is
> satisfactory. I think Thomas could check this?
>
<snip>
> --
> 2.34.1
>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v4 0/3] dts: API docs generation
2024-04-29 13:49 ` [PATCH v4 0/3] dts: API docs generation Jeremy Spewock
@ 2024-04-29 14:12 ` Patrick Robb
0 siblings, 0 replies; 393+ messages in thread
From: Patrick Robb @ 2024-04-29 14:12 UTC (permalink / raw)
To: Jeremy Spewock
Cc: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, paul.szczepanek,
Luca.Vizzarro, npratte, dev
On Mon, Apr 29, 2024 at 9:49 AM Jeremy Spewock <jspewock@iol.unh.edu> wrote:
> <snip>
> > The patchset contains the .rst sources which Sphinx uses to generate the
> > html pages. These were first generated with the sphinx-apidoc utility
> > and modified to provide a better look. The documentation just doesn't
> > look that good without the modifications and there aren't enough
> > configuration options to achieve that without manual changes to the .rst
> > files. This introduces extra maintenance which involves adding new .rst
> > files when a new Python module is added or changing the .rst structure
> > if the Python directory/file structure is changed (moved, renamed
> > files). This small maintenance burden is outweighed by the flexibility
> > afforded by the ability to make manual changes to the .rst files.
> >
> > We can merge this patch when:
> > 1. We agree on the approach with manually modifying the files.
> > This approach is, in my opinion, much better than just generating the
> > .rst files every time,
>
> +1 for manually modifying .rst files. The .rst files are very simple,
> and I think the added flexibility to change headers or tweak things as
> needed is a big benefit over just auto-generating and not having as
> much control. Additionally, if it just so happens that the
> auto-generated file looks fine and the developer doesn't want to make
> changes, they can still just generate it themselves and it fits right
> in, so this approach still works with the other regardless.
>
> +1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 0/4] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (4 preceding siblings ...)
2024-04-12 10:14 ` [PATCH v4 0/3] dts: API docs generation Juraj Linkeš
@ 2024-06-24 13:26 ` Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 1/4] dts: update params and parser docstrings Juraj Linkeš
` (3 more replies)
2024-06-24 13:45 ` [PATCH v6 0/4] dts: API docs generation Juraj Linkeš
` (13 subsequent siblings)
19 siblings, 4 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:26 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
v5:
Another rebase, but this time the rebase required adding .rst files
corresponding to newly added Python files, as well as fixing a few
documentation problems in said files.
Juraj Linkeš (4):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 31 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 27 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 16 +
dts/poetry.lock | 510 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
54 files changed, 977 insertions(+), 37 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 1/4] dts: update params and parser docstrings
2024-06-24 13:26 ` [PATCH v5 0/4] " Juraj Linkeš
@ 2024-06-24 13:26 ` Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 2/4] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:26 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
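The `r"""` change above matters because a plain docstring processes backslash escapes before Sphinx ever sees the text (and unknown escapes like `\s` draw a SyntaxWarning on newer Pythons). A small illustration with the valid escape `\n`, where the difference is directly visible (hypothetical functions, not DTS code):

```python
def raw_doc():
    r"""A raw docstring keeps escapes like \n verbatim for Sphinx."""

def cooked_doc():
    """A normal docstring interprets escapes like \n before Sphinx sees them."""
```

With the raw string, `raw_doc.__doc__` contains a literal backslash-n; in `cooked_doc.__doc__` it has already become a real newline, which is how the `:attr:` markup in the original docstring got mangled.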
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 2/4] dts: add doc generation dependencies
2024-06-24 13:26 ` [PATCH v5 0/4] " Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 1/4] dts: update params and parser docstrings Juraj Linkeš
@ 2024-06-24 13:26 ` Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 3/4] dts: add API doc sources Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 4/4] dts: add API doc generation Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:26 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including Python version,
must be satisfied.
By adding Sphinx to DTS dependencies we provide a convenient way to
generate the DTS API docs which satisfies all dependencies.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/poetry.lock | 510 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..b6e27f8f38 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -520,6 +778,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +857,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +864,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +882,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +889,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +909,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1051,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1089,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1304,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1339,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "8b9d9363fa0130186f2d72392de6b9d74696c7b1250a4f346f6264c8c07318a0"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..31c7824204 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 3/4] dts: add API doc sources
2024-06-24 13:26 ` [PATCH v5 0/4] " Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 1/4] dts: update params and parser docstrings Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2024-06-24 13:26 ` Juraj Linkeš
2024-06-24 13:26 ` [PATCH v5 4/4] dts: add API doc generation Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:26 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as controlling the order of
modules or changing their headers.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v5 4/4] dts: add API doc generation
2024-06-24 13:26 ` [PATCH v5 0/4] " Juraj Linkeš
` (2 preceding siblings ...)
2024-06-24 13:26 ` [PATCH v5 3/4] dts: add API doc sources Juraj Linkeš
@ 2024-06-24 13:26 ` Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:26 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, with one
DTS-specific option (so that the DPDK docs remain unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
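As an illustration, a docstring in the Google format that napoleon parses
might look like the sketch below. The function and its parameters are
hypothetical, not part of DTS; the point is the section layout (Args,
Returns, Raises) that napoleon converts to reST before autodoc renders it,
with intersphinx resolving references to external documentation:

```python
def filter_lcores(lcores: list[int], max_count: int) -> list[int]:
    """Limit the list of logical cores to at most ``max_count`` entries.

    The Google format uses indented, named sections instead of reST field
    lists; sphinx.ext.napoleon converts them before autodoc renders the
    page, and sphinx.ext.intersphinx turns cross-references such as
    :class:`ValueError` into links to the Python documentation.

    Args:
        lcores: Logical core IDs to filter.
        max_count: The maximum number of IDs to keep.

    Returns:
        The first ``max_count`` logical core IDs.

    Raises:
        ValueError: If ``max_count`` is negative.
    """
    if max_count < 0:
        raise ValueError("max_count must be non-negative")
    return lcores[:max_count]
```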
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* Also the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 31 ++++++++++++++++++--------
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 11 +++++++---
doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
dts/doc/meson.build | 27 +++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++
meson.build | 1 +
10 files changed, 147 insertions(+), 18 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index da19e950c9..dff8471560 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,31 +3,44 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9283154f8..cc214ede46 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -244,3 +244,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,14 +33,18 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +65,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..bd42025507 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -292,7 +292,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure, the corresponding changes must
+be made to the DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +432,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --no-root --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v6 0/4] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (5 preceding siblings ...)
2024-06-24 13:26 ` [PATCH v5 0/4] " Juraj Linkeš
@ 2024-06-24 13:45 ` Juraj Linkeš
2024-06-24 13:45 ` [PATCH v6 1/4] dts: update params and parser docstrings Juraj Linkeš
` (3 more replies)
2024-06-24 14:25 ` [PATCH v7 0/4] dts: API docs generation Juraj Linkeš
` (12 subsequent siblings)
19 siblings, 4 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:45 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there isn't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
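The hand-maintained .rst sources mentioned above generally follow the shape sphinx-apidoc produces; a minimal sketch of one such file (the section title and chosen options are illustrative, not the exact contents of the patch):

```rst
utils - Various Utilities
=========================

.. automodule:: framework.utils
   :members:
   :show-inheritance:
```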
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
v5:
Another rebase, but this time the rebase needed the addition of .rst
corresponding to newly added files as well as fixing a few documentation
problems in said files.
v6:
Documentation formatting adjustments.
Juraj Linkeš (4):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 31 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 27 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 16 +
dts/poetry.lock | 510 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
54 files changed, 977 insertions(+), 37 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v6 1/4] dts: update params and parser docstrings
2024-06-24 13:45 ` [PATCH v6 0/4] dts: API docs generation Juraj Linkeš
@ 2024-06-24 13:45 ` Juraj Linkeš
2024-06-24 13:45 ` [PATCH v6 2/4] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:45 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* [PATCH v6 2/4] dts: add doc generation dependencies
2024-06-24 13:45 ` [PATCH v6 0/4] dts: API docs generation Juraj Linkeš
2024-06-24 13:45 ` [PATCH v6 1/4] dts: update params and parser docstrings Juraj Linkeš
@ 2024-06-24 13:45 ` Juraj Linkeš
2024-06-24 13:45 ` [PATCH v6 3/4] dts: add API doc sources Juraj Linkeš
2024-06-24 13:46 ` [PATCH v6 4/4] dts: add API doc generation Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:45 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including Python version,
must be satisfied.
By adding Sphinx to the DTS dependencies, we provide a convenient way to
generate the DTS API docs with all dependencies satisfied.
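A sketch of how such an optional Poetry dependency group is typically declared in pyproject.toml; the group name and version constraints here are assumptions, not the exact contents of this patch:

```toml
[tool.poetry.group.docs]
optional = true

[tool.poetry.group.docs.dependencies]
# Installed with: poetry install --with docs
sphinx = "*"
```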
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/poetry.lock | 510 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..b6e27f8f38 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -520,6 +778,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +857,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +864,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +882,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +889,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +909,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1051,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1089,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1304,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1339,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "8b9d9363fa0130186f2d72392de6b9d74696c7b1250a4f346f6264c8c07318a0"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..31c7824204 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v6 3/4] dts: add API doc sources
2024-06-24 13:45 ` [PATCH v6 0/4] dts: API docs generation Juraj Linkeš
2024-06-24 13:45 ` [PATCH v6 1/4] dts: update params and parser docstrings Juraj Linkeš
2024-06-24 13:45 ` [PATCH v6 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2024-06-24 13:45 ` Juraj Linkeš
2024-06-24 13:46 ` [PATCH v6 4/4] dts: add API doc generation Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:45 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as sorting the modules
or changing their headers.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v6 4/4] dts: add API doc generation
2024-06-24 13:45 ` [PATCH v6 0/4] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-06-24 13:45 ` [PATCH v6 3/4] dts: add API doc sources Juraj Linkeš
@ 2024-06-24 13:46 ` Juraj Linkeš
2024-06-24 13:53 ` Bruce Richardson
` (2 more replies)
3 siblings, 3 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 13:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, with one
DTS-specific option (which leaves the DPDK docs unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
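For illustration, a Google-format docstring (as parsed by napoleon) looks like the following. This is a minimal hypothetical example, not code from DTS:

```python
def sum_values(values: list[int], start: int = 0) -> int:
    """Sum the given values.

    Args:
        values: The values to sum.
        start: The initial value of the sum.

    Returns:
        The sum of ``values``, starting from ``start``.
    """
    return sum(values, start)
```

Napoleon converts the ``Args:`` and ``Returns:`` sections into the reStructuredText fields Sphinx renders in the generated docs.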
There are two requirements for building DTS docs:
* the same Python version as DTS (or newer), because Sphinx imports the
code,
* the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 31 ++++++++++++++++++--------
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 11 +++++++---
doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
dts/doc/meson.build | 27 +++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++
meson.build | 1 +
10 files changed, 147 insertions(+), 18 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index da19e950c9..dff8471560 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,31 +3,44 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
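The argparse change above relies on the parse_known_args pattern: the known positionals and options are consumed, and everything unrecognized is forwarded to sphinx-build unchanged. A standalone sketch of that behavior (the argv values here are hypothetical, not taken from the patch):

```python
import argparse

# Same parser shape as call-sphinx-build.py: four positionals plus one option.
parser = argparse.ArgumentParser()
parser.add_argument('sphinx')
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
parser.add_argument('--dts-root', default=None)

# Unrecognized arguments ('-W', '-j', 'auto') end up in extra_args,
# ready to be appended to the sphinx-build command line.
args, extra_args = parser.parse_known_args(
    ['sphinx-build', '24.07', 'doc/src', 'doc/out',
     '--dts-root', 'dts', '-W', '-j', 'auto']
)
```

After parsing, ``args`` holds the named values and ``extra_args`` is ``['-W', '-j', 'auto']``, preserving the original order.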
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9283154f8..cc214ede46 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -244,3 +244,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,14 +33,18 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +65,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..bd42025507 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -292,7 +292,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure, the corresponding changes must
+be made to DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +432,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --no-root --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system. After executing the meson command
+and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings. Also make sure to run
+ the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
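The conf.py change in this patch gates the DTS-specific Sphinx settings on the DTS_ROOT environment variable. The mechanism can be sketched in isolation (the path below is hypothetical, chosen just for illustration):

```python
import os
import sys

# Mirror the conf.py logic: only extend sys.path (and, in the real file,
# the sidebar theme options) when DTS_ROOT is set in the environment.
os.environ["DTS_ROOT"] = "/tmp/dts"  # hypothetical path for illustration

dts_root = os.environ.get("DTS_ROOT")
if dts_root:
    sys.path.append(dts_root)

assert "/tmp/dts" in sys.path
```

Without DTS_ROOT set, the block is a no-op, which is how the regular DPDK guides build stays unaffected.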
* Re: [PATCH v6 4/4] dts: add API doc generation
2024-06-24 13:46 ` [PATCH v6 4/4] dts: add API doc generation Juraj Linkeš
@ 2024-06-24 13:53 ` Bruce Richardson
2024-06-24 14:08 ` Juraj Linkeš
2024-06-24 14:25 ` Thomas Monjalon
2 siblings, 0 replies; 393+ messages in thread
From: Bruce Richardson @ 2024-06-24 13:53 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dev
On Mon, Jun 24, 2024 at 03:46:00PM +0200, Juraj Linkeš wrote:
> The tool used to generate DTS API docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style with one
> DTS-specific configuration (so that the DPDK docs are unchanged) that
> modifies how the sidebar displays the content.
>
> Sphinx generates the documentation from Python docstrings. The docstring
> format is the Google format [0] which requires the sphinx.ext.napoleon
> extension. The other extension, sphinx.ext.intersphinx, enables linking
> to objects in external documentation, such as the Python documentation.
>
> There are two requirements for building DTS docs:
> * The same Python version as DTS or higher, because Sphinx imports the
> code.
> * Also the same Python packages as DTS, for the same reason.
>
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
> Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
> ---
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
* Re: [PATCH v6 4/4] dts: add API doc generation
2024-06-24 13:46 ` [PATCH v6 4/4] dts: add API doc generation Juraj Linkeš
2024-06-24 13:53 ` Bruce Richardson
@ 2024-06-24 14:08 ` Juraj Linkeš
2024-06-24 14:25 ` Thomas Monjalon
2 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 14:08 UTC (permalink / raw)
To: thomas
Cc: dev, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Hi Thomas,
I believe the only open question in this patch set is the linking of DTS
API docs on the main doxygen page. I've left only the parts relevant to
the question so that it's easier for us to address it.
> diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
> index f9283154f8..cc214ede46 100644
> --- a/doc/api/doxy-api-index.md
> +++ b/doc/api/doxy-api-index.md
> @@ -244,3 +244,6 @@ The public API headers are grouped by topics:
> [experimental APIs](@ref rte_compat.h),
> [ABI versioning](@ref rte_function_versioning.h),
> [version](@ref rte_version.h)
> +
> +- **tests**:
> + [**DTS**](@dts_api_main_page)
> diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
> index a8823c046f..c94f02d411 100644
> --- a/doc/api/doxy-api.conf.in
> +++ b/doc/api/doxy-api.conf.in
> @@ -124,6 +124,8 @@ SEARCHENGINE = YES
> SORT_MEMBER_DOCS = NO
> SOURCE_BROWSER = YES
>
> +ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
> +
> EXAMPLE_PATH = @TOPDIR@/examples
> EXAMPLE_PATTERNS = *.c
> EXAMPLE_RECURSIVE = YES
> diff --git a/doc/api/meson.build b/doc/api/meson.build
> index 5b50692df9..ffc75d7b5a 100644
> --- a/doc/api/meson.build
> +++ b/doc/api/meson.build
> @@ -32,14 +33,18 @@ example = custom_target('examples.dox',
> # set up common Doxygen configuration
> cdata = configuration_data()
> cdata.set('VERSION', meson.project_version())
> -cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
> -cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
> +cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
> +cdata.set('OUTPUT', doc_api_build_dir)
> cdata.set('TOPDIR', dpdk_source_root)
> -cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
> +cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
These three changes are here only for context, they're not relevant to
the linking question.
> cdata.set('WARN_AS_ERROR', 'NO')
> if get_option('werror')
> cdata.set('WARN_AS_ERROR', 'YES')
> endif
> +# A local reference must be relative to the main index.html page
> +# The path below can't be taken from the DTS meson file as that would
> +# require recursive subdir traversal (doc, dts, then doc again)
> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
This is where the path is actually set.
* Re: [PATCH v6 4/4] dts: add API doc generation
2024-06-24 13:46 ` [PATCH v6 4/4] dts: add API doc generation Juraj Linkeš
2024-06-24 13:53 ` Bruce Richardson
2024-06-24 14:08 ` Juraj Linkeš
@ 2024-06-24 14:25 ` Thomas Monjalon
2 siblings, 0 replies; 393+ messages in thread
From: Thomas Monjalon @ 2024-06-24 14:25 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev, Luca Vizzarro
24/06/2024 15:46, Juraj Linkeš:
> The tool used to generate DTS API docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style with one
> DTS-specific configuration (so that the DPDK docs are unchanged) that
> modifies how the sidebar displays the content.
>
> Sphinx generates the documentation from Python docstrings. The docstring
> format is the Google format [0] which requires the sphinx.ext.napoleon
> extension. The other extension, sphinx.ext.intersphinx, enables linking
> to objects in external documentation, such as the Python documentation.
>
> There are two requirements for building DTS docs:
> * The same Python version as DTS or higher, because Sphinx imports the
> code.
> * Also the same Python packages as DTS, for the same reason.
>
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
> Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
> Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
> ---
> buildtools/call-sphinx-build.py | 31 ++++++++++++++++++--------
> doc/api/doxy-api-index.md | 3 +++
> doc/api/doxy-api.conf.in | 2 ++
> doc/api/meson.build | 11 +++++++---
> doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
> doc/guides/meson.build | 1 +
> doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
> dts/doc/meson.build | 27 +++++++++++++++++++++++
> dts/meson.build | 16 ++++++++++++++
> meson.build | 1 +
> 10 files changed, 147 insertions(+), 18 deletions(-)
> create mode 100644 dts/doc/meson.build
> create mode 100644 dts/meson.build
There are unrelated changes in this patch.
Please, I would prefer to have the existing code changes in a separate patch,
so we have only the DTS addition in the last patch.
* [PATCH v7 0/4] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (6 preceding siblings ...)
2024-06-24 13:45 ` [PATCH v6 0/4] dts: API docs generation Juraj Linkeš
@ 2024-06-24 14:25 ` Juraj Linkeš
2024-06-24 14:25 ` [PATCH v7 1/4] dts: update params and parser docstrings Juraj Linkeš
` (3 more replies)
2024-07-12 8:57 ` [PATCH v8 0/5] dts: API docs generation Juraj Linkeš
` (11 subsequent siblings)
19 siblings, 4 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 14:25 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
Python3.10 is required to build the DTS API docs.
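Since Sphinx imports the DTS code, the interpreter running the docs build must meet the same version floor. A quick self-check, assuming only the 3.10 requirement stated above:

```python
import sys

# The DTS API docs build imports the DTS code, so it needs the same
# interpreter floor as DTS itself (Python 3.10 per the cover letter).
REQUIRED = (3, 10)
ok = sys.version_info[:2] >= REQUIRED
print(f"Python {sys.version_info.major}.{sys.version_info.minor} new enough: {ok}")
```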
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
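For illustration, one of the generated-then-modified sources typically has the shape below (the title and options here are hypothetical; the real files under dts/doc may differ):

```rst
framework.utils - Utility Functions
===================================

.. automodule:: framework.utils
   :members:
   :show-inheritance:
```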
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
v5:
Another rebase, but this time the rebase needed the addition of .rst files
corresponding to newly added files as well as fixing a few documentation
problems in said files.
v6:
Documentation formatting adjustments.
v7:
Now with the actual doc changes.
Juraj Linkeš (4):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 31 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 27 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 16 +
dts/poetry.lock | 510 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
54 files changed, 977 insertions(+), 37 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v7 1/4] dts: update params and parser docstrings
2024-06-24 14:25 ` [PATCH v7 0/4] dts: API docs generation Juraj Linkeš
@ 2024-06-24 14:25 ` Juraj Linkeš
2024-06-24 15:37 ` Luca Vizzarro
2024-06-24 14:25 ` [PATCH v7 2/4] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
3 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 14:25 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
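The params/__init__.py hunk in the patch above converts the docstring to a raw string so that the backslash in ``\s`` reaches Sphinx intact rather than being read as a stray escape. The effect can be checked directly (the function body is omitted; only the docstring matters here):

```python
def modify_str():
    r"""Class decorator modifying the ``__str__`` method.

    The :attr:`FnPtr`\s fed to the decorator are executed from left to
    right in the arguments list order.
    """

# In a raw string the backslash is kept literally, so the docstring
# Sphinx sees contains the escaped-plural markup backslash + "s".
assert "\\s" in modify_str.__doc__
```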
* [PATCH v7 2/4] dts: add doc generation dependencies
2024-06-24 14:25 ` [PATCH v7 0/4] dts: API docs generation Juraj Linkeš
2024-06-24 14:25 ` [PATCH v7 1/4] dts: update params and parser docstrings Juraj Linkeš
@ 2024-06-24 14:25 ` Juraj Linkeš
2024-06-24 14:25 ` [PATCH v7 3/4] dts: add API doc sources Juraj Linkeš
2024-06-24 14:25 ` [PATCH v7 4/4] dts: add API doc generation Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 14:25 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including the Python version,
must be satisfied.
By adding Sphinx to DTS dependencies we provide a convenient way to
generate the DTS API docs which satisfies all dependencies.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/poetry.lock | 510 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..b6e27f8f38 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -520,6 +778,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +857,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +864,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +882,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +889,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +909,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1051,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1089,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1304,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1339,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "8b9d9363fa0130186f2d72392de6b9d74696c7b1250a4f346f6264c8c07318a0"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..31c7824204 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v7 3/4] dts: add API doc sources
2024-06-24 14:25 ` [PATCH v7 0/4] dts: API docs generation Juraj Linkeš
2024-06-24 14:25 ` [PATCH v7 1/4] dts: update params and parser docstrings Juraj Linkeš
2024-06-24 14:25 ` [PATCH v7 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2024-06-24 14:25 ` Juraj Linkeš
2024-06-24 14:25 ` [PATCH v7 4/4] dts: add API doc generation Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 14:25 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as controlling the order
of modules or changing the headers of the module pages.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v7 4/4] dts: add API doc generation
2024-06-24 14:25 ` [PATCH v7 0/4] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-06-24 14:25 ` [PATCH v7 3/4] dts: add API doc sources Juraj Linkeš
@ 2024-06-24 14:25 ` Juraj Linkeš
3 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-06-24 14:25 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style with one
DTS-specific configuration (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
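The two extensions mentioned above are enabled through the standard Sphinx
configuration keys; a minimal conf.py sketch could look like this (this is an
illustration, not the actual DPDK conf.py — the inventory URL shown is the
standard Python one):

```python
# conf.py sketch -- assumed fragment, not the real DPDK configuration.
# 'extensions' and 'intersphinx_mapping' are standard Sphinx config keys.
extensions = [
    'sphinx.ext.napoleon',     # parse Google-style docstring sections
    'sphinx.ext.intersphinx',  # cross-link to external documentation
]

# Map an external object inventory so references to built-in types
# (e.g. :class:`str`) resolve to the official Python docs.
intersphinx_mapping = {
    'python': ('https://docs.python.org/3', None),
}
```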
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* The same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
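For reference, the Google docstring format that sphinx.ext.napoleon
processes looks roughly like this (the function and its names are
hypothetical and exist only to illustrate the format):

```python
import inspect


def wait_for_port(port: int, timeout: float = 5.0) -> bool:
    """Wait until a port is ready.

    A hypothetical function; only the docstring layout matters here.
    napoleon converts the Args/Returns sections below into reST field
    lists when Sphinx builds the docs.

    Args:
        port: The port number to poll.
        timeout: Seconds to wait before giving up.

    Returns:
        True if the port became ready within the timeout.
    """
    return timeout > 0


# The raw docstring keeps the Google-style section headers.
doc = inspect.getdoc(wait_for_port)
```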
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 31 ++++++++++++++++++--------
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 11 +++++++---
doc/guides/conf.py | 39 ++++++++++++++++++++++++++++-----
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +++++++++++++++++++++++++++-
dts/doc/meson.build | 27 +++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++
meson.build | 1 +
10 files changed, 147 insertions(+), 18 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index da19e950c9..dff8471560 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,31 +3,44 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9283154f8..cc214ede46 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -244,3 +244,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,14 +33,18 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
@@ -35,8 +65,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..77df7a0378 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -292,7 +292,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +432,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --no-root --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system.
+After executing the meson command and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings,
+ and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v8 0/5] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (7 preceding siblings ...)
2024-06-24 14:25 ` [PATCH v7 0/4] dts: API docs generation Juraj Linkeš
@ 2024-07-12 8:57 ` Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-01 9:18 ` [PATCH v9 0/5] dts: API docs generation Juraj Linkeš
` (10 subsequent siblings)
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-07-12 8:57 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified sidebar configuration guarded by an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications, and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
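As a rough sketch (the module name is chosen only for illustration), such a hand-maintained .rst source is little more than an automodule directive:

```rst
framework.logger module
=======================

.. automodule:: framework.logger
   :members:
   :show-inheritance:
```

Most member and ordering options are set globally via ``autodoc_default_options`` in conf.py, so the per-module files stay small.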
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
v5:
Another rebase, but this time the rebase needed the addition of .rst
corresponding to newly added files as well as fixing a few documentation
problems in said files.
v6:
Documentation formatting adjustments.
v7:
Now with the actual doc changes.
v8:
Split the last commit into non-DTS and DTS changes.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
doc: guides and API meson improvements
dts: add API doc generation
buildtools/call-sphinx-build.py | 31 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 27 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 16 +
dts/poetry.lock | 510 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
54 files changed, 977 insertions(+), 37 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v8 1/5] dts: update params and parser docstrings
2024-07-12 8:57 ` [PATCH v8 0/5] dts: API docs generation Juraj Linkeš
@ 2024-07-12 8:57 ` Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 2/5] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-07-12 8:57 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* [PATCH v8 2/5] dts: add doc generation dependencies
2024-07-12 8:57 ` [PATCH v8 0/5] dts: API docs generation Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-07-12 8:57 ` Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 3/5] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-07-12 8:57 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including Python version,
must be satisfied.
By adding Sphinx to the DTS dependencies, we provide a convenient way to
generate the DTS API docs that satisfies all dependencies.
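A sketch of what such an optional Poetry dependency group can look like in ``pyproject.toml`` (the exact packages and version constraints in the patch may differ):

```toml
[tool.poetry.group.docs]
optional = true

[tool.poetry.group.docs.dependencies]
sphinx = "*"
sphinx-rtd-theme = "*"
```

With the group marked optional, ``poetry install --no-root --with docs`` installs it on top of the runtime dependencies only when docs are being built.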
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/poetry.lock | 510 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..b6e27f8f38 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -520,6 +778,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +857,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +864,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +882,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +889,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +909,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1051,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1089,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1304,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1339,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "8b9d9363fa0130186f2d72392de6b9d74696c7b1250a4f346f6264c8c07318a0"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..31c7824204 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v8 3/5] dts: add API doc sources
2024-07-12 8:57 ` [PATCH v8 0/5] dts: API docs generation Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 2/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-07-12 8:57 ` Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 4/5] doc: guides and API meson improvements Juraj Linkeš
2024-07-12 8:57 ` [PATCH v8 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-07-12 8:57 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as control over the order
of the modules or over the headers of the modules.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
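For reference, a first pass over sources like these is typically produced with an invocation along these lines (the exact flags are an assumption, not taken from this patch):

```shell
# Generate one .rst file per module under dts/doc, without the auto toc file.
sphinx-apidoc --separate --no-toc -o dts/doc dts/framework
```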
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v8 4/5] doc: guides and API meson improvements
2024-07-12 8:57 ` [PATCH v8 0/5] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-07-12 8:57 ` [PATCH v8 3/5] dts: add API doc sources Juraj Linkeš
@ 2024-07-12 8:57 ` Juraj Linkeš
2024-07-30 13:28 ` Thomas Monjalon
2024-07-12 8:57 ` [PATCH v8 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-07-12 8:57 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The Sphinx script argument parsing improvement gives us more
flexibility going forward, such as the ability to add non-positional
arguments.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 28 +++++++++++++++++++---------
doc/api/meson.build | 7 ++++---
doc/guides/conf.py | 6 ++----
3 files changed, 25 insertions(+), 16 deletions(-)
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index da19e950c9..693274da4e 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,31 +3,41 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import sys
import os
from os.path import join
from subprocess import run
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
sys.exit(process.returncode)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..b828b1ed66 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,10 +33,10 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..8b440fb2a9 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,8 +7,7 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
from sys import argv, stderr
@@ -35,8 +34,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v8 4/5] doc: guides and API meson improvements
2024-07-12 8:57 ` [PATCH v8 4/5] doc: guides and API meson improvements Juraj Linkeš
@ 2024-07-30 13:28 ` Thomas Monjalon
2024-08-01 10:02 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-07-30 13:28 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev, Luca Vizzarro
12/07/2024 10:57, Juraj Linkeš:
> The Sphinx script argument parsing improvement gives us more
> flexibility going forward, such as the ability to add non-positional
> arguments.
You should describe what is changed and why.
> -release = environ.setdefault('DPDK_VERSION', "None")
> -version = release
> +version = environ.setdefault('DPDK_VERSION', "None")
I'm quite sure "release" was set for a reason.
Did it change over time with recent Sphinx?
* Re: [PATCH v8 4/5] doc: guides and API meson improvements
2024-07-30 13:28 ` Thomas Monjalon
@ 2024-08-01 10:02 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 10:02 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 30. 7. 2024 15:28, Thomas Monjalon wrote:
> 12/07/2024 10:57, Juraj Linkeš:
>> The Sphinx script argument parsing improvement gives us more
>> flexibility going forward, such as the ability to add non-positional
>> arguments.
>
> You should describe what is changed and why.
>
The Sphinx script argument parsing improvement gives us more
flexibility going forward, such as the ability to add non-positional
arguments. What is currently missing is the ability to define an
argument with a default value, which is added here.
The other change just cleans up the code a bit, replacing the path:
dpdk_build_root/doc/api
with
meson.current_build_dir(),
which resolves to the same path, since that meson file is evaluated in
the directory that was previously hardcoded.
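The argparse pattern being described (positional arguments plus a non-positional argument with a default value) can be sketched as follows; the argument names mirror buildtools/call-sphinx-build.py, but the invocation values are illustrative:

```python
import argparse

# Positional arguments, as in buildtools/call-sphinx-build.py.
parser = argparse.ArgumentParser()
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
# Non-positional argument with a default value, the piece added here.
parser.add_argument('--dts-root', default=None)

# parse_known_args() collects unrecognized flags (such as '-W') instead
# of erroring out, so they can be forwarded to sphinx-build.
args, extra_args = parser.parse_known_args(
    ['24.07', 'guides', 'build/doc', '-W'])
print(args.version, args.dts_root, extra_args)
```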
>
>> -release = environ.setdefault('DPDK_VERSION', "None")
>> -version = release
>> +version = environ.setdefault('DPDK_VERSION', "None")
>
> I'm quite sure "release" was set for a reason.
> Did it change over time with recent Sphinx?
>
>
I looked at the docs and it didn't. I didn't realize this was a Sphinx
setting. I'll revert this.
* [PATCH v8 5/5] dts: add API doc generation
2024-07-12 8:57 ` [PATCH v8 0/5] dts: API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-07-12 8:57 ` [PATCH v8 4/5] doc: guides and API meson improvements Juraj Linkeš
@ 2024-07-12 8:57 ` Juraj Linkeš
2024-07-30 13:51 ` Thomas Monjalon
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-07-12 8:57 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style with one
DTS-specific configuration (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentations, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
code.
* The same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
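As a hedged illustration of the Google docstring format that sphinx.ext.napoleon parses, a documented function might look like the following (the function and its names are invented for this example and do not exist in DTS):

```python
def bind_port(port_id: int, driver: str = 'vfio-pci') -> bool:
    """Bind a port to a driver.

    This function is invented for illustration only; it shows the
    Google docstring sections that sphinx.ext.napoleon understands.

    Args:
        port_id: Numeric identifier of the port to bind.
        driver: Name of the kernel driver to bind the port to.

    Returns:
        :data:`True` if the bind succeeded.

    Raises:
        ValueError: If ``port_id`` is negative.
    """
    if port_id < 0:
        raise ValueError('port_id must be non-negative')
    return True
```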
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 3 +++
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 4 ++++
doc/guides/conf.py | 33 +++++++++++++++++++++++++++++++-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 ++++++++++++++++++++++++++++++++-
dts/doc/meson.build | 27 ++++++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++++
meson.build | 1 +
10 files changed, 122 insertions(+), 2 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 693274da4e..dff8471560 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -14,10 +14,13 @@
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
sphinx_cmd = [args.sphinx] + extra_args
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9283154f8..cc214ede46 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -244,3 +244,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index b828b1ed66..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -41,6 +41,10 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 8b440fb2a9..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -9,7 +9,7 @@
from os import environ
from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -23,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..77df7a0378 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -292,7 +292,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS api doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +432,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --no-root --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system.
+After executing the meson command and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings,
+ and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* Re: [PATCH v8 5/5] dts: add API doc generation
2024-07-12 8:57 ` [PATCH v8 5/5] dts: add API doc generation Juraj Linkeš
@ 2024-07-30 13:51 ` Thomas Monjalon
2024-08-01 13:03 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-07-30 13:51 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev, Luca Vizzarro
12/07/2024 10:57, Juraj Linkeš:
> The tool used to generate DTS API docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style with one
> DTS-specific configuration (so that the DPDK docs are unchanged) that
> modifies how the sidebar displays the content.
What is changed in the sidebar?
> --- a/doc/api/doxy-api-index.md
> +++ b/doc/api/doxy-api-index.md
> @@ -244,3 +244,6 @@ The public API headers are grouped by topics:
> [experimental APIs](@ref rte_compat.h),
> [ABI versioning](@ref rte_function_versioning.h),
> [version](@ref rte_version.h)
> +
> +- **tests**:
> + [**DTS**](@dts_api_main_page)
OK looks good
> --- a/doc/api/doxy-api.conf.in
> +++ b/doc/api/doxy-api.conf.in
> @@ -124,6 +124,8 @@ SEARCHENGINE = YES
> SORT_MEMBER_DOCS = NO
> SOURCE_BROWSER = YES
>
> +ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
Why is it needed?
That's the only way to reference it in doxy-api-index.md?
Would be nice to explain in the commit log.
> --- a/doc/api/meson.build
> +++ b/doc/api/meson.build
> +# A local reference must be relative to the main index.html page
> +# The path below can't be taken from the DTS meson file as that would
> +# require recursive subdir traversal (doc, dts, then doc again)
This comment is really obscure.
> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
Oh I think I get it:
- DTS_API_MAIN_PAGE is the Meson variable
- dts_api_main_page is the Doxygen variable
> +# Napoleon enables the Google format of Python docstrings, used in DTS
> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
Close sentences with a dot, it is easier to read.
> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
> +
> +# DTS Python docstring options
> +autodoc_default_options = {
> + 'members': True,
> + 'member-order': 'bysource',
> + 'show-inheritance': True,
> +}
> +autodoc_class_signature = 'separated'
> +autodoc_typehints = 'both'
> +autodoc_typehints_format = 'short'
> +autodoc_typehints_description_target = 'documented'
> +napoleon_numpy_docstring = False
> +napoleon_attr_annotations = True
> +napoleon_preprocess_types = True
> +add_module_names = False
> +toc_object_entries = True
> +toc_object_entries_show_parents = 'hide'
> +intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
> +
> +dts_root = environ.get('DTS_ROOT')
Why does it need to be passed as an environment variable?
Isn't it a fixed absolute path?
> +if dts_root:
> + path.append(dts_root)
> + # DTS Sidebar config
> + html_theme_options = {
> + 'collapse_navigation': False,
> + 'navigation_depth': -1,
> + }
[...]
> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
I don't plan to use Poetry on my machine.
Can we simply describe the dependencies even if the versions are not specified?
> +
> +.. code-block:: console
> +
> + poetry install --no-root --with docs
> + poetry shell
> +
> +The documentation is built using the standard DPDK build system.
> +After executing the meson command and entering Poetry's shell, build the documentation with:
> +
> +.. code-block:: console
> +
> + ninja -C build dts-doc
Don't we rely on the Meson option "enable_docs"?
> +
> +The output is generated in ``build/doc/api/dts/html``.
> +
> +.. Note::
In general the RST expressions are lowercase.
> +
> + Make sure to fix any Sphinx warnings when adding or updating docstrings,
> + and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
It looks like something to write in the contributing guide.
> +++ b/dts/doc/meson.build
> @@ -0,0 +1,27 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +sphinx = find_program('sphinx-build', required: false)
> +sphinx_apidoc = find_program('sphinx-apidoc', required: false)
> +
> +if not sphinx.found() or not sphinx_apidoc.found()
You should include the option "enable_docs" here.
> + subdir_done()
> +endif
* Re: [PATCH v8 5/5] dts: add API doc generation
2024-07-30 13:51 ` Thomas Monjalon
@ 2024-08-01 13:03 ` Juraj Linkeš
2024-08-01 15:07 ` Thomas Monjalon
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 13:03 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 30. 7. 2024 15:51, Thomas Monjalon wrote:
> 12/07/2024 10:57, Juraj Linkeš:
>> The tool used to generate DTS API docs is Sphinx, which is already in
>> use in DPDK. The same configuration is used to preserve style with one
>> DTS-specific configuration (so that the DPDK docs are unchanged) that
>> modifies how the sidebar displays the content.
>
> What is changed in the sidebar?
>
These are the two changes:
html_theme_options = {
'collapse_navigation': False,
'navigation_depth': -1,
}
The first allows you to explore the structure without needing to enter
any specific section - it puts the + at each section so everything is
expandable.
The second just means that each section can be fully expanded (there's
no limit).
>
>> --- a/doc/api/doxy-api-index.md
>> +++ b/doc/api/doxy-api-index.md
>> @@ -244,3 +244,6 @@ The public API headers are grouped by topics:
>> [experimental APIs](@ref rte_compat.h),
>> [ABI versioning](@ref rte_function_versioning.h),
>> [version](@ref rte_version.h)
>> +
>> +- **tests**:
>> + [**DTS**](@dts_api_main_page)
>
> OK looks good
>
>
>> --- a/doc/api/doxy-api.conf.in
>> +++ b/doc/api/doxy-api.conf.in
>> @@ -124,6 +124,8 @@ SEARCHENGINE = YES
>> SORT_MEMBER_DOCS = NO
>> SOURCE_BROWSER = YES
>>
>> +ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
>
> Why is it needed?
> That's the only way to reference it in doxy-api-index.md?
> Would be nice to explain in the commit log.
>
I can add something to the commit log. The questions are answered below,
in your other related comment.
>> --- a/doc/api/meson.build
>> +++ b/doc/api/meson.build
>> +# A local reference must be relative to the main index.html page
>> +# The path below can't be taken from the DTS meson file as that would
>> +# require recursive subdir traversal (doc, dts, then doc again)
>
> This comment is really obscure.
>
I guess it is. I just wanted to explain that there's no way to do this
without spelling out the path this way. At least I didn't find a way.
Should I remove the comment or reword it?
>> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
>
> Oh I think I get it:
> - DTS_API_MAIN_PAGE is the Meson variable
> - dts_api_main_page is the Doxygen variable
>
Yes, this is a way to make it work. Maybe there's something else (I'm
not that familiar with Doxygen), but from what I can tell, there wasn't
a command line option that would set a variable (passing the path from
Meson to Doxygen) and nothing else I found worked.
Is this solution ok? If we want to explore something else, is there
someone with more experience with Doxygen who could help?
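For reference, the mechanism described above chains three files from the posted diffs; the flow, annotated with comments, is roughly:

```
# doc/api/meson.build -- Meson sets the substitution value:
cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))

# doc/api/doxy-api.conf.in -- configure_file() replaces the placeholder,
# defining a Doxygen alias:
ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"

# doc/api/doxy-api-index.md -- Doxygen expands the alias, producing a
# link relative to the main index.html page:
[**DTS**](@dts_api_main_page)
```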
>
>> +# Napoleon enables the Google format of Python docstrings, used in DTS
>> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
>
> Close sentences with a dot, it is easier to read.
>
Ack.
>> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
>> +
>> +# DTS Python docstring options
>> +autodoc_default_options = {
>> + 'members': True,
>> + 'member-order': 'bysource',
>> + 'show-inheritance': True,
>> +}
>> +autodoc_class_signature = 'separated'
>> +autodoc_typehints = 'both'
>> +autodoc_typehints_format = 'short'
>> +autodoc_typehints_description_target = 'documented'
>> +napoleon_numpy_docstring = False
>> +napoleon_attr_annotations = True
>> +napoleon_preprocess_types = True
>> +add_module_names = False
>> +toc_object_entries = True
>> +toc_object_entries_show_parents = 'hide'
>> +intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
>> +
>> +dts_root = environ.get('DTS_ROOT')
>
> Why does it need to be passed as an environment variable?
> Isn't it a fixed absolute path?
>
The path to DTS needs to be passed in some way (and added to sys.path)
so that Sphinx knows where the sources are in order to import them.
Do you want us to not pass the path, but just hardcode it here? I didn't
really think about that, maybe that could work.
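A minimal, self-contained sketch of the mechanism under discussion follows; the directory value is made up, and in the real build the variable is exported by call-sphinx-build.py rather than set in conf.py itself:

```python
from os import environ
from sys import path

# Normally exported by buildtools/call-sphinx-build.py; hardcoded here
# only to make the sketch self-contained.
environ['DTS_ROOT'] = '/tmp/dpdk/dts'

# The conf.py side: pick up the variable and extend sys.path so that
# Sphinx autodoc can import the DTS framework modules.
dts_root = environ.get('DTS_ROOT')
if dts_root:
    path.append(dts_root)
```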
>> +if dts_root:
>> + path.append(dts_root)
>> + # DTS Sidebar config
>> + html_theme_options = {
>> + 'collapse_navigation': False,
>> + 'navigation_depth': -1,
>> + }
>
> [...]
>
>> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
>
> I don't plan to use Poetry on my machine.
> Can we simply describe the dependencies even if the versions are not specified?
>
The reason we don't list the dependencies anywhere is that doing it with
Poetry is much easier (and a bit safer, as Poetry is going to install
tested versions).
But I can add references to the two relevant sections of
dts/pyproject.toml which contain the dependencies with a note that they
can be installed with pip (and I guess that would be another
dependency), but at that point it's not much different from using
Poetry.
>> +
>> +.. code-block:: console
>> +
>> + poetry install --no-root --with docs
>> + poetry shell
>> +
>> +The documentation is built using the standard DPDK build system.
>> +After executing the meson command and entering Poetry's shell, build the documentation with:
>> +
>> +.. code-block:: console
>> +
>> + ninja -C build dts-doc
>
> Don't we rely on the Meson option "enable_docs"?
I had a discussion about this with Bruce, but I can't find it anywhere,
so here's what I remember:
1. We didn't want to tie the dts api doc build to dpdk doc build because
of the dependencies.
2. There's a way to build docs without the enable_docs option (running
ninja with the target), which is what we added for dts. This doesn't tie
the dts api doc build to the dpdk doc build.
3. We had an "enable_dts_docs" Meson option in the past (to keep it
separate from dpdk doc build), but decided to drop it. My memory is hazy
on this, but I think it was, again, because of the additional steps
needed to bring up the dependency (poetry shell) - at that point,
supporting just the ninja build way is sufficient. Bruce may shed more
light on this.
>> +
>> +The output is generated in ``build/doc/api/dts/html``.
>> +
>> +.. Note::
>
> In general the RST expressions are lowercase.
>
Ack.
>> +
>> + Make sure to fix any Sphinx warnings when adding or updating docstrings,
>> + and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
>
> It looks like something to write in the contributing guide.
>
I could add it there, where is the right place? In patches.rst, section
"Checking the Patches"?
>
>> +++ b/dts/doc/meson.build
>> @@ -0,0 +1,27 @@
>> +# SPDX-License-Identifier: BSD-3-Clause
>> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
>> +
>> +sphinx = find_program('sphinx-build', required: false)
>> +sphinx_apidoc = find_program('sphinx-apidoc', required: false)
>> +
>> +if not sphinx.found() or not sphinx_apidoc.found()
>
> You should include the option "enable_docs" here.
>
>> + subdir_done()
>> +endif
>
>
>
* Re: [PATCH v8 5/5] dts: add API doc generation
2024-08-01 13:03 ` Juraj Linkeš
@ 2024-08-01 15:07 ` Thomas Monjalon
2024-08-02 10:48 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-08-01 15:07 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev, bruce.richardson
01/08/2024 15:03, Juraj Linkeš:
> On 30. 7. 2024 15:51, Thomas Monjalon wrote:
> > 12/07/2024 10:57, Juraj Linkeš:
> >> The tool used to generate DTS API docs is Sphinx, which is already in
> >> use in DPDK. The same configuration is used to preserve style with one
> >> DTS-specific configuration (so that the DPDK docs are unchanged) that
> >> modifies how the sidebar displays the content.
> >
> > What is changed in the sidebar?
> >
>
> These are the two changes:
> html_theme_options = {
> 'collapse_navigation': False,
> 'navigation_depth': -1,
> }
>
> The first allows you to explore the structure without needing to enter
> any specific section - it puts the + at each section so everything is
> expandable.
> The second just means that each section can be fully expanded (there's
> no limit).
OK interesting, you may add a comment # unlimited depth
> >> +# A local reference must be relative to the main index.html page
> >> +# The path below can't be taken from the DTS meson file as that would
> >> +# require recursive subdir traversal (doc, dts, then doc again)
> >
> > This comment is really obscure.
>
>> I guess it is. I just wanted to explain that there's no way to do this
> without spelling out the path this way. At least I didn't find a way.
> Should I remove the comment or reword it?
May be removed I think.
> >> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
> >
> > Oh I think I get it:
> > - DTS_API_MAIN_PAGE is the Meson variable
> > - dts_api_main_page is the Doxygen variable
> >
>
> Yes, this is a way to make it work. Maybe there's something else (I'm
> not that familiar with Doxygen), but from what I can tell, there wasn't
> a command line option that would set a variable (passing the path from
> Meson to Doxygen) and nothing else I found worked.
>
> Is this solution ok? If we want to explore something else, is there
> someone with more experience with Doxygen who could help?
Yes it's OK like that.
> >> +dts_root = environ.get('DTS_ROOT')
> >
> > Why does it need to be passed as an environment variable?
> > Isn't it a fixed absolute path?
>
> The path to DTS needs to be passed in some way (and added to sys.path)
> so that Sphinx knows where the sources are in order to import them.
>
> Do you want us to not pass the path, but just hardcode it here? I didn't
> really think about that, maybe that could work.
I think hardcode is better here.
> >> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
> >
> > I don't plan to use Poetry on my machine.
> > Can we simply describe the dependencies even if the versions are not specified?
>
> The reason we don't list the dependencies anywhere is that doing it with
> Poetry is much easier (and a bit safer, as Poetry is going to install
> tested versions).
>
> But I can add references to the two relevant sections of
> dts/pyproject.toml which contain the dependencies with a note that they
> can be installed with pip (and I guess that would be another
> dependency), but at that point it's that not much different than using
> Poetry.
I want to use my system package manager.
I am from this old school thinking we should have a single package manager in a system.
> >> +.. code-block:: console
> >> +
> >> + poetry install --no-root --with docs
> >> + poetry shell
> >> +
> >> +The documentation is built using the standard DPDK build system.
> >> +After executing the meson command and entering Poetry's shell, build the documentation with:
> >> +
> >> +.. code-block:: console
> >> +
> >> + ninja -C build dts-doc
> >
> > Don't we rely on the Meson option "enable_docs"?
>
> I had a discussion about this with Bruce, but I can't find it anywhere,
> so here's what I remember:
> 1. We didn't want to tie the dts api doc build to dpdk doc build because
> of the dependencies.
Sure
But we could just skip if dependencies are not met?
> 2. There's a way to build docs without the enable_docs option (running
> ninja with the target), which is what we added for dts. This doesn't tie
> the dts api doc build to the dpdk doc build.
Yes
> 3. We had an "enable_dts_docs" Meson option in the past (to keep it
> separate from dpdk doc build), but decided to drop it. My memory is hazy
> on this, but I think it was, again, because of the additional steps
> needed to bring up the dependency (poetry shell) - at that point,
> supporting just the ninja build way is sufficient. Bruce may shed more
> light on this.
> >> + Make sure to fix any Sphinx warnings when adding or updating docstrings,
> >> + and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
> >
> > It looks like something to write in the contributing guide.
> >
>
> I could add it there, where is the right place? In patches.rst, section
> "Checking the Patches"?
Yes
* Re: [PATCH v8 5/5] dts: add API doc generation
2024-08-01 15:07 ` Thomas Monjalon
@ 2024-08-02 10:48 ` Juraj Linkeš
2024-08-02 13:53 ` Thomas Monjalon
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-02 10:48 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 1. 8. 2024 17:07, Thomas Monjalon wrote:
> 01/08/2024 15:03, Juraj Linkeš:
>> On 30. 7. 2024 15:51, Thomas Monjalon wrote:
>>> 12/07/2024 10:57, Juraj Linkeš:
>>>> The tool used to generate DTS API docs is Sphinx, which is already in
>>>> use in DPDK. The same configuration is used to preserve style with one
>>>> DTS-specific configuration (so that the DPDK docs are unchanged) that
>>>> modifies how the sidebar displays the content.
>>>
>>> What is changed in the sidebar?
>>>
>>
>> These are the two changes:
>> html_theme_options = {
>> 'collapse_navigation': False,
>> 'navigation_depth': -1,
>> }
>>
>> The first allows you to explore the structure without needing to enter
>> any specific section - it puts the + at each section so everything is
>> expandable.
>> The second just means that each section can be fully expanded (there's
>> no limit).
>
> OK interesting, you may add a comment # unlimited depth
>
>
Ack.
>>>> +# A local reference must be relative to the main index.html page
>>>> +# The path below can't be taken from the DTS meson file as that would
>>>> +# require recursive subdir traversal (doc, dts, then doc again)
>>>
>>> This comment is really obscure.
>>
>> I guess it is. I just wanted to explain that there's no way to do this
>> without spelling out the path this way. At least I didn't find a way.
>> Should I remove the comment or reword it?
>
> May be removed I think.
>
Ok, I'll remove it.
>>>> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
>>>
>>> Oh I think I get it:
>>> - DTS_API_MAIN_PAGE is the Meson variable
>>> - dts_api_main_page is the Doxygen variable
>>>
>>
>> Yes, this is a way to make it work. Maybe there's something else (I'm
>> not that familiar with Doxygen), but from what I can tell, there wasn't
>> a command line option that would set a variable (passing the path from
>> Meson to Doxygen) and nothing else I found worked.
>>
>> Is this solution ok? If we want to explore something else, is there
>> someone with more experience with Doxygen who could help?
>
> Yes it's OK like that.
>
>
Ack.
>>>> +dts_root = environ.get('DTS_ROOT')
>>>
>>> Why does it need to be passed as an environment variable?
>>> Isn't it a fixed absolute path?
>>
>> The path to DTS needs to be passed in some way (and added to sys.path)
>> so that Sphinx knows where the sources are in order to import them.
>>
>> Do you want us to not pass the path, but just hardcode it here? I didn't
>> really think about that, maybe that could work.
>
> I think hardcode is better here.
>
I tried implementing this, but I ran into an issue with this:
dts_root = environ.get('DTS_ROOT')
if dts_root:
path.append(dts_root)
# DTS Sidebar config
html_theme_options = {
'collapse_navigation': False,
'navigation_depth': -1,
}
The sidebar configuration is conditional, so we have to pass something
to indicate dts build. I'll change it so that we look for 'dts' in src
in call-sphinx-build.py (we're in the dts doc directory, indicating dts
build) and set the DTS_BUILD env var which we can use in conf.py. I
didn't find a better way to do this as conf.py doesn't have any
information about the build itself (and no path that conf.py has access
to points to anything dts). Here's how it'll look:
if environ.get('DTS_BUILD'):
    path.append(path_join(dirname(dirname(dirname(__file__))), 'dts'))

    # DTS Sidebar config.
    html_theme_options = {
        'collapse_navigation': False,
        'navigation_depth': -1,  # unlimited depth
    }
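As a rough sketch of the call-sphinx-build.py side of this (the detection logic and the DTS_BUILD variable name are assumptions based on the description above, not the final patch):

```python
# Hypothetical sketch: detect a DTS doc build from the source directory
# passed to call-sphinx-build.py and export DTS_BUILD for conf.py.
import os

def sphinx_env(src_dir: str) -> dict:
    """Build the environment for the sphinx-build subprocess."""
    env = dict(os.environ)
    # The DTS doc sources live under a 'dts' directory; its presence
    # in the source path indicates a DTS build.
    if 'dts' in os.path.normpath(src_dir).split(os.sep):
        env['DTS_BUILD'] = 'y'
    return env
```

conf.py then only has to check environ.get('DTS_BUILD'), as shown in the snippet above.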
>
>>>> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
>>>
>>> I don't plan to use Poetry on my machine.
>>> Can we simply describe the dependencies even if the versions are not specified?
>>
>> The reason we don't list the dependencies anywhere is that doing it with
>> Poetry is much easier (and a bit safer, as Poetry is going to install
>> tested versions).
>>
>> But I can add references to the two relevant sections of
>> dts/pyproject.toml which contain the dependencies with a note that they
>> can be installed with pip (and I guess that would be another
>> dependency), but at that point it's not much different from using
>> Poetry.
>
> I want to use my system package manager.
> I am from this old school thinking we should have a single package manager in a system.
>
I understand and would also prefer that, but it just doesn't work for
Python. Not all packages are available from the package managers, and
Python projects should not use system packages as there are frequently
version mismatches between the system packages and what the project
needs (the APIs could be different as well as behavior; a problem we've
seen with Scapy). Poetry is one of the tools that tries to solve this
well-known Python limitation.
I've done a quick search of what's available in Ubuntu and two packages
aren't available, types-PyYAML (which maybe we could do without, I'll
have to test) and aenum (which is currently needed for the capabilities
patch; if absolutely necessary, maybe I could find a solution without
aenum). But even with this we can't be sure that the system package
versions will work.
>>>> +.. code-block:: console
>>>> +
>>>> + poetry install --no-root --with docs
>>>> + poetry shell
>>>> +
>>>> +The documentation is built using the standard DPDK build system.
>>>> +After executing the meson command and entering Poetry's shell, build the documentation with:
>>>> +
>>>> +.. code-block:: console
>>>> +
>>>> + ninja -C build dts-doc
>>>
>>> Don't we rely on the Meson option "enable_docs"?
>>
>> I had a discussion about this with Bruce, but I can't find it anywhere,
>> so here's what I remember:
>> 1. We didn't want to tie the dts api doc build to dpdk doc build because
>> of the dependencies.
>
> Sure
> But we could just skip if dependencies are not met?
>
Maybe we could add a script that would check the dependencies. I'll see
what I can do.
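A minimal sketch of what such a dependency check could look like (the module names are illustrative; the real list would come from dts/pyproject.toml):

```python
# Hypothetical dependency probe: report which doc-build modules are
# missing so the doc build can be skipped gracefully.
import importlib.util

def missing_modules(required):
    """Return the names in `required` that cannot be imported."""
    return [name for name in required if importlib.util.find_spec(name) is None]

# Illustrative names only, not the definitive DTS doc dependency list.
missing = missing_modules(['sphinx', 'yaml', 'aenum'])
if missing:
    print('Skipping DTS API doc build, missing: ' + ', '.join(missing))
```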
>> 2. There's a way to build docs without the enable_docs option (running
>> ninja with the target), which is what we added for dts. This doesn't tie
>> the dts api doc build to the dpdk doc build.
>
> Yes
>
>> 3. We had an "enable_dts_docs" Meson option in the past (to keep it
>> separate from dpdk doc build), but decided to drop it. My memory is hazy
>> on this, but I think it was, again, because of the additional steps
>> needed to bring up the dependency (poetry shell) - at that point,
>> supporting just the ninja build way is sufficient. Bruce may shed more
>> light on this.
>
>
>>>> + Make sure to fix any Sphinx warnings when adding or updating docstrings,
>>>> + and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
>>>
>>> It looks like something to write in the contributing guide.
>>>
>>
>> I could add it there, where is the right place? In patches.rst, section
>> "Checking the Patches"?
>
> Yes
>
Ack.
>
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v8 5/5] dts: add API doc generation
2024-08-02 10:48 ` Juraj Linkeš
@ 2024-08-02 13:53 ` Thomas Monjalon
2024-08-05 9:04 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-08-02 13:53 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
02/08/2024 12:48, Juraj Linkeš:
> On 1. 8. 2024 17:07, Thomas Monjalon wrote:
> > 01/08/2024 15:03, Juraj Linkeš:
> >> On 30. 7. 2024 15:51, Thomas Monjalon wrote:
> >>> 12/07/2024 10:57, Juraj Linkeš:
> >>>> +dts_root = environ.get('DTS_ROOT')
> >>>
> >>> Why does it need to be passed as an environment variable?
> >>> Isn't it a fixed absolute path?
> >>
> >> The path to DTS needs to be passed in some way (and added to sys.path)
> >> so that Sphinx knows where the sources are in order to import them.
> >>
> >> Do you want us to not pass the path, but just hardcode it here? I didn't
> >> really think about that, maybe that could work.
> >
> > I think hardcode is better here.
>
> I tried implementing this, but I ran into an issue with this:
>
> dts_root = environ.get('DTS_ROOT')
> if dts_root:
>     path.append(dts_root)
>
> # DTS Sidebar config
> html_theme_options = {
>     'collapse_navigation': False,
>     'navigation_depth': -1,
> }
>
> The sidebar configuration is conditional, so we have to pass something
> to indicate dts build. I'll change it so that we look for 'dts' in src
> in call-sphinx-build.py (we're in the dts doc directory, indicating dts
> build) and set the DTS_BUILD env var which we can use in conf.py. I
> didn't find a better way to do this as conf.py doesn't have any
> information about the build itself (and no path that conf.py has access
> to points to anything dts). Here's how it'll look:
>
> if environ.get('DTS_BUILD'):
>     path.append(path_join(dirname(dirname(dirname(__file__))), 'dts'))
>
>     # DTS Sidebar config.
>     html_theme_options = {
>         'collapse_navigation': False,
>         'navigation_depth': -1,  # unlimited depth
>     }
OK
> >>>> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
> >>>
> >>> I don't plan to use Poetry on my machine.
> >>> Can we simply describe the dependencies even if the versions are not specified?
> >>
> >> The reason we don't list the dependencies anywhere is that doing it with
> >> Poetry is much easier (and a bit safer, as Poetry is going to install
> >> tested versions).
> >>
> >> But I can add references to the two relevant sections of
> >> dts/pyproject.toml which contain the dependencies with a note that they
> >> can be installed with pip (and I guess that would be another
> > dependency), but at that point it's not much different from using
> >> Poetry.
> >
> > I want to use my system package manager.
> > I am from this old school thinking we should have a single package manager in a system.
> >
>
> I understand and would also prefer that, but it just doesn't work for
> Python. Not all packages are available from the package managers, and
> Python projects should not use system packages as there are frequently
> version mismatches between the system packages and what the project
> needs (the APIs could be different as well as behavior; a problem we've
> seen with Scapy). Poetry is one of the tools that tries to solve this
> well-known Python limitation.
I fully agree for DTS runtime.
I'm expecting the dependencies are more tolerant for DTS doc.
> I've done a quick search of what's available in Ubuntu and two packages
> aren't available, types-PyYAML (which maybe we could do without, I'll
> have to test) and aenum (which is currently needed for the capabilities
> patch; if absolutely necessary, maybe I could find a solution without
> aenum). But even with this we can't be sure that the system package
> versions will work.
We need them all to generate the documentation?
> >>>> +.. code-block:: console
> >>>> +
> >>>> + poetry install --no-root --with docs
> >>>> + poetry shell
> >>>> +
> >>>> +The documentation is built using the standard DPDK build system.
> >>>> +After executing the meson command and entering Poetry's shell, build the documentation with:
> >>>> +
> >>>> +.. code-block:: console
> >>>> +
> >>>> + ninja -C build dts-doc
> >>>
> >>> Don't we rely on the Meson option "enable_docs"?
> >>
> >> I had a discussion about this with Bruce, but I can't find it anywhere,
> >> so here's what I remember:
> >> 1. We didn't want to tie the dts api doc build to dpdk doc build because
> >> of the dependencies.
> >
> > Sure
> > But we could just skip if dependencies are not met?
>
> Maybe we could add a script that would check the dependencies. I'll see
> what I can do.
OK thanks
* Re: [PATCH v8 5/5] dts: add API doc generation
2024-08-02 13:53 ` Thomas Monjalon
@ 2024-08-05 9:04 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-05 9:04 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 2. 8. 2024 15:53, Thomas Monjalon wrote:
> 02/08/2024 12:48, Juraj Linkeš:
>> On 1. 8. 2024 17:07, Thomas Monjalon wrote:
>>> 01/08/2024 15:03, Juraj Linkeš:
>>>> On 30. 7. 2024 15:51, Thomas Monjalon wrote:
>>>>> 12/07/2024 10:57, Juraj Linkeš:
>>>>>> +dts_root = environ.get('DTS_ROOT')
>>>>>
>>>>> Why does it need to be passed as an environment variable?
>>>>> Isn't it a fixed absolute path?
>>>>
>>>> The path to DTS needs to be passed in some way (and added to sys.path)
>>>> so that Sphinx knows where the sources are in order to import them.
>>>>
>>>> Do you want us to not pass the path, but just hardcode it here? I didn't
>>>> really think about that, maybe that could work.
>>>
>>> I think hardcode is better here.
>>
>> I tried implementing this, but I ran into an issue with this:
>>
>> dts_root = environ.get('DTS_ROOT')
>> if dts_root:
>>     path.append(dts_root)
>>
>> # DTS Sidebar config
>> html_theme_options = {
>>     'collapse_navigation': False,
>>     'navigation_depth': -1,
>> }
>>
>> The sidebar configuration is conditional, so we have to pass something
>> to indicate dts build. I'll change it so that we look for 'dts' in src
>> in call-sphinx-build.py (we're in the dts doc directory, indicating dts
>> build) and set the DTS_BUILD env var which we can use in conf.py. I
>> didn't find a better way to do this as conf.py doesn't have any
>> information about the build itself (and no path that conf.py has access
>> to points to anything dts). Here's how it'll look:
>>
>> if environ.get('DTS_BUILD'):
>>     path.append(path_join(dirname(dirname(dirname(__file__))), 'dts'))
>>
>>     # DTS Sidebar config.
>>     html_theme_options = {
>>         'collapse_navigation': False,
>>         'navigation_depth': -1,  # unlimited depth
>>     }
>
> OK
>
>
>>>>>> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
>>>>>
>>>>> I don't plan to use Poetry on my machine.
>>>>> Can we simply describe the dependencies even if the versions are not specified?
>>>>
>>>> The reason we don't list the dependencies anywhere is that doing it with
>>>> Poetry is much easier (and a bit safer, as Poetry is going to install
>>>> tested versions).
>>>>
>>>> But I can add references to the two relevant sections of
>>>> dts/pyproject.toml which contain the dependencies with a note that they
>>>> can be installed with pip (and I guess that would be another
>>>> dependency), but at that point it's not much different from using
>>>> Poetry.
>>>
>>> I want to use my system package manager.
>>> I am from this old school thinking we should have a single package manager in a system.
>>>
>>
>> I understand and would also prefer that, but it just doesn't work for
>> Python. Not all packages are available from the package managers, and
>> Python projects should not use system packages as there are frequently
>> version mismatches between the system packages and what the project
>> needs (the APIs could be different as well as behavior; a problem we've
>> seen with Scapy). Poetry is one of the tools that tries to solve this
>> well-known Python limitation.
>
> I fully agree for DTS runtime.
> I'm expecting the dependencies are more tolerant for DTS doc.
>
>> I've done a quick search of what's available in Ubuntu and two packages
>> aren't available, types-PyYAML (which maybe we could do without, I'll
>> have to test) and aenum (which is currently needed for the capabilities
>> patch; if absolutely necessary, maybe I could find a solution without
>> aenum). But even with this we can't be sure that the system package
>> versions will work.
>
> We need them all to generate the documentation?
>
We actually may not. The Python docstrings are part of the code (stored
in the __doc__ attribute of everything); Sphinx (more precisely
the autodoc extension [0]) imports all the code to access the docstrings,
and to do that, it needs the dependencies.
However, I found a config option that mocks imports from the specified
modules [1], so what we can do is list the missing modules there (and we
can build without the dependencies). If we do this, we could emit a
warning from Sphinx, although the resulting docs don't seem any
different according to the basic tests I did.
So this means we can do a build with both passing the target and passing
-Denable_docs, provided I won't encounter anything wild. I'll change the
docs accordingly.
[0]
https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#ensuring-the-code-can-be-imported
[1]
https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#confval-autodoc_mock_imports
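In conf.py that would look something like this (a sketch; the mocked module names are placeholders, not a confirmed list):

```python
# Hypothetical conf.py fragment: mock imports that are unavailable at
# doc-build time so autodoc can still import the DTS modules.
autodoc_mock_imports = ['aenum', 'yaml']
```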
>>>>>> +.. code-block:: console
>>>>>> +
>>>>>> + poetry install --no-root --with docs
>>>>>> + poetry shell
>>>>>> +
>>>>>> +The documentation is built using the standard DPDK build system.
>>>>>> +After executing the meson command and entering Poetry's shell, build the documentation with:
>>>>>> +
>>>>>> +.. code-block:: console
>>>>>> +
>>>>>> + ninja -C build dts-doc
>>>>>
>>>>> Don't we rely on the Meson option "enable_docs"?
>>>>
>>>> I had a discussion about this with Bruce, but I can't find it anywhere,
>>>> so here's what I remember:
>>>> 1. We didn't want to tie the dts api doc build to dpdk doc build because
>>>> of the dependencies.
>>>
>>> Sure
>>> But we could just skip if dependencies are not met?
>>
>> Maybe we could add a script that would check the dependencies. I'll see
>> what I can do.
>
> OK thanks
>
>
* [PATCH v9 0/5] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (8 preceding siblings ...)
2024-07-12 8:57 ` [PATCH v8 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-01 9:18 ` Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-01 9:37 ` [PATCH v10 0/5] dts: API docs generation Juraj Linkeš
` (9 subsequent siblings)
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified sidebar configuration placed in an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications, and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
v5:
Another rebase, but this time the rebase needed the addition of .rst
corresponding to newly added files as well as fixing a few documentation
problems in said files.
v6:
Documentation formatting adjustments.
v7:
Now with the actual doc changes.
v8:
Split the last commit into non-DTS and DTS changes.
v9:
Rebase.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
doc: guides and API meson improvements
dts: add API doc generation
buildtools/call-sphinx-build.py | 35 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 27 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 16 +
dts/poetry.lock | 510 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
54 files changed, 979 insertions(+), 39 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v9 1/5] dts: update params and parser docstrings
2024-08-01 9:18 ` [PATCH v9 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-01 9:18 ` Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 2/5] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* [PATCH v9 2/5] dts: add doc generation dependencies
2024-08-01 9:18 ` [PATCH v9 0/5] dts: API docs generation Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-01 9:18 ` Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 3/5] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including Python version,
must be satisfied.
By adding Sphinx to DTS dependencies we provide a convenient way to
generate the DTS API docs which satisfies all dependencies.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/poetry.lock | 510 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..b6e27f8f38 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -520,6 +778,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +857,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +864,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +882,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +889,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +909,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1051,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1089,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1304,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1339,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "8b9d9363fa0130186f2d72392de6b9d74696c7b1250a4f346f6264c8c07318a0"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..31c7824204 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v9 3/5] dts: add API doc sources
2024-08-01 9:18 ` [PATCH v9 0/5] dts: API docs generation Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 2/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-01 9:18 ` Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 4/5] doc: guides and API meson improvements Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility: it offers no control over the
ordering of the modules or over the modules' headers.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
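For context, the baseline files could have been regenerated with an invocation along these lines (flags and paths here are illustrative; the committed sources were then hand-edited as described above):

```shell
# Illustrative only: emit one .rst stub per module into dts/doc,
# without a generated table of contents. The committed files were
# edited afterwards (titles, toctree ordering).
sphinx-apidoc --separate --no-toc -o dts/doc dts/framework
```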
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v9 4/5] doc: guides and API meson improvements
2024-08-01 9:18 ` [PATCH v9 0/5] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-01 9:18 ` [PATCH v9 3/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-01 9:18 ` Juraj Linkeš
2024-08-01 9:18 ` [PATCH v9 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The Sphinx script argument parsing improvement gives us more
flexibility going forward, such as the ability to add non-positional
arguments.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 32 +++++++++++++++++++++-----------
doc/api/meson.build | 7 ++++---
doc/guides/conf.py | 6 ++----
3 files changed, 27 insertions(+), 18 deletions(-)
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..2034160049 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,6 +3,7 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import filecmp
import shutil
import sys
@@ -10,32 +11,41 @@
from os.path import join
from subprocess import run
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
 # find all the files sphinx will process so we can write them as dependencies
 srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
     srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+    os.makedirs(args.dst)
+
 # run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
-    process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
-                  stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+    process = run(
+        sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+        stdout=out
+    )
 # create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
     d.write('html: ' + ' '.join(srcfiles) + '\n')
# copy custom CSS file
css = 'custom.css'
-src_css = join(src, css)
-dst_css = join(dst, 'html', '_static', 'css', css)
+src_css = join(args.src, css)
+dst_css = join(args.dst, 'html', '_static', 'css', css)
if not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css):
os.makedirs(os.path.dirname(dst_css), exist_ok=True)
shutil.copyfile(src_css, dst_css)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..b828b1ed66 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,10 +33,10 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..8b440fb2a9 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,8 +7,7 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
from sys import argv, stderr
@@ -35,8 +34,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
--
2.34.1
* [PATCH v9 5/5] dts: add API doc generation
2024-08-01 9:18 ` [PATCH v9 0/5] dts: API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-01 9:18 ` [PATCH v9 4/5] doc: guides and API meson improvements Juraj Linkeš
@ 2024-08-01 9:18 ` Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The tool used to generate the DTS API docs is Sphinx, which is already
in use in DPDK. The same configuration is used to preserve style, with
one DTS-specific option (applied only when building DTS docs, so that
the DPDK docs are unchanged) that modifies how the sidebar displays the
content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* the same Python version as DTS (or higher), because Sphinx imports
the code,
* the same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
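As a hypothetical illustration (not taken from the DTS code), a Google-format docstring that the napoleon extension can process looks like this; the function name, parameters, and body are invented for the example:

```python
def connect(host: str, port: int = 22, retries: int = 3) -> bool:
    """Open an SSH connection to a node.

    Args:
        host: The hostname or IP address to connect to.
        port: The TCP port of the SSH server.
        retries: How many times to retry before giving up.

    Returns:
        True if the connection was established.

    Raises:
        ConnectionError: If all retries are exhausted.
    """
    # Illustrative body only; a real implementation would use an SSH
    # library. Here we just validate the inputs.
    return retries > 0 and bool(host) and 0 < port < 65536
```

Sphinx renders the ``Args``/``Returns``/``Raises`` sections into structured field lists, which is why the docstring checker enforces this layout.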
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 3 +++
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 4 ++++
doc/guides/conf.py | 33 +++++++++++++++++++++++++++++++-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 ++++++++++++++++++++++++++++++++-
dts/doc/meson.build | 27 ++++++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++++
meson.build | 1 +
10 files changed, 122 insertions(+), 2 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 2034160049..102f496599 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -16,10 +16,13 @@
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
sphinx_cmd = [args.sphinx] + extra_args
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index b828b1ed66..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -41,6 +41,10 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 8b440fb2a9..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -9,7 +9,7 @@
from os import environ
from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -23,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+    'members': True,
+    'member-order': 'bysource',
+    'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+    path.append(dts_root)
+    # DTS Sidebar config
+    html_theme_options = {
+        'collapse_navigation': False,
+        'navigation_depth': -1,
+    }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index f8bbfba9f5..b34b7b8eb0 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..77df7a0378 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -292,7 +292,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS api doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +432,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --no-root --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system.
+After executing the meson command and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings,
+ and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+        output: 'html',
+        command: [sphinx_wrapper, sphinx, meson.project_version(),
+            meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+        build_by_default: false,
+        install: get_option('enable_docs'),
+        install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+        depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v10 0/5] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (9 preceding siblings ...)
2024-08-01 9:18 ` [PATCH v9 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-01 9:37 ` Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-05 13:59 ` [PATCH v11 0/5] dts: API docs generation Juraj Linkeš
` (8 subsequent siblings)
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:37 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
Dependencies are installed using Poetry from the dts directory:
poetry install --with docs
After installing, enter the Poetry shell:
poetry shell
And then run the build:
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation doesn't look
as good without the modifications, and there aren't enough
configuration options to achieve that without manual changes to the
.rst files. This introduces extra maintenance: new .rst files must be
added when a new Python module is added, and the .rst structure must be
updated when the Python directory/file structure changes (moved or
renamed files). This small maintenance burden is outweighed by the
flexibility afforded by the ability to make manual changes to the .rst
files.
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
v5:
Another rebase, but this time the rebase needed the addition of .rst
corresponding to newly added files as well as fixing a few documentation
problems in said files.
v6:
Documentation formatting adjustments.
v7:
Now with the actual doc changes.
v8:
Split the last commit into non-DTS and DTS changes.
v9:
Rebase.
v10:
Fix dts doc generation issue: only copy the custom css file if it exists.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
doc: guides and API meson improvements
dts: add API doc generation
buildtools/call-sphinx-build.py | 37 +-
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 11 +-
doc/guides/conf.py | 39 +-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 27 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 16 +
dts/poetry.lock | 510 +++++++++++++++++-
dts/pyproject.toml | 7 +
meson.build | 1 +
54 files changed, 980 insertions(+), 40 deletions(-)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v10 1/5] dts: update params and parser docstrings
2024-08-01 9:37 ` [PATCH v10 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-01 9:37 ` Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 2/5] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:37 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* [PATCH v10 2/5] dts: add doc generation dependencies
2024-08-01 9:37 ` [PATCH v10 0/5] dts: API docs generation Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-01 9:37 ` Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 3/5] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:37 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including Python version,
must be satisfied.
By adding Sphinx to DTS dependencies we provide a convenient way to
generate the DTS API docs which satisfies all dependencies.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/poetry.lock | 510 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 7 +
2 files changed, 505 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..b6e27f8f38 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -520,6 +778,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +857,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +864,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +882,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +889,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +909,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1051,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1089,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1304,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1339,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "8b9d9363fa0130186f2d72392de6b9d74696c7b1250a4f346f6264c8c07318a0"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..31c7824204 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,13 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
* [PATCH v10 3/5] dts: add API doc sources
2024-08-01 9:37 ` [PATCH v10 0/5] dts: API docs generation Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 2/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-01 9:37 ` Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 4/5] doc: guides and API meson improvements Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:37 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as control over the
ordering of modules or over the headers of the modules.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v10 4/5] doc: guides and API meson improvements
2024-08-01 9:37 ` [PATCH v10 0/5] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-01 9:37 ` [PATCH v10 3/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-01 9:37 ` Juraj Linkeš
2024-08-01 9:37 ` [PATCH v10 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:37 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The Sphinx script argument parsing improvement gives us more
flexibility going forward, such as the ability to add non-positional
arguments.
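As a minimal sketch of the mechanism this patch relies on (the argument
values below are illustrative, not taken from the build), `argparse`'s
`parse_known_args()` consumes the script's own positional arguments and
returns everything it doesn't recognize as a separate list, which can then
be forwarded to sphinx-build unchanged:

```python
import argparse

# Mirror the parser introduced in call-sphinx-build.py: four positional
# arguments, with everything else passed through to sphinx-build.
parser = argparse.ArgumentParser()
parser.add_argument('sphinx')
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')

# parse_known_args() returns (recognized args, leftover args), so Sphinx
# options such as -W or -j need no declaration in this parser at all.
argv = ['sphinx-build', '24.07', 'doc/guides', 'build/html', '-W', '-j', 'auto']
args, extra_args = parser.parse_known_args(argv)

print(args.sphinx)   # the sphinx-build executable to invoke
print(extra_args)    # unrecognized options, forwarded verbatim
```

This is what makes non-positional arguments easy to add later: a new
option only has to be declared in the parser, while anything undeclared
keeps flowing through to Sphinx.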
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 32 +++++++++++++++++++++-----------
doc/api/meson.build | 7 ++++---
doc/guides/conf.py | 6 ++----
3 files changed, 27 insertions(+), 18 deletions(-)
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..2034160049 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,6 +3,7 @@
# Copyright(c) 2019 Intel Corporation
#
+import argparse
import filecmp
import shutil
import sys
@@ -10,32 +11,41 @@
from os.path import join
from subprocess import run
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
# find all the files sphinx will process so we can write them as dependencies
srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(args.dst):
+ os.makedirs(args.dst)
+
# run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
- process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
- stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+ process = run(
+ sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+ stdout=out
+ )
# create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
d.write('html: ' + ' '.join(srcfiles) + '\n')
# copy custom CSS file
css = 'custom.css'
-src_css = join(src, css)
-dst_css = join(dst, 'html', '_static', 'css', css)
+src_css = join(args.src, css)
+dst_css = join(args.dst, 'html', '_static', 'css', css)
if not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css):
os.makedirs(os.path.dirname(dst_css), exist_ok=True)
shutil.copyfile(src_css, dst_css)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..b828b1ed66 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,10 +33,10 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..8b440fb2a9 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,8 +7,7 @@
from sphinx import __version__ as sphinx_version
from os import listdir
from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
from os.path import join as path_join
from sys import argv, stderr
@@ -35,8 +34,7 @@
html_show_copyright = False
highlight_language = 'none'
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+version = environ.setdefault('DPDK_VERSION', "None")
master_doc = 'index'
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v10 5/5] dts: add API doc generation
2024-08-01 9:37 ` [PATCH v10 0/5] dts: API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-01 9:37 ` [PATCH v10 4/5] doc: guides and API meson improvements Juraj Linkeš
@ 2024-08-01 9:37 ` Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-01 9:37 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, with one
DTS-specific option (guarded so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There are two requirements for building DTS docs:
* The same Python version as DTS (or higher), because Sphinx imports the
code.
* The same Python packages as DTS, for the same reason.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
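For reference, a Google-format docstring of the kind napoleon parses looks roughly like this (an illustrative example, not taken from the DTS codebase):

```python
def scale(value: int, factor: int = 2) -> int:
    """Scale a value by a factor.

    Args:
        value: The number to scale.
        factor: The multiplier to apply.

    Returns:
        The scaled value.
    """
    return value * factor
```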
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 5 ++++-
doc/api/doxy-api-index.md | 3 +++
doc/api/doxy-api.conf.in | 2 ++
doc/api/meson.build | 4 ++++
doc/guides/conf.py | 33 +++++++++++++++++++++++++++++++-
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 34 ++++++++++++++++++++++++++++++++-
dts/doc/meson.build | 27 ++++++++++++++++++++++++++
dts/meson.build | 16 ++++++++++++++++
meson.build | 1 +
10 files changed, 123 insertions(+), 3 deletions(-)
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 2034160049..c55d4f6bc9 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -16,10 +16,13 @@
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
+parser.add_argument('--dts-root', default=None)
args, extra_args = parser.parse_known_args()
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = args.version
+if args.dts_root:
+ os.environ['DTS_ROOT'] = args.dts_root
sphinx_cmd = [args.sphinx] + extra_args
@@ -46,7 +49,7 @@
css = 'custom.css'
src_css = join(args.src, css)
dst_css = join(args.dst, 'html', '_static', 'css', css)
-if not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css):
+if os.path.exists(src_css) and (not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css)):
os.makedirs(os.path.dirname(dst_css), exist_ok=True)
shutil.copyfile(src_css, dst_css)
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index b828b1ed66..ffc75d7b5a 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -41,6 +41,10 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+# A local reference must be relative to the main index.html page
+# The path below can't be taken from the DTS meson file as that would
+# require recursive subdir traversal (doc, dts, then doc again)
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 8b440fb2a9..b442a1f76c 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -9,7 +9,7 @@
from os import environ
from os.path import basename, dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -23,6 +23,37 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+dts_root = environ.get('DTS_ROOT')
+if dts_root:
+ path.append(dts_root)
+ # DTS Sidebar config
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1,
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index f8bbfba9f5..b34b7b8eb0 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..77df7a0378 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -292,7 +292,12 @@ and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS api doc sources in ``dts/doc``.
+
+To that end, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +432,33 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+ poetry install --no-root --with docs
+ poetry shell
+
+The documentation is built using the standard DPDK build system.
+After executing the meson command and entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings,
+ and also run the ``devtools/dts-check-format.sh`` script and address any issues it finds.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..01b7b51034
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,27 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: false)
+sphinx_apidoc = find_program('sphinx-apidoc', required: false)
+
+if not sphinx.found() or not sphinx_apidoc.found()
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir, '--dts-root', dts_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: false,
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..e8ce0f06ac
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v11 0/5] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (10 preceding siblings ...)
2024-08-01 9:37 ` [PATCH v10 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-05 13:59 ` Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-06 6:13 ` [PATCH v12 0/5] dts: API docs generation Juraj Linkeš
` (7 subsequent siblings)
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-05 13:59 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
a slightly modified sidebar configuration, applied in an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
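For illustration, a hand-maintained module stub of the kind described above (modeled on typical sphinx-apidoc output; the exact DTS files may differ) is only a few lines of reST:

```rst
framework.utils - Utils Module
==============================

.. automodule:: framework.utils
   :members:
   :show-inheritance:
```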
v2:
Removed the use of sphinx-apidoc from meson in favor of adding the files
generated by it directly to the repository (and modifying them).
v3:
Rebase.
v4:
Rebase.
v5:
Another rebase, but this time the rebase needed the addition of .rst
corresponding to newly added files as well as fixing a few documentation
problems in said files.
v6:
Documentation formatting adjustments.
v7:
Now with the actual doc changes.
v8:
Split the last commit into non-DTS and DTS changes.
v9:
Rebase.
v10:
Fix dts doc generation issue: Only copy the custom css file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need to
be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
doc: meson doc API build dir variable
dts: add API doc generation
buildtools/call-sphinx-build.py | 10 +-
buildtools/get-dts-deps.py | 78 +++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 8 +-
doc/guides/conf.py | 41 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 30 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 15 +
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
meson.build | 1 +
58 files changed, 1071 insertions(+), 25 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v11 1/5] dts: update params and parser docstrings
2024-08-05 13:59 ` [PATCH v11 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-05 13:59 ` Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 2/5] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-05 13:59 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ other_eal_param='--single-file-segments'
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
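The pseudo-code in the TextParser.wrap docstring above corresponds to roughly this Python (an illustrative sketch, not the actual DTS implementation):

```python
from typing import Any, Callable

def wrap(parser_fn: Callable[[str], Any],
         wrapper_fn: Callable[[Any], Any]) -> Callable[[str], Any]:
    """Return a parser that post-processes parser_fn's result with wrapper_fn."""
    def wrapped(text: str):
        intermediate_value = parser_fn(text)
        if intermediate_value is None:
            # Propagate the parse failure without calling the wrapper.
            return None
        return wrapper_fn(intermediate_value)
    return wrapped

# Example: parse a decimal string, then double the parsed value.
parse_and_double = wrap(lambda s: int(s) if s.isdigit() else None,
                        lambda n: n * 2)
```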
* [PATCH v11 2/5] dts: add doc generation dependencies
2024-08-05 13:59 ` [PATCH v11 0/5] dts: API docs generation Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-05 13:59 ` Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 3/5] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-05 13:59 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including the Python version, should be satisfied. This is not
a hard requirement, as imports from dependencies may be mocked in the
autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
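As a sketch, mocking dependencies in a Sphinx conf.py looks like this (the module names below are hypothetical, not the actual DTS dependency list):

```python
# Sphinx conf.py fragment: autodoc substitutes mock objects for these
# modules, so they need not be installed to build the docs.
autodoc_mock_imports = ['paramiko', 'scapy']
```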
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v11 3/5] dts: add API doc sources
2024-08-05 13:59 ` [PATCH v11 0/5] dts: API docs generation Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 2/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-05 13:59 ` Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 4/5] doc: meson doc API build dir variable Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-05 13:59 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as control over the
ordering of modules or over the titles of the module pages.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
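For illustration, since the exact invocation isn't recorded in the patch, a
sphinx-apidoc run along these lines would seed one .rst stub per module before
the hand-editing described above (`demo_pkg` below is a hypothetical stand-in
for dts/framework; the flags are real sphinx-apidoc options):

```shell
# Sketch, assuming sphinx-apidoc (shipped with Sphinx) is installed.
# Build a tiny stand-in package to run against:
tmp=$(mktemp -d)
mkdir -p "$tmp/demo_pkg"
printf '"""Demo package."""\n' > "$tmp/demo_pkg/__init__.py"
printf '"""Demo module."""\n'  > "$tmp/demo_pkg/utils.py"

# --separate: one .rst file per module (like the per-module files above)
# --no-toc:   skip the autogenerated modules.rst, since index.rst is hand-written
sphinx-apidoc --separate --no-toc -o "$tmp/doc" "$tmp/demo_pkg"

# expect demo_pkg.rst and demo_pkg.utils.rst in $tmp/doc
ls "$tmp/doc"
```

The generated stubs contain plain `automodule` directives; the titles and
toctree entries in this patch were then adjusted by hand.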
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v11 4/5] doc: meson doc API build dir variable
2024-08-05 13:59 ` [PATCH v11 0/5] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-05 13:59 ` [PATCH v11 3/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-05 13:59 ` Juraj Linkeš
2024-08-05 13:59 ` [PATCH v11 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-05 13:59 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The three instances of the path 'dpdk_build_root/doc/api' are replaced
with a variable, moving the definition to one place.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
doc/api/meson.build | 7 ++++---
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..b828b1ed66 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,10 +33,10 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v11 5/5] dts: add API doc generation
2024-08-05 13:59 ` [PATCH v11 0/5] dts: API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-05 13:59 ` [PATCH v11 4/5] doc: meson doc API build dir variable Juraj Linkeš
@ 2024-08-05 13:59 ` Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-05 13:59 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve the style, with
one DTS-specific option (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There is one requirement for building the DTS docs - the Python version must
be the same as the one DTS requires, or higher, because Sphinx's autodoc extension imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The missing
packages are taken from the DTS pyproject.toml file.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
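For reference, a Google-style docstring (the format the napoleon extension parses) looks roughly like this; the function and its parameters are hypothetical, purely to illustrate the style:

```python
def attach_port(port_id: int, *, promiscuous: bool = False) -> bool:
    """Attach a testbed port (hypothetical example of the Google docstring style).

    Args:
        port_id: The numeric ID of the port to attach.
        promiscuous: If :data:`True`, enable promiscuous mode on the port.

    Returns:
        :data:`True` if the port was attached successfully.
    """
    return port_id >= 0
```

Sphinx's napoleon extension turns the ``Args:`` and ``Returns:`` sections into structured field lists in the generated HTML.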
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 10 ++-
buildtools/get-dts-deps.py | 78 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 1 +
doc/guides/conf.py | 41 +++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 ++
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +++++++++++-
dts/doc/meson.build | 30 +++++++++
dts/meson.build | 15 +++++
meson.build | 1 +
14 files changed, 225 insertions(+), 3 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..5dd59907cd 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,11 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+conf_src = src
+if src.find('dts') != -1:
+ if '-c' in extra_args:
+ conf_src = extra_args[extra_args.index('-c') + 1]
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
@@ -23,6 +28,9 @@
for root, dirs, files in os.walk(src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(dst):
+ os.makedirs(dst)
+
# run sphinx, putting the html output in a "html" directory
with open(join(dst, 'sphinx_html.out'), 'w') as out:
process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
@@ -34,7 +42,7 @@
# copy custom CSS file
css = 'custom.css'
-src_css = join(src, css)
+src_css = join(conf_src, css)
dst_css = join(dst, 'html', '_static', 'css', css)
if not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css):
os.makedirs(os.path.dirname(dst_css), exist_ok=True)
diff --git a/buildtools/get-dts-deps.py b/buildtools/get-dts-deps.py
new file mode 100755
index 0000000000..7114aeb710
--- /dev/null
+++ b/buildtools/get-dts-deps.py
@@ -0,0 +1,78 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script returns the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
+and returns a list of module names used in an import statement that are missing.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {'invoke': '>=1.3'}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_version_tuple(version_str):
+ return tuple(map(int, version_str.split(".")))
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ deps = {dep: deps_section[dep].strip('"\'') for dep in deps_section}
+ doc_deps_section = cfg['tool.poetry.group.docs.dependencies']
+ doc_deps = {dep: doc_deps_section[dep].strip("\"'") for dep in doc_deps_section}
+
+ return deps | doc_deps
+
+
+def get_missing_imports():
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, req_ver in (req_deps | _EXTRA_DEPS).items():
+ try:
+ req_ver = _get_version_tuple(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = _get_version_tuple(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(req_dep.lower().replace('-', '_'))
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = _get_version_tuple(platform.python_version())
+ req_ver = _get_version_tuple(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..599653bea4 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_deps = py3 + files('get-dts-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index b828b1ed66..b893931b92 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -41,6 +41,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..eab3387874 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +24,45 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS.
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options.
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+if environ.get('DTS_BUILD'):
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-deps').get_missing_imports()
+
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index f8bbfba9f5..b34b7b8eb0 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..bd715f8072 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS api doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..c2df99bbc6
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,30 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_deps).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..6ed3c93fe1
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,15 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v12 0/5] dts: API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (11 preceding siblings ...)
2024-08-05 13:59 ` [PATCH v11 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-06 6:13 ` Juraj Linkeš
2024-08-06 6:13 ` [PATCH v12 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
` (6 subsequent siblings)
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 6:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified configuration of the sidebar, present in an if block.
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> dts-doc
Python3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v10:
Fix dts doc generation issue: Only copy the custom rss file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need to
be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
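The Python version check the script performs can be sketched as follows (a simplified illustration under stated assumptions, not the script itself):

```python
import platform


def version_tuple(version_str: str) -> tuple[int, ...]:
    # "3.10" -> (3, 10); dotted version strings compare numerically as tuples
    return tuple(map(int, version_str.split('.')))


# Strip Poetry comparison characters such as '^>=' from the requirement,
# then compare the running interpreter against it.
required = version_tuple('^3.10'.strip('^<>='))
running = version_tuple(platform.python_version())
python_ok = running >= required
```

Comparing tuples element-by-element avoids the classic string-comparison pitfall where "3.9" would sort after "3.10".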
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: add doc generation dependencies
dts: add API doc sources
doc: meson doc API build dir variable
dts: add API doc generation
buildtools/call-sphinx-build.py | 10 +-
buildtools/get-dts-deps.py | 78 +++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 8 +-
doc/guides/conf.py | 41 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 30 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
dts/meson.build | 15 +
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
meson.build | 1 +
58 files changed, 1071 insertions(+), 25 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v12 1/5] dts: update params and parser docstrings
2024-08-06 6:13 ` [PATCH v12 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-06 6:13 ` Juraj Linkeš
2024-08-06 6:13 ` [PATCH v12 2/5] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 6:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v12 2/5] dts: add doc generation dependencies
2024-08-06 6:13 ` [PATCH v12 0/5] dts: API docs generation Juraj Linkeš
2024-08-06 6:13 ` [PATCH v12 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-06 6:13 ` Juraj Linkeš
2024-08-06 6:13 ` [PATCH v12 3/5] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 6:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including the Python version, should be satisfied. This is not
a hard requirement, as imports from dependencies may be mocked in the
autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
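Detecting which dependencies need to be mocked can be sketched with the standard library (a simplified illustration; the actual script also reads pyproject.toml and checks versions):

```python
import importlib.util


def missing_modules(candidates: list[str]) -> list[str]:
    # A top-level module whose spec cannot be found is not importable
    # and would need to be listed in autodoc_mock_imports.
    return [name for name in candidates if importlib.util.find_spec(name) is None]


# 'json' ships with Python, so only the made-up package name is reported.
mocked = missing_modules(['json', 'surely_not_installed_pkg'])
```

Any name in the returned list can be handed to Sphinx's ``autodoc_mock_imports`` so that importing DTS modules succeeds without the real dependency installed.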
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
* [PATCH v12 3/5] dts: add API doc sources
2024-08-06 6:13 ` [PATCH v12 0/5] dts: API docs generation Juraj Linkeš
2024-08-06 6:13 ` [PATCH v12 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-06 6:13 ` [PATCH v12 2/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-06 6:13 ` Juraj Linkeš
2024-08-06 6:14 ` [PATCH v12 4/5] doc: meson doc API build dir variable Juraj Linkeš
2024-08-06 6:14 ` [PATCH v12 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 6:13 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility; for example, it doesn't let us
sort the modules or change their headers.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v12 4/5] doc: meson doc API build dir variable
2024-08-06 6:13 ` [PATCH v12 0/5] dts: API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-06 6:13 ` [PATCH v12 3/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-06 6:14 ` Juraj Linkeš
2024-08-06 6:14 ` [PATCH v12 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 6:14 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The three instances of the path 'dpdk_build_root/doc/api' are replaced
with a variable, moving the definition to one place.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
doc/api/meson.build | 7 ++++---
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..b828b1ed66 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,10 +33,10 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v12 5/5] dts: add API doc generation
2024-08-06 6:13 ` [PATCH v12 0/5] dts: API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-06 6:14 ` [PATCH v12 4/5] doc: meson doc API build dir variable Juraj Linkeš
@ 2024-08-06 6:14 ` Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 6:14 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, with one
DTS-specific option (guarded so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There is one requirement for building the DTS docs: the same Python version
as DTS requires, or higher, because Sphinx's autodoc extension imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The missing
packages are taken from the DTS pyproject.toml file.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
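The Google docstring format linked above uses indented ``Args:``/``Returns:`` sections, which the sphinx.ext.napoleon extension translates into reST for autodoc. A minimal illustration (the function and its parameters are made up for the example, not taken from the DTS sources):

```python
def connect(host: str, retries: int = 3) -> bool:
    """Open a session to the remote node.

    Args:
        host: Hostname or IP address of the node.
        retries: How many attempts to make before giving up.

    Returns:
        True if a session was established.
    """
    # Placeholder logic so the example is runnable.
    return retries > 0
```

napoleon converts the sections above into the ``:param:``/``:returns:`` fields that Sphinx then renders.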
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 10 ++-
buildtools/get-dts-deps.py | 78 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 1 +
doc/guides/conf.py | 41 +++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 ++
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +++++++++++-
dts/doc/meson.build | 30 +++++++++
dts/meson.build | 15 +++++
meson.build | 1 +
14 files changed, 225 insertions(+), 3 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..5dd59907cd 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,11 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+conf_src = src
+if src.find('dts') != -1:
+ if '-c' in extra_args:
+ conf_src = extra_args[extra_args.index('-c') + 1]
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
@@ -23,6 +28,9 @@
for root, dirs, files in os.walk(src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(dst):
+ os.makedirs(dst)
+
# run sphinx, putting the html output in a "html" directory
with open(join(dst, 'sphinx_html.out'), 'w') as out:
process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
@@ -34,7 +42,7 @@
# copy custom CSS file
css = 'custom.css'
-src_css = join(src, css)
+src_css = join(conf_src, css)
dst_css = join(dst, 'html', '_static', 'css', css)
if not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css):
os.makedirs(os.path.dirname(dst_css), exist_ok=True)
diff --git a/buildtools/get-dts-deps.py b/buildtools/get-dts-deps.py
new file mode 100755
index 0000000000..309b83cb5c
--- /dev/null
+++ b/buildtools/get-dts-deps.py
@@ -0,0 +1,78 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script returns the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
+and returns a list of module names used in an import statement that are missing.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {'invoke': '>=1.3', 'paramiko': '>=2.4'}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_version_tuple(version_str):
+ return tuple(map(int, version_str.split(".")))
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ deps = {dep: deps_section[dep].strip('"\'') for dep in deps_section}
+ doc_deps_section = cfg['tool.poetry.group.docs.dependencies']
+ doc_deps = {dep: doc_deps_section[dep].strip("\"'") for dep in doc_deps_section}
+
+ return deps | doc_deps
+
+
+def get_missing_imports():
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, req_ver in (req_deps | _EXTRA_DEPS).items():
+ try:
+ req_ver = _get_version_tuple(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = _get_version_tuple(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(req_dep.lower().replace('-', '_'))
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = _get_version_tuple(platform.python_version())
+ req_ver = _get_version_tuple(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..599653bea4 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_deps = py3 + files('get-dts-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index b828b1ed66..b893931b92 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -41,6 +41,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..eab3387874 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +24,45 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS.
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options.
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+if environ.get('DTS_BUILD'):
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-deps').get_missing_imports()
+
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index f8bbfba9f5..b34b7b8eb0 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..bd715f8072 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..c2df99bbc6
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,30 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_deps).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..6ed3c93fe1
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,15 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v13 0/6] API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (12 preceding siblings ...)
2024-08-06 6:13 ` [PATCH v12 0/5] dts: API docs generation Juraj Linkeš
@ 2024-08-06 8:46 ` Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 1/6] dts: update params and parser docstrings Juraj Linkeš
` (5 more replies)
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
` (5 subsequent siblings)
19 siblings, 6 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 8:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified sidebar configuration guarded by an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> dts-doc
Python3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v10:
Fix dts doc generation issue: only copy the custom css file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need to
be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
v13:
Fixed build error:
TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
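The Python version check mentioned in the changelog amounts to comparing dotted version strings as integer tuples after stripping comparison characters; a minimal sketch of that idea (function names are illustrative, not the actual buildtools script):

```python
def version_tuple(version_str: str) -> tuple[int, ...]:
    # "^3.10" -> (3, 10); comparison chars like '^<>=' are stripped first
    return tuple(map(int, version_str.strip("^<>=").split(".")))


def meets_requirement(installed: str, required: str) -> bool:
    # Tuple comparison is element-wise, so (3, 11, 2) >= (3, 10) holds
    return version_tuple(installed) >= version_tuple(required)


print(meets_requirement("3.11.2", ">=3.10"))  # True
```

The same tuple comparison also covers the installed-package version checks, since importlib.metadata reports versions as dotted strings.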
Juraj Linkeš (6):
dts: update params and parser docstrings
dts: replace the or operator in third party types
dts: add doc generation dependencies
dts: add API doc sources
doc: meson doc API build dir variable
dts: add API doc generation
buildtools/call-sphinx-build.py | 10 +-
buildtools/get-dts-deps.py | 78 +++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 8 +-
doc/guides/conf.py | 41 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 30 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
.../interactive_remote_session.py | 3 +-
dts/meson.build | 15 +
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
meson.build | 1 +
59 files changed, 1073 insertions(+), 26 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
* [PATCH v13 1/6] dts: update params and parser docstrings
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
@ 2024-08-06 8:46 ` Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 2/6] dts: replace the or operator in third party types Juraj Linkeš
` (4 subsequent siblings)
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 8:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
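For context, the Unpack/TypedDict pattern these docstrings describe can be sketched as follows (a minimal illustration; the field names are made up and not the real DTS definitions):

```python
from typing import TypedDict


class TestPmdParamsDict(TypedDict, total=False):
    """Illustrative subset of testpmd parameters (hypothetical fields)."""

    lcore_list: str
    no_pci: bool


# On Python >= 3.11 the signature would read
# ``def create_testpmd(**kwargs: Unpack[TestPmdParamsDict])``,
# giving type checkers per-key hints on the keyword arguments.
def create_testpmd(**kwargs):
    return dict(kwargs)


params = create_testpmd(lcore_list="0-3", no_pci=True)
```

The TypedDict itself adds no runtime checking; it only feeds static type checkers via Unpack.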
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v13 2/6] dts: replace the or operator in third party types
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 1/6] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-06 8:46 ` Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 3/6] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 8:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
When the DTS dependencies are not installed while building the DTS API
documentation, the or operator produces errors when used with types from
those libraries:
autodoc: failed to import module 'remote_session' from module
'framework'; the following exception was raised:
Traceback (most recent call last):
...
TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
The third-party type here is Transport from the paramiko library.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/interactive_remote_session.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 97194e6af8..4605ee14b4 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -5,6 +5,7 @@
import socket
import traceback
+from typing import Union
from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore[import-untyped]
from paramiko.ssh_exception import ( # type: ignore[import-untyped]
@@ -52,7 +53,7 @@ class InteractiveRemoteSession:
session: SSHClient
_logger: DTSLogger
_node_config: NodeConfiguration
- _transport: Transport | None
+ _transport: Union[Transport, None]
def __init__(self, node_config: NodeConfiguration, logger: DTSLogger) -> None:
"""Connect to the node during initialization.
--
2.34.1
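The failure mode can be reproduced without paramiko at all. A minimal sketch, assuming autodoc's mock behaves like a callable object rather than a real class (the `_MockedTransport` name is hypothetical):

```python
from typing import Union


class _MockedTransport:
    """Hypothetical stand-in for an autodoc-mocked paramiko.Transport:
    a plain callable object, not a real type."""

    def __call__(self, *args, **kwargs):
        return self


mocked = _MockedTransport()

# The PEP 604 union operator requires operands implementing __or__,
# which the mock instance does not provide:
try:
    _ = mocked | None
    or_failed = False
except TypeError:
    or_failed = True

# typing.Union validates its arguments only loosely, so it accepts
# the mocked type where the | operator raises:
annotation = Union[_MockedTransport, None]
```

This is why spelling the annotation as `Union[Transport, None]` survives doc builds where the dependency is mocked.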
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v13 3/6] dts: add doc generation dependencies
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 1/6] dts: update params and parser docstrings Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 2/6] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-06 8:46 ` Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 4/6] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 8:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including the Python version, should be satisfied. This is
not a hard requirement, as imports from dependencies may be mocked with
the autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
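The mocking mentioned above would look roughly like this in a Sphinx conf.py (a sketch; the exact module list is an assumption, not taken from the DPDK tree):

```python
# Hypothetical Sphinx conf.py excerpt: let autodoc import DTS modules
# without the runtime dependencies installed by mocking their imports.
extensions = ["sphinx.ext.autodoc"]
autodoc_mock_imports = ["paramiko", "fabric", "invoke"]
```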
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
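The `docs` group added to pyproject.toml above is marked optional, so it is not installed by a plain `poetry install`. A possible workflow (a sketch, assuming Poetry >= 1.2, which introduced dependency groups, and that it is run from the `dts/` directory) would be:

```shell
# Install the base DTS environment plus the optional docs group
# declared in pyproject.toml (the --with flag pulls in an optional group):
poetry install --with docs

# Sanity check: the doc dependencies should now be importable
# (pyelftools is imported as "elftools"):
poetry run python -c "import sphinx, elftools"
```

Without `--with docs`, the Sphinx toolchain stays out of the default environment, keeping the test-run install lean.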
* [PATCH v13 4/6] dts: add API doc sources
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-06 8:46 ` [PATCH v13 3/6] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-06 8:46 ` Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 5/6] doc: meson doc API build dir variable Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 6/6] dts: add API doc generation Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 8:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as control over the order
of modules or the titles of the module pages.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
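For comparison, a baseline close to these files could be produced with sphinx-apidoc itself; the invocation below is a sketch (the output directory `doc` and package path `framework` are assumptions inferred from the file list in this patch):

```shell
# From dts/: emit one .rst stub per module (--separate), skipping the
# top-level modules.rst table of contents (--no-toc), into doc/.
# The generated titles and toctree ordering still need manual editing,
# which is why the patch carries the hand-edited files instead.
poetry run sphinx-apidoc --separate --no-toc -o doc framework
```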
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
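The package pages all follow the same two-part pattern shown above: an `automodule` directive that pulls in the package docstring and members, plus a hidden `toctree` that pins the submodule order by hand. A minimal stub for a hypothetical new leaf module would look like (the module name is illustrative, not part of this patch):

```rst
new\_module - Short Module Title
================================

.. automodule:: framework.new_module
   :members:
   :show-inheritance:
```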
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply	[flat|nested] 393+ messages in thread
* [PATCH v13 5/6] doc: meson doc API build dir variable
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-06 8:46 ` [PATCH v13 4/6] dts: add API doc sources Juraj Linkeš
@ 2024-08-06 8:46 ` Juraj Linkeš
2024-08-06 8:46 ` [PATCH v13 6/6] dts: add API doc generation Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 8:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The three instances of the path 'dpdk_build_root/doc/api' are replaced
with a variable, moving the definition to one place.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
doc/api/meson.build | 7 ++++---
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..b828b1ed66 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,10 +33,10 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
--
2.34.1
* [PATCH v13 6/6] dts: add API doc generation
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
` (4 preceding siblings ...)
2024-08-06 8:46 ` [PATCH v13 5/6] doc: meson doc API build dir variable Juraj Linkeš
@ 2024-08-06 8:46 ` Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 8:46 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style with one
DTS-specific configuration (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
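For illustration, a docstring in the Google format (a made-up function,
not taken from the DTS code) looks like this; napoleon turns the Args and
Returns sections into proper Sphinx field lists:

```python
def connect(host: str, retries: int = 3) -> bool:
    """Open a session to a remote node.

    Args:
        host: Hostname or IP address of the node.
        retries: How many times to retry before giving up.

    Returns:
        True if the session was established.
    """
    return bool(host) and retries > 0
```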
There is one requirement for building DTS docs - the same Python version
as DTS or higher, because Sphinx's autodoc extension imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The missing
packages are taken from the DTS pyproject.toml file.
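Conceptually, mocking an import amounts to pre-registering a stand-in
module; a simplified sketch of the idea (not the actual autodoc
implementation, and the module name is made up):

```python
import sys
from unittest import mock

# Pre-register a stand-in for a dependency that is not installed
# (the name "some_missing_dep" is made up for this sketch), so that
# "import some_missing_dep" in the documented code succeeds.
sys.modules["some_missing_dep"] = mock.MagicMock()

import some_missing_dep  # resolved from sys.modules, no package needed

# Any attribute access on the mock just yields another mock object,
# which is enough for autodoc to import and introspect the module.
session = some_missing_dep.SSHClient()
```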
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 10 ++-
buildtools/get-dts-deps.py | 78 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 1 +
doc/guides/conf.py | 41 +++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 ++
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +++++++++++-
dts/doc/meson.build | 30 +++++++++
dts/meson.build | 15 +++++
meson.build | 1 +
14 files changed, 225 insertions(+), 3 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..5dd59907cd 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,11 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+conf_src = src
+if src.find('dts') != -1:
+ if '-c' in extra_args:
+ conf_src = extra_args[extra_args.index('-c') + 1]
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
@@ -23,6 +28,9 @@
for root, dirs, files in os.walk(src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(dst):
+ os.makedirs(dst)
+
# run sphinx, putting the html output in a "html" directory
with open(join(dst, 'sphinx_html.out'), 'w') as out:
process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
@@ -34,7 +42,7 @@
# copy custom CSS file
css = 'custom.css'
-src_css = join(src, css)
+src_css = join(conf_src, css)
dst_css = join(dst, 'html', '_static', 'css', css)
if not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css):
os.makedirs(os.path.dirname(dst_css), exist_ok=True)
diff --git a/buildtools/get-dts-deps.py b/buildtools/get-dts-deps.py
new file mode 100755
index 0000000000..309b83cb5c
--- /dev/null
+++ b/buildtools/get-dts-deps.py
@@ -0,0 +1,78 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script returns the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
+and returns a list of module names used in an import statement that are missing.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {'invoke': '>=1.3', 'paramiko': '>=2.4'}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_version_tuple(version_str):
+ return tuple(map(int, version_str.split(".")))
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ deps = {dep: deps_section[dep].strip('"\'') for dep in deps_section}
+ doc_deps_section = cfg['tool.poetry.group.docs.dependencies']
+ doc_deps = {dep: doc_deps_section[dep].strip("\"'") for dep in doc_deps_section}
+
+ return deps | doc_deps
+
+
+def get_missing_imports():
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, req_ver in (req_deps | _EXTRA_DEPS).items():
+ try:
+ req_ver = _get_version_tuple(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = _get_version_tuple(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(req_dep.lower().replace('-', '_'))
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = _get_version_tuple(platform.python_version())
+ req_ver = _get_version_tuple(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..599653bea4 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_deps = py3 + files('get-dts-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index b828b1ed66..b893931b92 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -41,6 +41,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..eab3387874 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +24,45 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS.
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options.
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+if environ.get('DTS_BUILD'):
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-deps').get_missing_imports()
+
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index f8bbfba9f5..b34b7b8eb0 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..bd715f8072 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..c2df99bbc6
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,30 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_deps).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), dts_doc_api_build_dir, extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..6ed3c93fe1
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,15 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
* [PATCH v14 0/6] API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (13 preceding siblings ...)
2024-08-06 8:46 ` [PATCH v13 0/6] API docs generation Juraj Linkeš
@ 2024-08-06 11:17 ` Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 1/6] dts: update params and parser docstrings Juraj Linkeš
` (5 more replies)
2024-08-06 15:19 ` [PATCH v15 0/5] API docs generation Juraj Linkeš
` (4 subsequent siblings)
19 siblings, 6 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 11:17 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified sidebar configuration placed in an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there isn't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
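The per-module stubs are uniform enough that producing one is mechanical;
a rough sketch of what sphinx-apidoc does per module (simplified and
hypothetical, matching the stub layout used in this series):

```python
def make_module_stub(module: str, title: str) -> str:
    """Render the small .rst stub used for one Python module."""
    # Escape underscores in the heading, as RST would otherwise
    # treat trailing underscores as link targets.
    name = module.rsplit(".", 1)[-1].replace("_", r"\_")
    heading = f"{name} - {title}"
    return "\n".join([
        heading,
        "=" * len(heading),  # underline must match heading length
        "",
        f".. automodule:: {module}",
        "   :members:",
        "   :show-inheritance:",
        "",
    ])
```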
v10:
Fix dts doc generation issue: Only copy the custom rss file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that find out which imports need to
be added to autodoc_mock_imports. The script also check the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
v13:
Fixed build error:
TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
v14:
Fixed install error:
ERROR: File 'dts/doc/html' could not be found
This required me to put the built docs into dts/doc which is outside the
DPDK API doc dir, resulting in linking between DPDK and DTS API docs not
working properly. I addressed this by adding a symlink to the build dir.
This way the link works after installing the docs and the symlink is
just one extra file in the build dir.
Juraj Linkeš (6):
dts: update params and parser docstrings
dts: replace the or operator in third party types
dts: add doc generation dependencies
dts: add API doc sources
doc: meson doc API build dir variable
dts: add API doc generation
buildtools/call-sphinx-build.py | 10 +-
buildtools/get-dts-deps.py | 78 +++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 8 +-
doc/guides/conf.py | 41 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +-
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 +
dts/doc/framework.config.types.rst | 6 +
dts/doc/framework.exception.rst | 6 +
dts/doc/framework.logger.rst | 6 +
dts/doc/framework.params.eal.rst | 6 +
dts/doc/framework.params.rst | 14 +
dts/doc/framework.params.testpmd.rst | 6 +
dts/doc/framework.params.types.rst | 6 +
dts/doc/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
dts/doc/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
dts/doc/framework.runner.rst | 6 +
dts/doc/framework.settings.rst | 6 +
dts/doc/framework.test_result.rst | 6 +
dts/doc/framework.test_suite.rst | 6 +
dts/doc/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
dts/doc/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
dts/doc/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
dts/doc/framework.testbed_model.rst | 26 +
dts/doc/framework.testbed_model.sut_node.rst | 6 +
dts/doc/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
dts/doc/framework.utils.rst | 6 +
dts/doc/index.rst | 43 ++
dts/doc/meson.build | 43 ++
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
.../interactive_remote_session.py | 3 +-
dts/meson.build | 15 +
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
meson.build | 1 +
59 files changed, 1086 insertions(+), 26 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v14 1/6] dts: update params and parser docstrings
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
@ 2024-08-06 11:17 ` Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 2/6] dts: replace the or operator in third party types Juraj Linkeš
` (4 subsequent siblings)
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 11:17 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
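Most of these warnings come down to the same few reST rules: a directive needs a space after the two dots (".. code::", not "..code::") and a blank line before its indented body. A minimal illustrative docstring (not code from the patch) that satisfies them:

```python
def example():
    """Show the reST spacing rules the warnings above point at.

    Example:
        .. code:: python

            pass

    Note the space after ".." and the blank line between the directive and
    its indented body; without them Sphinx reports "Unexpected indentation"
    or an unterminated inline start-string.
    """


# The docstring is plain data here; Sphinx parses it as reST at build time.
print(".. code::" in example.__doc__)
```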
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* [PATCH v14 2/6] dts: replace the or operator in third party types
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 1/6] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-06 11:17 ` Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 3/6] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 11:17 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
When the DTS dependencies are not installed while building the DTS API
documentation, the | (or) operator produces errors when used with types
from those libraries:
autodoc: failed to import module 'remote_session' from module
'framework'; the following exception was raised:
Traceback (most recent call last):
...
TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
The third-party type here is Transport from the paramiko library.
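The failure can be reproduced without Sphinx or paramiko. A rough sketch, assuming a mock object similar to what autodoc substitutes for a mocked import (a callable instance with no __or__ support): the | spelling calls __or__ on the mock and raises, while typing.Union only requires its argument to be callable.

```python
from typing import Union

class MockObject:
    """Rough stand-in for a Sphinx autodoc mock: callable, no __or__."""
    def __call__(self, *args, **kwargs):
        return self

Transport = MockObject()  # what a mocked "from paramiko import Transport" yields

try:
    Transport | None          # evaluated when the class annotation is processed
    raised = False
except TypeError:
    raised = True             # unsupported operand type(s) for |

hint = Union[Transport, None] # the Union spelling is accepted by typing
print(raised)
```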
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/interactive_remote_session.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 97194e6af8..4605ee14b4 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -5,6 +5,7 @@
import socket
import traceback
+from typing import Union
from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore[import-untyped]
from paramiko.ssh_exception import ( # type: ignore[import-untyped]
@@ -52,7 +53,7 @@ class InteractiveRemoteSession:
session: SSHClient
_logger: DTSLogger
_node_config: NodeConfiguration
- _transport: Transport | None
+ _transport: Union[Transport, None]
def __init__(self, node_config: NodeConfiguration, logger: DTSLogger) -> None:
"""Connect to the node during initialization.
--
2.34.1
* [PATCH v14 3/6] dts: add doc generation dependencies
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 1/6] dts: update params and parser docstrings Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 2/6] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-06 11:17 ` Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 4/6] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 11:17 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including the Python version, should be satisfied. This is
not a hard requirement, as imports from dependencies may be mocked with
the autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is included so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
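A hypothetical conf.py fragment showing the escape hatch mentioned above; the package names are illustrative guesses at third-party DTS dependencies, not the actual list used by the patch:

```python
# Hypothetical Sphinx conf.py fragment: packages listed here are replaced
# with mock objects at import time, so autodoc can import modules that use
# them even when the real dependencies are not installed.
autodoc_mock_imports = [
    "paramiko",
    "fabric",
    "scapy",
]
```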
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
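Since the new docs group is marked optional, a plain `poetry install` skips it; a minimal sketch of how it would be pulled in explicitly (the `--with` flag is standard Poetry behaviour for optional groups, not something defined in this patch):

```shell
# Optional dependency groups must be requested explicitly with --with;
# this installs sphinx, sphinx-rtd-theme and pyelftools alongside the
# regular DTS dependencies.
docs_install_cmd="poetry install --with docs"
echo "$docs_install_cmd"
```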
--
2.34.1
* [PATCH v14 4/6] dts: add API doc sources
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-06 11:17 ` [PATCH v14 3/6] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-06 11:17 ` Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 5/6] doc: meson doc API build dir variable Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 6/6] dts: add API doc generation Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 11:17 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as controlling the order
of modules or customizing the module headers.
The sources included in this patch were in fact generated by said
utility, but then modified by hand to improve the look of the
documentation. The improvements are mainly in the toctree definitions
and the titles of the modules/packages, and were made with specific
Sphinx config options in mind.
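The stubs added by this patch all follow the same ``automodule`` pattern, which only renders if autodoc is enabled in the project's conf.py. A minimal sketch of the matching configuration (the option values below are assumed to mirror the ``:members:``/``:show-inheritance:`` flags used in the .rst files, and are not taken from this patch):

```python
# Hypothetical conf.py excerpt: enable autodoc so automodule directives work.
extensions = ["sphinx.ext.autodoc"]

# Defaults applied to every automodule directive, keeping individual stubs short.
autodoc_default_options = {
    "members": True,
    "show-inheritance": True,
}
```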
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/doc/conf_yaml_schema.json | 1 +
dts/doc/framework.config.rst | 12 ++++++
dts/doc/framework.config.types.rst | 6 +++
dts/doc/framework.exception.rst | 6 +++
dts/doc/framework.logger.rst | 6 +++
dts/doc/framework.params.eal.rst | 6 +++
dts/doc/framework.params.rst | 14 ++++++
dts/doc/framework.params.testpmd.rst | 6 +++
dts/doc/framework.params.types.rst | 6 +++
dts/doc/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
dts/doc/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
dts/doc/framework.runner.rst | 6 +++
dts/doc/framework.settings.rst | 6 +++
dts/doc/framework.test_result.rst | 6 +++
dts/doc/framework.test_suite.rst | 6 +++
dts/doc/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
dts/doc/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
dts/doc/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
dts/doc/framework.testbed_model.rst | 26 +++++++++++
dts/doc/framework.testbed_model.sut_node.rst | 6 +++
dts/doc/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
dts/doc/framework.utils.rst | 6 +++
dts/doc/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 dts/doc/conf_yaml_schema.json
create mode 100644 dts/doc/framework.config.rst
create mode 100644 dts/doc/framework.config.types.rst
create mode 100644 dts/doc/framework.exception.rst
create mode 100644 dts/doc/framework.logger.rst
create mode 100644 dts/doc/framework.params.eal.rst
create mode 100644 dts/doc/framework.params.rst
create mode 100644 dts/doc/framework.params.testpmd.rst
create mode 100644 dts/doc/framework.params.types.rst
create mode 100644 dts/doc/framework.parser.rst
create mode 100644 dts/doc/framework.remote_session.dpdk_shell.rst
create mode 100644 dts/doc/framework.remote_session.interactive_remote_session.rst
create mode 100644 dts/doc/framework.remote_session.interactive_shell.rst
create mode 100644 dts/doc/framework.remote_session.python_shell.rst
create mode 100644 dts/doc/framework.remote_session.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.rst
create mode 100644 dts/doc/framework.remote_session.ssh_session.rst
create mode 100644 dts/doc/framework.remote_session.testpmd_shell.rst
create mode 100644 dts/doc/framework.runner.rst
create mode 100644 dts/doc/framework.settings.rst
create mode 100644 dts/doc/framework.test_result.rst
create mode 100644 dts/doc/framework.test_suite.rst
create mode 100644 dts/doc/framework.testbed_model.cpu.rst
create mode 100644 dts/doc/framework.testbed_model.linux_session.rst
create mode 100644 dts/doc/framework.testbed_model.node.rst
create mode 100644 dts/doc/framework.testbed_model.os_session.rst
create mode 100644 dts/doc/framework.testbed_model.port.rst
create mode 100644 dts/doc/framework.testbed_model.posix_session.rst
create mode 100644 dts/doc/framework.testbed_model.rst
create mode 100644 dts/doc/framework.testbed_model.sut_node.rst
create mode 100644 dts/doc/framework.testbed_model.tg_node.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 dts/doc/framework.testbed_model.virtual_device.rst
create mode 100644 dts/doc/framework.utils.rst
create mode 100644 dts/doc/index.rst
diff --git a/dts/doc/conf_yaml_schema.json b/dts/doc/conf_yaml_schema.json
new file mode 120000
index 0000000000..d89eb81b72
--- /dev/null
+++ b/dts/doc/conf_yaml_schema.json
@@ -0,0 +1 @@
+../framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/dts/doc/framework.config.rst b/dts/doc/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/dts/doc/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/dts/doc/framework.config.types.rst b/dts/doc/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/dts/doc/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.exception.rst b/dts/doc/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/dts/doc/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.logger.rst b/dts/doc/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/dts/doc/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.eal.rst b/dts/doc/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/dts/doc/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.rst b/dts/doc/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/dts/doc/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/dts/doc/framework.params.testpmd.rst b/dts/doc/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/dts/doc/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.params.types.rst b/dts/doc/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/dts/doc/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.parser.rst b/dts/doc/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/dts/doc/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.dpdk_shell.rst b/dts/doc/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/dts/doc/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_remote_session.rst b/dts/doc/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.interactive_shell.rst b/dts/doc/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/dts/doc/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.python_shell.rst b/dts/doc/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/dts/doc/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.remote_session.rst b/dts/doc/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/dts/doc/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.rst b/dts/doc/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/dts/doc/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/dts/doc/framework.remote_session.ssh_session.rst b/dts/doc/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/dts/doc/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.remote_session.testpmd_shell.rst b/dts/doc/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/dts/doc/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.runner.rst b/dts/doc/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/dts/doc/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.settings.rst b/dts/doc/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/dts/doc/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_result.rst b/dts/doc/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/dts/doc/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.test_suite.rst b/dts/doc/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/dts/doc/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.cpu.rst b/dts/doc/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/dts/doc/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.linux_session.rst b/dts/doc/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.node.rst b/dts/doc/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.os_session.rst b/dts/doc/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.port.rst b/dts/doc/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/dts/doc/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.posix_session.rst b/dts/doc/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/dts/doc/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.rst b/dts/doc/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/dts/doc/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/dts/doc/framework.testbed_model.sut_node.rst b/dts/doc/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/dts/doc/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.tg_node.rst b/dts/doc/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/dts/doc/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/dts/doc/framework.testbed_model.traffic_generator.scapy.rst b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/dts/doc/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.testbed_model.virtual_device.rst b/dts/doc/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/dts/doc/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/framework.utils.rst b/dts/doc/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/dts/doc/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/dts/doc/index.rst b/dts/doc/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/dts/doc/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v14 5/6] doc: meson doc API build dir variable
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-06 11:17 ` [PATCH v14 4/6] dts: add API doc sources Juraj Linkeš
@ 2024-08-06 11:17 ` Juraj Linkeš
2024-08-06 11:17 ` [PATCH v14 6/6] dts: add API doc generation Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 11:17 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
The three instances of the path 'dpdk_build_root/doc/api' are replaced
with a variable, moving the definition to one place.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Acked-by: Bruce Richardson <bruce.richardson@intel.com>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
Tested-by: Nicholas Pratte <npratte@iol.unh.edu>
---
doc/api/meson.build | 7 ++++---
1 file changed, 4 insertions(+), 3 deletions(-)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..b828b1ed66 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_api_build_dir = meson.current_build_dir()
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -32,10 +33,10 @@ example = custom_target('examples.dox',
# set up common Doxygen configuration
cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
-cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
-cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
+cdata.set('API_EXAMPLES', join_paths(doc_api_build_dir, 'examples.dox'))
+cdata.set('OUTPUT', doc_api_build_dir)
cdata.set('TOPDIR', dpdk_source_root)
-cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, join_paths(dpdk_build_root, 'doc', 'api')]))
+cdata.set('STRIP_FROM_PATH', ' '.join([dpdk_source_root, doc_api_build_dir]))
cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v14 6/6] dts: add API doc generation
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
` (4 preceding siblings ...)
2024-08-06 11:17 ` [PATCH v14 5/6] doc: meson doc API build dir variable Juraj Linkeš
@ 2024-08-06 11:17 ` Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 11:17 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate the DTS API docs is Sphinx, which is already
used in DPDK. The same configuration is used to preserve the style, with
one DTS-specific option (applied only when building the DTS docs, so the
DPDK docs are unchanged) that modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python docs.
There is one requirement for building the DTS docs: the Python version
must be the same as the one required by DTS, or higher, because Sphinx's
autodoc extension imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The missing
packages are taken from the DTS pyproject.toml file.
The generated DTS API docs are linked with the DPDK API docs according
to their placement after installing them with 'meson install'. However,
the build path differs from the install path, requiring a symlink from
the DPDK API doc build path to the DTS API doc build path to produce the
proper link in the build directory.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
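As an illustration (not part of the patch), a Google-style docstring of the kind sphinx.ext.napoleon renders might look like the following; the function and its parameters are hypothetical, not code from DTS:

```python
def filter_lcores(lcores, socket):
    """Filter logical cores by NUMA socket.

    A hypothetical example of the Google docstring format [0] that the
    sphinx.ext.napoleon extension understands.

    Args:
        lcores: A list of (lcore_id, socket_id) tuples.
        socket: The NUMA socket to filter by.

    Returns:
        The lcore ids belonging to the given socket.
    """
    return [lcore for lcore, sock in lcores if sock == socket]
```

Napoleon converts the ``Args:`` and ``Returns:`` sections into the field lists Sphinx expects, so no reStructuredText markup is needed inside the docstring itself.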
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 10 ++-
buildtools/get-dts-deps.py | 78 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/meson.build | 1 +
doc/guides/conf.py | 41 +++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 ++
doc/guides/meson.build | 1 +
doc/guides/tools/dts.rst | 39 +++++++++++-
dts/doc/meson.build | 43 +++++++++++++
dts/meson.build | 15 +++++
meson.build | 1 +
14 files changed, 238 insertions(+), 3 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 100644 dts/doc/meson.build
create mode 100644 dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..5dd59907cd 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,11 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+conf_src = src
+if src.find('dts') != -1:
+ if '-c' in extra_args:
+ conf_src = extra_args[extra_args.index('-c') + 1]
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
@@ -23,6 +28,9 @@
for root, dirs, files in os.walk(src):
srcfiles.extend([join(root, f) for f in files])
+if not os.path.exists(dst):
+ os.makedirs(dst)
+
# run sphinx, putting the html output in a "html" directory
with open(join(dst, 'sphinx_html.out'), 'w') as out:
process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
@@ -34,7 +42,7 @@
# copy custom CSS file
css = 'custom.css'
-src_css = join(src, css)
+src_css = join(conf_src, css)
dst_css = join(dst, 'html', '_static', 'css', css)
if not os.path.exists(dst_css) or not filecmp.cmp(src_css, dst_css):
os.makedirs(os.path.dirname(dst_css), exist_ok=True)
diff --git a/buildtools/get-dts-deps.py b/buildtools/get-dts-deps.py
new file mode 100755
index 0000000000..309b83cb5c
--- /dev/null
+++ b/buildtools/get-dts-deps.py
@@ -0,0 +1,78 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script returns the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
+and returns a list of the module names used in import statements that are missing.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {'invoke': '>=1.3', 'paramiko': '>=2.4'}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_version_tuple(version_str):
+ return tuple(map(int, version_str.split(".")))
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ deps = {dep: deps_section[dep].strip('"\'') for dep in deps_section}
+ doc_deps_section = cfg['tool.poetry.group.docs.dependencies']
+ doc_deps = {dep: doc_deps_section[dep].strip("\"'") for dep in doc_deps_section}
+
+ return deps | doc_deps
+
+
+def get_missing_imports():
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, req_ver in (req_deps | _EXTRA_DEPS).items():
+ try:
+ req_ver = _get_version_tuple(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = _get_version_tuple(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(req_dep.lower().replace('-', '_'))
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = _get_version_tuple(platform.python_version())
+ req_ver = _get_version_tuple(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..599653bea4 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_deps = py3 + files('get-dts-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/meson.build b/doc/api/meson.build
index b828b1ed66..b893931b92 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -41,6 +41,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..eab3387874 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +24,45 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python docstrings, used in DTS.
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options.
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+if environ.get('DTS_BUILD'):
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-deps').get_missing_imports()
+
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index f8bbfba9f5..b34b7b8eb0 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Intel Corporation
+doc_guides_source_dir = meson.current_source_dir()
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..bd715f8072 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to the DTS API doc sources in ``dts/doc``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..d48a7f2003
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,43 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_deps).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+dts_doc_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+
+extra_sphinx_args = ['-E', '-c', doc_guides_source_dir]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
+
+build_html_dir = join_paths(meson.current_build_dir(), 'html')
+dts_api_symlink = custom_target('dts_api_symlink',
+ output: 'symlink',
+ depends: dts_api_html,
+ command: ['mkdir', '-p', dts_doc_api_build_dir, '&&',
+ 'ln', '-sf', build_html_dir, dts_doc_api_build_dir],
+ build_by_default: get_option('enable_docs'),
+ install: false)
+
+doc_targets += dts_api_symlink
+doc_target_names += 'DTS_API_SYMLINK'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..6ed3c93fe1
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,15 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+
+subdir('doc')
+
+if doc_targets.length() == 0
+ message = 'No docs targets found'
+else
+ message = 'Built docs:'
+endif
+run_target('dts-doc', command: [echo, message, doc_target_names],
+ depends: doc_targets)
diff --git a/meson.build b/meson.build
index 8b248d4505..835973a0ce 100644
--- a/meson.build
+++ b/meson.build
@@ -87,6 +87,7 @@ subdir('app')
# build docs
subdir('doc')
+subdir('dts')
# build any examples explicitly requested - useful for developers - and
# install any example code into the appropriate install path
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v15 0/5] API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (14 preceding siblings ...)
2024-08-06 11:17 ` [PATCH v14 0/6] API docs generation Juraj Linkeš
@ 2024-08-06 15:19 ` Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-08 8:54 ` [PATCH v16 0/5] API docs generation Juraj Linkeš
` (3 subsequent siblings)
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 15:19 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified configuration of the sidebar, guarded by an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> dts-doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v10:
Fix dts doc generation issue: Only copy the custom rss file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need to
be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
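A minimal sketch of the version gate such a helper script can apply (the
threshold and function name here are illustrative, not taken from the
actual buildtools/get-dts-deps.py):

```python
# Illustrative sketch: the kind of Python version check a docs-build
# helper script can run before Sphinx is invoked.
import sys

MIN_VERSION = (3, 10)  # PEP 604 syntax used by DTS needs 3.10+


def version_satisfied(version_info=None):
    """Return True when the interpreter is new enough to build the docs."""
    if version_info is None:
        version_info = sys.version_info
    return tuple(version_info[:2]) >= MIN_VERSION
```

The build system can then skip the docs target (instead of failing) when
the check does not pass.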
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
v13:
Fixed build error:
TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
v14:
Fixed install error:
ERROR: File 'dts/doc/html' could not be found
This required me to put the built docs into dts/doc which is outside the
DPDK API doc dir, resulting in linking between DPDK and DTS api docs not
working properly. I addressed this by adding a symlink to the build dir.
This way the link works after installing the docs and the symlink is
just one extra file in the build dir.
v15:
Moved DTS API sources to doc/api/dts. This simplifies a lot of things in
the build, but mainly makes a lot of sense. Now the source, build and
install paths are the same so there isn't any need for any symlinks or
other workarounds.
Also added a symlink to the custom.css file so that it works with
call-sphinx-build.py without any modifications.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: replace the or operator in third party types
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-deps.py | 78 +++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/custom.css | 1 +
doc/api/dts/framework.config.rst | 12 +
doc/api/dts/framework.config.types.rst | 6 +
doc/api/dts/framework.exception.rst | 6 +
doc/api/dts/framework.logger.rst | 6 +
doc/api/dts/framework.params.eal.rst | 6 +
doc/api/dts/framework.params.rst | 14 +
doc/api/dts/framework.params.testpmd.rst | 6 +
doc/api/dts/framework.params.types.rst | 6 +
doc/api/dts/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
doc/api/dts/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
doc/api/dts/framework.runner.rst | 6 +
doc/api/dts/framework.settings.rst | 6 +
doc/api/dts/framework.test_result.rst | 6 +
doc/api/dts/framework.test_suite.rst | 6 +
doc/api/dts/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
doc/api/dts/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
doc/api/dts/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
doc/api/dts/framework.testbed_model.rst | 26 +
.../dts/framework.testbed_model.sut_node.rst | 6 +
.../dts/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
doc/api/dts/framework.utils.rst | 6 +
doc/api/dts/index.rst | 43 ++
doc/api/dts/meson.build | 29 +
doc/api/meson.build | 13 +
doc/guides/conf.py | 41 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/tools/dts.rst | 39 +-
doc/meson.build | 1 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
.../interactive_remote_session.py | 3 +-
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
58 files changed, 1058 insertions(+), 22 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
create mode 100644 doc/api/dts/meson.build
--
2.34.1
* [PATCH v15 1/5] dts: update params and parser docstrings
2024-08-06 15:19 ` [PATCH v15 0/5] API docs generation Juraj Linkeš
@ 2024-08-06 15:19 ` Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 2/5] dts: replace the or operator in third party types Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 15:19 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* [PATCH v15 2/5] dts: replace the or operator in third party types
2024-08-06 15:19 ` [PATCH v15 0/5] API docs generation Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-06 15:19 ` Juraj Linkeš
2024-08-07 13:34 ` Luca Vizzarro
2024-08-06 15:19 ` [PATCH v15 3/5] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 15:19 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
If the DTS dependencies are not installed when building the DTS API
documentation, the or operator produces errors when used with types from
the missing libraries:
autodoc: failed to import module 'remote_session' from module
'framework'; the following exception was raised:
Traceback (most recent call last):
...
TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
The third-party type here is Transport from the paramiko library.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/interactive_remote_session.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 97194e6af8..4605ee14b4 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -5,6 +5,7 @@
import socket
import traceback
+from typing import Union
from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore[import-untyped]
from paramiko.ssh_exception import ( # type: ignore[import-untyped]
@@ -52,7 +53,7 @@ class InteractiveRemoteSession:
session: SSHClient
_logger: DTSLogger
_node_config: NodeConfiguration
- _transport: Transport | None
+ _transport: Union[Transport, None]
def __init__(self, node_config: NodeConfiguration, logger: DTSLogger) -> None:
"""Connect to the node during initialization.
--
2.34.1
* Re: [PATCH v15 2/5] dts: replace the or operator in third party types
2024-08-06 15:19 ` [PATCH v15 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-07 13:34 ` Luca Vizzarro
2024-08-07 14:24 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Luca Vizzarro @ 2024-08-07 13:34 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, npratte
Cc: dev
Hi Juraj,
In the past, I have noticed this problem appear only on Python versions
prior to 3.10. Before PEP 604[1] – introduced in Python 3.10 – the pipe
operator was always used as an operator between objects instead of an
alias for Union in the annotations. A quick test verifies this:
Python 3.8.18 (default, Aug 25 2023, 13:20:30)
[GCC 11.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from collections import Counter
>>> from typing import TypedDict
>>> class t(TypedDict):
... a: Counter | None
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 2, in t
TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'
>>>
I have also attempted to build the docs removing this commit on my local
setup (outside of the Poetry shell) and it appears to be working with no
problems.
Best,
Luca
[1] https://peps.python.org/pep-0604/
* Re: [PATCH v15 2/5] dts: replace the or operator in third party types
2024-08-07 13:34 ` Luca Vizzarro
@ 2024-08-07 14:24 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-07 14:24 UTC (permalink / raw)
To: Luca Vizzarro, thomas, Honnappa.Nagarahalli, bruce.richardson,
jspewock, probb, paul.szczepanek, npratte
Cc: dev
On 7. 8. 2024 15:34, Luca Vizzarro wrote:
> Hi Juraj,
>
> In the past, I have noticed this problem appear only on Python versions
> prior to 3.10. Before PEP 604[1] – introduced in Python 3.10 – the pipe
> operator was always used as an operator between objects instead of an
> alias for Union in the annotations. A quick test verifies this:
>
> Python 3.8.18 (default, Aug 25 2023, 13:20:30)
> [GCC 11.4.0] on linux
> Type "help", "copyright", "credits" or "license" for more information.
> >>> from collections import Counter
> >>> from typing import TypedDict
> >>> class t(TypedDict):
> ... a: Counter | None
> ...
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> File "<stdin>", line 2, in t
> TypeError: unsupported operand type(s) for |: 'type' and 'NoneType'
> >>>
>
> I have also attempted to build the docs removing this commit on my local
> setup (outside of the Poetry shell) and it appears to be working with no
> problems.
>
My local build also works fine, but the problem was found in CI:
https://github.com/ovsrobot/dpdk/actions/runs/10261380458/job/28389032405
I included a script that checks the Python version, so the running
version should be at least 3.10. I can't find the actual version
anywhere in the logs, but the environment is Ubuntu 22.04, which should
run 3.10.
The TypeError only happens with the Transport from paramiko. This is
likely related to the new feature that I've added in these last
versions: the Python dependencies don't need to be installed. If they're
not found, they're added to the autodoc_mock_imports config option and
basically ignored. CI likely doesn't have these dependencies so I'd say
this is the reason. My local testing (building docs without paramiko)
confirms this.
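The mocking approach described above can be sketched as follows (a
minimal illustration, not the actual conf.py code; the dependency names
are examples):

```python
# Illustrative sketch: collect third-party imports that are absent in
# the build environment so Sphinx autodoc can mock them instead of
# failing to import the modules that use them.
import importlib.util


def unimportable(modules):
    """Return the subset of `modules` that cannot be found."""
    return [m for m in modules if importlib.util.find_spec(m) is None]


# In a Sphinx conf.py one could then set, for example:
# autodoc_mock_imports = unimportable(["paramiko", "scapy", "fabric"])
```

With this, a CI machine without the DTS dependencies still builds the
docs, at the cost of the mocked types behaving differently from the real
ones (hence the Transport | None failure).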
> Best,
> Luca
>
> [1] https://peps.python.org/pep-0604/
>
* [PATCH v15 3/5] dts: add doc generation dependencies
2024-08-06 15:19 ` [PATCH v15 0/5] API docs generation Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-06 15:19 ` Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 4/5] dts: add API doc sources Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 15:19 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including Python version, should be satisfied. This is not
a hard requirement, as imports from dependencies may be mocked in the
autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
* [PATCH v15 4/5] dts: add API doc sources
2024-08-06 15:19 ` [PATCH v15 0/5] API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-06 15:19 ` [PATCH v15 3/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-06 15:19 ` Juraj Linkeš
2024-08-06 15:19 ` [PATCH v15 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 15:19 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility; for example, it doesn't let us
sort the order of modules or change the module headers.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/framework.config.rst | 12 ++++++
doc/api/dts/framework.config.types.rst | 6 +++
doc/api/dts/framework.exception.rst | 6 +++
doc/api/dts/framework.logger.rst | 6 +++
doc/api/dts/framework.params.eal.rst | 6 +++
doc/api/dts/framework.params.rst | 14 ++++++
doc/api/dts/framework.params.testpmd.rst | 6 +++
doc/api/dts/framework.params.types.rst | 6 +++
doc/api/dts/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
doc/api/dts/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
doc/api/dts/framework.runner.rst | 6 +++
doc/api/dts/framework.settings.rst | 6 +++
doc/api/dts/framework.test_result.rst | 6 +++
doc/api/dts/framework.test_suite.rst | 6 +++
doc/api/dts/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
doc/api/dts/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
doc/api/dts/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
doc/api/dts/framework.testbed_model.rst | 26 +++++++++++
.../dts/framework.testbed_model.sut_node.rst | 6 +++
.../dts/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
doc/api/dts/framework.utils.rst | 6 +++
doc/api/dts/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
new file mode 120000
index 0000000000..5978642d76
--- /dev/null
+++ b/doc/api/dts/conf_yaml_schema.json
@@ -0,0 +1 @@
+../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/doc/api/dts/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/doc/api/dts/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.exception.rst b/doc/api/dts/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/doc/api/dts/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.logger.rst b/doc/api/dts/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/doc/api/dts/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.eal.rst b/doc/api/dts/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/doc/api/dts/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.rst b/doc/api/dts/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/doc/api/dts/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/doc/api/dts/framework.params.testpmd.rst b/doc/api/dts/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/doc/api/dts/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.types.rst b/doc/api/dts/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/doc/api/dts/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.parser.rst b/doc/api/dts/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/doc/api/dts/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.dpdk_shell.rst b/doc/api/dts/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_remote_session.rst b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_shell.rst b/doc/api/dts/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.python_shell.rst b/doc/api/dts/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.remote_session.rst b/doc/api/dts/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.rst b/doc/api/dts/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/doc/api/dts/framework.remote_session.ssh_session.rst b/doc/api/dts/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.testpmd_shell.rst b/doc/api/dts/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.runner.rst b/doc/api/dts/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/doc/api/dts/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.settings.rst b/doc/api/dts/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/doc/api/dts/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_result.rst b/doc/api/dts/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/doc/api/dts/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_suite.rst b/doc/api/dts/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/doc/api/dts/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.cpu.rst b/doc/api/dts/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.linux_session.rst b/doc/api/dts/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.node.rst b/doc/api/dts/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.os_session.rst b/doc/api/dts/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.port.rst b/doc/api/dts/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.posix_session.rst b/doc/api/dts/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.rst b/doc/api/dts/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/doc/api/dts/framework.testbed_model.sut_node.rst b/doc/api/dts/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.tg_node.rst b/doc/api/dts/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.virtual_device.rst b/doc/api/dts/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.utils.rst b/doc/api/dts/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/doc/api/dts/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/index.rst b/doc/api/dts/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/doc/api/dts/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v15 5/5] dts: add API doc generation
2024-08-06 15:19 ` [PATCH v15 0/5] API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-06 15:19 ` [PATCH v15 4/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-06 15:19 ` Juraj Linkeš
2024-08-07 10:41 ` Thomas Monjalon
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-06 15:19 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, with one
DTS-specific option (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There is one requirement for building DTS docs: the same Python version
as DTS or higher, because Sphinx's autodoc extension imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The missing
packages are taken from the DTS pyproject.toml file.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
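As a rough sketch of how the mocking described above can be wired into Sphinx's conf.py: the helper name get_missing_imports matches the one this series adds in buildtools, but the candidate package list and the DTS_BUILD gating below are illustrative assumptions, not the final implementation (which reads the packages from dts/pyproject.toml).

```python
# Illustrative sketch only: feed the names of uninstalled DTS packages
# to autodoc's mocking so the docs build without the DTS runtime deps.
import os


def get_missing_imports(candidates=('paramiko', 'invoke', 'scapy')):
    """Return the subset of candidate modules that cannot be imported."""
    missing = []
    for name in candidates:
        try:
            __import__(name)
        except ImportError:
            missing.append(name)
    return missing


if os.environ.get('DTS_BUILD'):
    # autodoc_mock_imports is a standard Sphinx autodoc option: the
    # listed modules are replaced with mock objects at import time.
    autodoc_mock_imports = get_missing_imports()
```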
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-deps.py | 78 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/custom.css | 1 +
doc/api/dts/meson.build | 29 +++++++++
doc/api/meson.build | 13 ++++
doc/guides/conf.py | 41 +++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 ++
doc/guides/tools/dts.rst | 39 +++++++++++-
doc/meson.build | 1 +
13 files changed, 214 insertions(+), 2 deletions(-)
create mode 100755 buildtools/get-dts-deps.py
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..45724ffcd4 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,8 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+if src.find('dts') != -1:
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
diff --git a/buildtools/get-dts-deps.py b/buildtools/get-dts-deps.py
new file mode 100755
index 0000000000..309b83cb5c
--- /dev/null
+++ b/buildtools/get-dts-deps.py
@@ -0,0 +1,78 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script returns the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
+and returns a list of module names used in import statements that are missing.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {'invoke': '>=1.3', 'paramiko': '>=2.4'}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_version_tuple(version_str):
+ return tuple(map(int, version_str.split(".")))
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ deps = {dep: deps_section[dep].strip('"\'') for dep in deps_section}
+ doc_deps_section = cfg['tool.poetry.group.docs.dependencies']
+ doc_deps = {dep: doc_deps_section[dep].strip("\"'") for dep in doc_deps_section}
+
+ return deps | doc_deps
+
+
+def get_missing_imports():
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, req_ver in (req_deps | _EXTRA_DEPS).items():
+ try:
+ req_ver = _get_version_tuple(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = _get_version_tuple(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(req_dep.lower().replace('-', '_'))
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = _get_version_tuple(platform.python_version())
+ req_ver = _get_version_tuple(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..599653bea4 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_deps = py3 + files('get-dts-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/dts/custom.css b/doc/api/dts/custom.css
new file mode 120000
index 0000000000..3c9480c4a0
--- /dev/null
+++ b/doc/api/dts/custom.css
@@ -0,0 +1 @@
+../../guides/custom.css
\ No newline at end of file
diff --git a/doc/api/dts/meson.build b/doc/api/dts/meson.build
new file mode 100644
index 0000000000..329b60cb1f
--- /dev/null
+++ b/doc/api/dts/meson.build
@@ -0,0 +1,29 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_deps).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+extra_sphinx_args = ['-E', '-c', join_paths(doc_source_dir, 'guides')]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+
+dts_doc_targets += dts_api_html
+dts_doc_target_names += 'DTS_API_HTML'
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..788129336b 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,18 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+dts_doc_targets = []
+dts_doc_target_names = []
+subdir('dts')
+
+if dts_doc_targets.length() == 0
+ dts_message = 'No DTS docs targets found'
+else
+ dts_message = 'Building DTS docs:'
+endif
+run_target('dts-doc', command: [echo, dts_message, dts_doc_target_names],
+ depends: dts_doc_targets)
+
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -40,6 +52,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..eab3387874 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -24,6 +24,45 @@
file=stderr)
pass
+# Napoleon enables the Google format of Python doscstrings, used in DTS.
+# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
+extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
+
+# DTS Python docstring options.
+autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+}
+autodoc_class_signature = 'separated'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+autodoc_typehints_description_target = 'documented'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_preprocess_types = True
+add_module_names = False
+toc_object_entries = True
+toc_object_entries_show_parents = 'hide'
+intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+if environ.get('DTS_BUILD'):
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-deps').get_missing_imports()
+
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
stop_on_error = ('-W' in argv)
project = 'Data Plane Development Kit'
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..18cc7908cf 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS api doc sources in ``doc/api/dts``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/doc/meson.build b/doc/meson.build
index 6f74706aa2..1e0cfa4127 100644
--- a/doc/meson.build
+++ b/doc/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_source_dir = meson.current_source_dir()
doc_targets = []
doc_target_names = []
subdir('api')
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v15 5/5] dts: add API doc generation
2024-08-06 15:19 ` [PATCH v15 5/5] dts: add API doc generation Juraj Linkeš
@ 2024-08-07 10:41 ` Thomas Monjalon
2024-08-07 12:03 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-08-07 10:41 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
06/08/2024 17:19, Juraj Linkeš:
> +"""Utilities for DTS dependencies.
> +
> +The module can be used as an executable script,
> +which verifies that the running Python version meets the version requirement of DTS.
> +The script returns the standard exit codes in this mode (0 is success, 1 is failure).
Is it returning the list of dependencies for generating doc?
> +
> +The module also contains a function, get_missing_imports,
> +which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
> +a returns a list of module names used in an import statement that are missing.
typo? a -> and
[...]
> +get_dts_deps = py3 + files('get-dts-deps.py')
deps for runtime or doc?
may be good to specify in the name
> --- /dev/null
> +++ b/doc/api/dts/custom.css
> @@ -0,0 +1 @@
> +../../guides/custom.css
> \ No newline at end of file
Is it a link? Why?
> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
> +dts_api_html = custom_target('dts_api_html',
> + output: 'html',
> + command: [sphinx_wrapper, sphinx, meson.project_version(),
> + meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
> + build_by_default: get_option('enable_docs'),
> + install: get_option('enable_docs'),
> + install_dir: htmldir)
When custom.css is copied?
> +# Napoleon enables the Google format of Python doscstrings, used in DTS.
> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
What happens if napoleon and intersphinx are not available
when building basic DPDK doc without DTS?
* Re: [PATCH v15 5/5] dts: add API doc generation
2024-08-07 10:41 ` Thomas Monjalon
@ 2024-08-07 12:03 ` Juraj Linkeš
2024-08-07 12:27 ` Thomas Monjalon
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-07 12:03 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 7. 8. 2024 12:41, Thomas Monjalon wrote:
> 06/08/2024 17:19, Juraj Linkeš:
>> +"""Utilities for DTS dependencies.
>> +
>> +The module can be used as an executable script,
>> +which verifies that the running Python version meets the version requirement of DTS.
>> +The script returns the standard exit codes in this mode (0 is success, 1 is failure).
>
> Is it returning the list of dependencies for generating doc?
>
It's not worded right. When run as a script, it exits with the standard
exit codes; it doesn't return anything.
The function below is what returns the list of dependencies.
>> +
>> +The module also contains a function, get_missing_imports,
>> +which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
>> +a returns a list of module names used in an import statement that are missing.
>
> typo? a -> and
Ack.
>
> [...]
>> +get_dts_deps = py3 + files('get-dts-deps.py')
>
> deps for runtime or doc?
> may be good to specify in the name
>
It's getting both, actually.
Now that I think about it, we don't need to get the docs dependencies,
as we check for Sphinx elsewhere. We really only need to get the runtime
dependencies and mock what's missing (that is add those to the
autodoc_mock_imports config option).
I think it makes sense to change the script and rename it to
get-dts-runtime-deps.py (and the variable).
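As an aside, the hyphen in the file name is fine for conf.py because
importlib.import_module takes the module name as a plain string; a minimal
standalone sketch (hypothetical module name and stub function, not the real
script):

```python
import importlib
import os
import sys
import tempfile

# A file name with a hyphen can't be used with the "import" statement,
# but importlib.import_module accepts it as a string module name.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "my-hyphen-mod.py"), "w") as f:
    f.write("def get_missing_imports():\n    return []\n")

sys.path.append(tmp)
mod = importlib.import_module("my-hyphen-mod")
missing = mod.get_missing_imports()
```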
>
>> --- /dev/null
>> +++ b/doc/api/dts/custom.css
>> @@ -0,0 +1 @@
>> +../../guides/custom.css
>> \ No newline at end of file
>
> Is it a link? Why?
>
call-sphinx-build.py copies the custom.css file. I added a link to
preserve the look.
>> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
>> +dts_api_html = custom_target('dts_api_html',
>> + output: 'html',
>> + command: [sphinx_wrapper, sphinx, meson.project_version(),
>> + meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
>> + build_by_default: get_option('enable_docs'),
>> + install: get_option('enable_docs'),
>> + install_dir: htmldir)
>
> When custom.css is copied?
>
I'm not sure what you're asking here. The call-sphinx-build.py copies it
during the build and it's also copied during the install phase.
>
>
>> +# Napoleon enables the Google format of Python doscstrings, used in DTS.
>> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
>> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
>
> What happens if napoleon and intersphinx are not available
> when building basic DPDK doc without DTS?
>
>
napoleon was added in version 1.3 and intersphinx in 0.5, so I didn't
think of testing that.
I tried adding a non-existent extension to the list and got this error:
Extension error:
Could not import extension sphinx.ext.foo (exception: No module named
'sphinx.ext.foo')
I could add version checks for each of the extensions.
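A simple guard is also possible; a minimal sketch (hypothetical helper, not
part of the patch) that only enables extensions which can actually be
imported:

```python
import importlib.util

def available(mod_names):
    """Return only the modules that can actually be imported."""
    avail = []
    for name in mod_names:
        try:
            # find_spec probes importability without importing the module
            if importlib.util.find_spec(name) is not None:
                avail.append(name)
        except ModuleNotFoundError:
            # raised when a parent package (e.g. sphinx) is missing
            pass
    return avail

# In conf.py, the extension list could then be guarded like:
extensions = available(['sphinx.ext.napoleon', 'sphinx.ext.intersphinx'])
```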
* Re: [PATCH v15 5/5] dts: add API doc generation
2024-08-07 12:03 ` Juraj Linkeš
@ 2024-08-07 12:27 ` Thomas Monjalon
2024-08-07 13:12 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-08-07 12:27 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
07/08/2024 14:03, Juraj Linkeš:
> On 7. 8. 2024 12:41, Thomas Monjalon wrote:
> > 06/08/2024 17:19, Juraj Linkeš:
> > [...]
> >> +get_dts_deps = py3 + files('get-dts-deps.py')
> >
> > deps for runtime or doc?
> > may be good to specify in the name
> >
>
> It's getting both, actually.
> Now that I think about it, we don't need to get the docs dependencies,
> as we check for Sphinx elsewhere. We really only need to get the runtime
> dependencies and mock what's missing (that is add those to the
> autodoc_mock_imports config option).
>
> I think it makes sense to change the script and rename it to
> get-dts-runtime-deps.py (and the variable).
OK
> >> --- /dev/null
> >> +++ b/doc/api/dts/custom.css
> >> @@ -0,0 +1 @@
> >> +../../guides/custom.css
> >> \ No newline at end of file
> >
> > Is it a link? Why?
> >
>
> call-sphinx-build.py copies the custom.css file. I added a link to
> preserve the look.
>
> >> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
> >> +dts_api_html = custom_target('dts_api_html',
> >> + output: 'html',
> >> + command: [sphinx_wrapper, sphinx, meson.project_version(),
> >> + meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
> >> + build_by_default: get_option('enable_docs'),
> >> + install: get_option('enable_docs'),
> >> + install_dir: htmldir)
> >
> > When custom.css is copied?
>
> I'm not sure what you're asking here. The call-sphinx-build.py copies it
> during the build and it's also copied during the install phase.
The file is copied in _static dir of sphinx guides.
How does it work for DTS API?
> >> +# Napoleon enables the Google format of Python doscstrings, used in DTS.
> >> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
> >> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
> >
> > What happens if napoleon and intersphinx are not available
> > when building basic DPDK doc without DTS?
>
> napoleon was added in version 1.3 and intersphinx in 0.5, so I didn't
> think of testing that.
>
> I tried adding a non-existent extension to the list and got this error:
> Extension error:
> Could not import extension sphinx.ext.foo (exception: No module named
> 'sphinx.ext.foo')
>
> I could add version checks for each of the extensions.
My concern is to allow generating DPDK doc without DTS,
without the new extra dependencies.
* Re: [PATCH v15 5/5] dts: add API doc generation
2024-08-07 12:27 ` Thomas Monjalon
@ 2024-08-07 13:12 ` Juraj Linkeš
2024-08-08 12:27 ` Thomas Monjalon
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-07 13:12 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 7. 8. 2024 14:27, Thomas Monjalon wrote:
> 07/08/2024 14:03, Juraj Linkeš:
>> On 7. 8. 2024 12:41, Thomas Monjalon wrote:
>>> 06/08/2024 17:19, Juraj Linkeš:
>>> [...]
>>>> +get_dts_deps = py3 + files('get-dts-deps.py')
>>>
>>> deps for runtime or doc?
>>> may be good to specify in the name
>>>
>>
>> It's getting both, actually.
>> Now that I think about it, we don't need to get the docs dependencies,
>> as we check for Sphinx elsewhere. We really only need to get the runtime
>> dependencies and mock what's missing (that is add those to the
>> autodoc_mock_imports config option).
>>
>> I think it makes sense to change the script and rename it to
>> get-dts-runtime-deps.py (and the variable).
>
> OK
>
>>>> --- /dev/null
>>>> +++ b/doc/api/dts/custom.css
>>>> @@ -0,0 +1 @@
>>>> +../../guides/custom.css
>>>> \ No newline at end of file
>>>
>>> Is it a link? Why?
>>>
>>
>> call-sphinx-build.py copies the custom.css file. I added a link to
>> preserve the look.
>>
>>>> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
>>>> +dts_api_html = custom_target('dts_api_html',
>>>> + output: 'html',
>>>> + command: [sphinx_wrapper, sphinx, meson.project_version(),
>>>> + meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
>>>> + build_by_default: get_option('enable_docs'),
>>>> + install: get_option('enable_docs'),
>>>> + install_dir: htmldir)
>>>
>>> When custom.css is copied?
>>
>> I'm not sure what you're asking here. The call-sphinx-build.py copies it
>> during the build and it's also copied during the install phase.
>
> The file is copied in _static dir of sphinx guides.
> How does it work for DTS API?
>
It works the same way. shutil.copyfile (which is used to copy the file)
follows symlinks, so DTS API gets a copy in its _static dir
(doc/api/dts/html/_static/css).
I did it this way to preserve the style in case there's something in the
css file that applies to both DPDK and DTS API docs.
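For illustration, a small standalone demonstration (temporary files only, not
part of the patch) of shutil.copyfile dereferencing a symlink:

```python
import os
import shutil
import tempfile

# shutil.copyfile follows symlinks: copying via a symlink produces a
# regular file with the link target's content, which is why the
# doc/api/dts/custom.css symlink works with call-sphinx-build.py.
tmp = tempfile.mkdtemp()
target = os.path.join(tmp, "custom.css")
with open(target, "w") as f:
    f.write("body { color: black; }\n")

link = os.path.join(tmp, "link.css")
os.symlink(target, link)

dest = os.path.join(tmp, "copied.css")
shutil.copyfile(link, dest)  # dereferences the symlink

with open(dest) as f:
    content = f.read()
```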
>
>>>> +# Napoleon enables the Google format of Python doscstrings, used in DTS.
>>>> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
>>>> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
>>>
>>> What happens if napoleon and intersphinx are not available
>>> when building basic DPDK doc without DTS?
>>
>> napoleon was added in version 1.3 and intersphinx in 0.5, so I didn't
>> think of testing that.
>>
>> I tried adding a non-existent extension to the list and got this error:
>> Extension error:
>> Could not import extension sphinx.ext.foo (exception: No module named
>> 'sphinx.ext.foo')
>>
>> I could add version checks for each of the extensions.
>
> My concern is to allow generating DPDK doc without DTS,
> without the new extra dependencies.
>
Ok, I think putting all of these into the DTS if branch ("if
environ.get('DTS_BUILD'):") would make sense then.
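Something along these lines (a rough sketch of the gating, not the final
patch):

```python
import os

def dts_extensions(env):
    """Extra Sphinx extensions enabled only for the DTS API doc build,
    so a plain DPDK doc build never needs them installed."""
    if env.get('DTS_BUILD'):
        return ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
    return []

base_build = dts_extensions({})                  # plain DPDK doc build
dts_build = dts_extensions({'DTS_BUILD': 'y'})   # DTS API doc build
```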
* Re: [PATCH v15 5/5] dts: add API doc generation
2024-08-07 13:12 ` Juraj Linkeš
@ 2024-08-08 12:27 ` Thomas Monjalon
0 siblings, 0 replies; 393+ messages in thread
From: Thomas Monjalon @ 2024-08-08 12:27 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
07/08/2024 15:12, Juraj Linkeš:
> On 7. 8. 2024 14:27, Thomas Monjalon wrote:
> > 07/08/2024 14:03, Juraj Linkeš:
> >> On 7. 8. 2024 12:41, Thomas Monjalon wrote:
> >>> 06/08/2024 17:19, Juraj Linkeš:
> >>>> --- /dev/null
> >>>> +++ b/doc/api/dts/custom.css
> >>>> @@ -0,0 +1 @@
> >>>> +../../guides/custom.css
> >>>> \ No newline at end of file
> >>>
> >>> Is it a link? Why?
> >>>
> >>
> >> call-sphinx-build.py copies the custom.css file. I added a link to
> >> preserve the look.
> >>
> >>>> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
> >>>> +dts_api_html = custom_target('dts_api_html',
> >>>> + output: 'html',
> >>>> + command: [sphinx_wrapper, sphinx, meson.project_version(),
> >>>> + meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
> >>>> + build_by_default: get_option('enable_docs'),
> >>>> + install: get_option('enable_docs'),
> >>>> + install_dir: htmldir)
> >>>
> >>> When custom.css is copied?
> >>
> >> I'm not sure what you're asking here. The call-sphinx-build.py copies it
> >> during the build and it's also copied during the install phase.
> >
> > The file is copied in _static dir of sphinx guides.
> > How does it work for DTS API?
>
> It works the same way. shutil.copyfile (which is used to copy the file)
> follows symlinks, so DTS API gets a copy in its _static dir
> (doc/api/dts/html/_static/css).
> I did it this way to preserve the style in case there's something in the
> css file that applies to both DPDK and DTS API docs.
Oh yes.
I wrote it 2 weeks ago and already forgot it works with any sphinx path.
> >>>> +# Napoleon enables the Google format of Python doscstrings, used in DTS.
> >>>> +# Intersphinx allows linking to external projects, such as Python docs, also used in DTS.
> >>>> +extensions = ['sphinx.ext.napoleon', 'sphinx.ext.intersphinx']
> >>>
> >>> What happens if napoleon and intersphinx are not available
> >>> when building basic DPDK doc without DTS?
> >>
> >> napoleon was added in version 1.3 and intersphinx in 0.5, so I didn't
> >> think of testing that.
> >>
> >> I tried adding a non-existent extension to the list and got this error:
> >> Extension error:
> >> Could not import extension sphinx.ext.foo (exception: No module named
> >> 'sphinx.ext.foo')
> >>
> >> I could add version checks for each of the extensions.
> >
> > My concern is to allow generating DPDK doc without DTS,
> > without the new extra dependencies.
> >
>
> Ok, I think putting all of these into the DTS if branch ("if
> environ.get('DTS_BUILD'):") would make sense then.
OK
* [PATCH v16 0/5] API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (15 preceding siblings ...)
2024-08-06 15:19 ` [PATCH v15 0/5] API docs generation Juraj Linkeš
@ 2024-08-08 8:54 ` Juraj Linkeš
2024-08-08 8:54 ` [PATCH v16 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-14 15:05 ` [PATCH v17 0/5] API docs generation Juraj Linkeš
` (2 subsequent siblings)
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-08 8:54 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with a
slightly modified sidebar configuration placed in an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> dts-doc
Python3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v10:
Fix dts doc generation issue: Only copy the custom rss file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need to
be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
v13:
Fixed build error:
TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
v14:
Fixed install error:
ERROR: File 'dts/doc/html' could not be found
This required me to put the built docs into dts/doc which is outside the
DPDK API doc dir, resulting in linking between DPDK and DTS api docs not
working properly. I addressed this by adding a symlink to the build dir.
This way the link works after installing the docs and the symlink is
just one extra file in the build dir.
v15:
Moved DTS API sources to doc/api/dts. This simplifies a lot of things in
the build, but mainly makes a lot of sense. Now the source, build and
install paths are the same so there isn't any need for any symlinks or
other workarounds.
Also added a symlink to the custom.css file so that it works with
call-sphinx-build.py without any modifications.
v16:
Renamed the dependency Python file to get-dts-runtime-deps.py and modified
it to only get runtime dependencies. We don't need to check docs
dependencies (Sphinx) as we don't need to mock those.
Also moved all new Sphinx configuration into the DTS if branch to make
sure it won't ever affect the DPDK doc build.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: replace the or operator in third party types
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 72 +++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/custom.css | 1 +
doc/api/dts/framework.config.rst | 12 +
doc/api/dts/framework.config.types.rst | 6 +
doc/api/dts/framework.exception.rst | 6 +
doc/api/dts/framework.logger.rst | 6 +
doc/api/dts/framework.params.eal.rst | 6 +
doc/api/dts/framework.params.rst | 14 +
doc/api/dts/framework.params.testpmd.rst | 6 +
doc/api/dts/framework.params.types.rst | 6 +
doc/api/dts/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
doc/api/dts/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
doc/api/dts/framework.runner.rst | 6 +
doc/api/dts/framework.settings.rst | 6 +
doc/api/dts/framework.test_result.rst | 6 +
doc/api/dts/framework.test_suite.rst | 6 +
doc/api/dts/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
doc/api/dts/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
doc/api/dts/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
doc/api/dts/framework.testbed_model.rst | 26 +
.../dts/framework.testbed_model.sut_node.rst | 6 +
.../dts/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
doc/api/dts/framework.utils.rst | 6 +
doc/api/dts/index.rst | 43 ++
doc/api/dts/meson.build | 29 +
doc/api/meson.build | 13 +
doc/guides/conf.py | 44 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/tools/dts.rst | 39 +-
doc/meson.build | 1 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
.../interactive_remote_session.py | 3 +-
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
58 files changed, 1055 insertions(+), 22 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
create mode 100644 doc/api/dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v16 1/5] dts: update params and parser docstrings
2024-08-08 8:54 ` [PATCH v16 0/5] API docs generation Juraj Linkeš
@ 2024-08-08 8:54 ` Juraj Linkeš
2024-08-09 18:31 ` Jeremy Spewock
2024-08-08 8:54 ` [PATCH v16 2/5] dts: replace the or operator in third party types Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-08 8:54 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
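Most of these warnings come down to the same two reST details: the code directive needs a space after the leading dots (`.. code::`, not `..code::`), and a blank line must separate a directive or literal block from the indented body that follows. A minimal well-formed docstring fragment (illustrative only, not taken from the patch):

```rst
Example:
    .. code:: python

       def create_testpmd(**kwargs):
           ...
```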
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* Re: [PATCH v16 1/5] dts: update params and parser docstrings
2024-08-08 8:54 ` [PATCH v16 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-09 18:31 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-09 18:31 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Thu, Aug 8, 2024 at 4:54 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>
> Address a few errors reported by Sphinx when generating documentation:
> framework/params/__init__.py:docstring of framework.params.modify_str:3:
> WARNING: Inline interpreted text or phrase reference start-string
> without end-string.
> framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
> WARNING: Definition list ends without a blank line; unexpected
> unindent.
> framework/params/types.py:docstring of framework.params.types:8:
> WARNING: Inline strong start-string without end-string.
> framework/params/types.py:docstring of framework.params.types:9:
> WARNING: Inline strong start-string without end-string.
> framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
> Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
> Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
> Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
> ERROR: Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
> WARNING: Block quote ends without a blank line; unexpected unindent.
>
> Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
> Fixes: d70159cb62f5 ("dts: add params manipulation module")
> Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
> Fixes: 818fe14e3422 ("dts: add parsing utility module")
> Cc: luca.vizzarro@arm.com
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
> ---
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
* [PATCH v16 2/5] dts: replace the or operator in third party types
2024-08-08 8:54 ` [PATCH v16 0/5] API docs generation Juraj Linkeš
2024-08-08 8:54 ` [PATCH v16 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-08 8:54 ` Juraj Linkeš
2024-08-09 19:03 ` Jeremy Spewock
2024-08-08 8:54 ` [PATCH v16 3/5] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-08 8:54 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
When the DTS dependencies are not installed while building the DTS API
documentation, the or operator produces errors when used with types from
those libraries:
autodoc: failed to import module 'remote_session' from module
'framework'; the following exception was raised:
Traceback (most recent call last):
...
TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
The third-party type here is Transport from the paramiko library.
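The failure mode can be reproduced without Sphinx: when a mocked stand-in replaces the real class, the PEP 604 ``|`` operator raises at annotation-evaluation time, while ``typing.Union`` only requires its arguments to be callable. A minimal sketch, using ``unittest.mock.Mock`` as a stand-in for the object autodoc substitutes (it is not the exact mock class autodoc uses):

```python
from typing import Union
from unittest.mock import Mock

Transport = Mock()  # stand-in for a mocked third-party import

# Plain Mock defines no __or__, so the PEP 604 union fails immediately.
try:
    _ = Transport | None
    pep604_failed = False
except TypeError:
    pep604_failed = True
print(pep604_failed)  # True

# typing.Union accepts any callable as a "type", and a Mock is callable,
# so the equivalent spelling is constructed without error.
annotation = Union[Transport, None]
print(annotation is not None)  # True
```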
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/interactive_remote_session.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 97194e6af8..4605ee14b4 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -5,6 +5,7 @@
import socket
import traceback
+from typing import Union
from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore[import-untyped]
from paramiko.ssh_exception import ( # type: ignore[import-untyped]
@@ -52,7 +53,7 @@ class InteractiveRemoteSession:
session: SSHClient
_logger: DTSLogger
_node_config: NodeConfiguration
- _transport: Transport | None
+ _transport: Union[Transport, None]
def __init__(self, node_config: NodeConfiguration, logger: DTSLogger) -> None:
"""Connect to the node during initialization.
--
2.34.1
* Re: [PATCH v16 2/5] dts: replace the or operator in third party types
2024-08-08 8:54 ` [PATCH v16 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-09 19:03 ` Jeremy Spewock
2024-08-12 7:58 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-09 19:03 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
This is a funny change I wouldn't have expected the series to need. I
don't doubt that it does need it of course and I don't think there is
any harm in the change, but, out of curiosity, is this because the or
operator is coming from one of the dependencies we are installing? I
thought it was just shipped with later Python versions. Regardless,
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
* Re: [PATCH v16 2/5] dts: replace the or operator in third party types
2024-08-09 19:03 ` Jeremy Spewock
@ 2024-08-12 7:58 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-12 7:58 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 9. 8. 2024 21:03, Jeremy Spewock wrote:
> This is a funny change I wouldn't have expected the series to need. I
> don't doubt that it does need it of course and I don't think there is
> any harm in the change, but, out of curiosity, is this because the or
> operator is coming from one of the dependencies we are installing? I
> thought it was just shipped with later Python versions. Regardless,
I think this happens when Paramiko is not installed, as is likely the
case in CI. When Paramiko is installed, the issue doesn't crop up.
I added the autodoc_mock_imports [0] configuration so that dependencies
don't have to be installed. I don't know how exactly autodoc gets around
the imports, but that seems to be what's causing the error when Paramiko
is missing.
[0]
https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#confval-autodoc_mock_imports
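For reference, the mechanism looks like this in a Sphinx conf.py (the module names listed here are illustrative, not the exact list DTS uses):

```python
# Sphinx conf.py fragment: autodoc replaces these imports with mock
# objects, so the documented modules can be imported without the real
# dependencies being installed.
autodoc_mock_imports = ["paramiko", "fabric", "scapy"]
```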
>
> Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
* [PATCH v16 3/5] dts: add doc generation dependencies
2024-08-08 8:54 ` [PATCH v16 0/5] API docs generation Juraj Linkeš
2024-08-08 8:54 ` [PATCH v16 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-08 8:54 ` [PATCH v16 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-08 8:54 ` Juraj Linkeš
2024-08-09 19:04 ` Jeremy Spewock
2024-08-08 8:54 ` [PATCH v16 4/5] dts: add API doc sources Juraj Linkeš
2024-08-08 8:54 ` [PATCH v16 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-08 8:54 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including the required Python version, should be
satisfied. This is not a hard requirement, as imports from dependencies
may be mocked with the autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
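In Poetry's pyproject.toml, an optional dependency group is marked with ``optional = true`` and is only installed on request (e.g. ``poetry install --with docs``). A sketch of the shape, with assumed package names and unpinned versions rather than the exact constraints from the patch:

```toml
[tool.poetry.group.docs]
optional = true

[tool.poetry.group.docs.dependencies]
sphinx = "*"
pyelftools = "*"
```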
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
* Re: [PATCH v16 3/5] dts: add doc generation dependencies
2024-08-08 8:54 ` [PATCH v16 3/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-09 19:04 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-09 19:04 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Thu, Aug 8, 2024 at 4:54 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>
> Sphinx imports every Python module (through the autodoc extension)
> when generating documentation from docstrings, meaning all DTS
> dependencies, including the Python version, should be satisfied. This is
> not a hard requirement, as imports of dependencies may be mocked via the
> autodoc_mock_imports autodoc option.
> In case DTS developers want to use a Sphinx installation from their
> virtualenv, we provide an optional Poetry group for doc generation. The
> pyelftools package is there so that meson picks up the correct Python
> installation, as pyelftools is required by the build system.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
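As the commit message above notes, autodoc imports every documented module, and dependencies that are unavailable at doc-build time can be mocked instead. A minimal, hypothetical conf.py fragment showing the option (the mocked module names here are illustrative examples, not taken from the actual DTS conf.py):

```python
# Hypothetical Sphinx conf.py fragment -- the mocked names are examples
# only, not the real DTS configuration.
extensions = ["sphinx.ext.autodoc"]

# Modules listed here are replaced by mock objects instead of being
# imported, so doc builds don't require the full DTS dependency set.
autodoc_mock_imports = ["fabric", "invoke", "paramiko"]
```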
* [PATCH v16 4/5] dts: add API doc sources
2024-08-08 8:54 ` [PATCH v16 0/5] API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-08 8:54 ` [PATCH v16 3/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-08 8:54 ` Juraj Linkeš
2024-08-08 8:54 ` [PATCH v16 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-08 8:54 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as controlling the order
of the modules or changing their headers.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/framework.config.rst | 12 ++++++
doc/api/dts/framework.config.types.rst | 6 +++
doc/api/dts/framework.exception.rst | 6 +++
doc/api/dts/framework.logger.rst | 6 +++
doc/api/dts/framework.params.eal.rst | 6 +++
doc/api/dts/framework.params.rst | 14 ++++++
doc/api/dts/framework.params.testpmd.rst | 6 +++
doc/api/dts/framework.params.types.rst | 6 +++
doc/api/dts/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
doc/api/dts/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
doc/api/dts/framework.runner.rst | 6 +++
doc/api/dts/framework.settings.rst | 6 +++
doc/api/dts/framework.test_result.rst | 6 +++
doc/api/dts/framework.test_suite.rst | 6 +++
doc/api/dts/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
doc/api/dts/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
doc/api/dts/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
doc/api/dts/framework.testbed_model.rst | 26 +++++++++++
.../dts/framework.testbed_model.sut_node.rst | 6 +++
.../dts/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
doc/api/dts/framework.utils.rst | 6 +++
doc/api/dts/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
new file mode 120000
index 0000000000..5978642d76
--- /dev/null
+++ b/doc/api/dts/conf_yaml_schema.json
@@ -0,0 +1 @@
+../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/doc/api/dts/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/doc/api/dts/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.exception.rst b/doc/api/dts/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/doc/api/dts/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.logger.rst b/doc/api/dts/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/doc/api/dts/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.eal.rst b/doc/api/dts/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/doc/api/dts/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.rst b/doc/api/dts/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/doc/api/dts/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/doc/api/dts/framework.params.testpmd.rst b/doc/api/dts/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/doc/api/dts/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.types.rst b/doc/api/dts/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/doc/api/dts/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.parser.rst b/doc/api/dts/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/doc/api/dts/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.dpdk_shell.rst b/doc/api/dts/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_remote_session.rst b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_shell.rst b/doc/api/dts/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.python_shell.rst b/doc/api/dts/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.remote_session.rst b/doc/api/dts/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.rst b/doc/api/dts/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/doc/api/dts/framework.remote_session.ssh_session.rst b/doc/api/dts/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.testpmd_shell.rst b/doc/api/dts/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.runner.rst b/doc/api/dts/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/doc/api/dts/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.settings.rst b/doc/api/dts/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/doc/api/dts/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_result.rst b/doc/api/dts/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/doc/api/dts/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_suite.rst b/doc/api/dts/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/doc/api/dts/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.cpu.rst b/doc/api/dts/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.linux_session.rst b/doc/api/dts/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.node.rst b/doc/api/dts/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.os_session.rst b/doc/api/dts/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.port.rst b/doc/api/dts/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.posix_session.rst b/doc/api/dts/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.rst b/doc/api/dts/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/doc/api/dts/framework.testbed_model.sut_node.rst b/doc/api/dts/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.tg_node.rst b/doc/api/dts/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.virtual_device.rst b/doc/api/dts/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.utils.rst b/doc/api/dts/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/doc/api/dts/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/index.rst b/doc/api/dts/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/doc/api/dts/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v16 5/5] dts: add API doc generation
2024-08-08 8:54 ` [PATCH v16 0/5] API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-08 8:54 ` [PATCH v16 4/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-08 8:54 ` Juraj Linkeš
2024-08-09 19:04 ` Jeremy Spewock
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-08 8:54 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style, with one
DTS-specific option that modifies how the sidebar displays the content.
There is also some Sphinx configuration related to Python docstrings.
All of the new configuration is in a conditional block, applied only
when the DTS API docs are built, so the DPDK doc build is unchanged.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There is one requirement for building DTS docs: the same Python version
as DTS or higher, because Sphinx's autodoc extension imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The list of
packages to mock is derived from the DTS pyproject.toml file.
And finally, the DTS API docs can be accessed from the DPDK API doxygen
page.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
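As a hedged illustration of the Google docstring format [0] that the
sphinx.ext.napoleon extension parses, here is a hypothetical function
(not taken from DTS; the name and parameters are invented for the
example):

```python
def connect(host: str, timeout: float = 5.0) -> bool:
    """Open a session to the remote host.

    A hypothetical function showing the Google docstring format:
    napoleon turns the ``Args`` and ``Returns`` sections below into
    proper Sphinx field lists in the generated docs.

    Args:
        host: Hostname or IP address of the node.
        timeout: Seconds to wait before giving up.

    Returns:
        True if the session was established.
    """
    # Trivial stand-in logic so the example is runnable.
    return bool(host) and timeout > 0

print(connect("sut1.example.com"))  # → True
```

With ``napoleon_numpy_docstring = False`` (as set in conf.py below),
only the Google style is recognized.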
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 72 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/custom.css | 1 +
doc/api/dts/meson.build | 29 +++++++++
doc/api/meson.build | 13 ++++
doc/guides/conf.py | 44 +++++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 ++
doc/guides/tools/dts.rst | 39 +++++++++++-
doc/meson.build | 1 +
13 files changed, 211 insertions(+), 2 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..45724ffcd4 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,8 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+if src.find('dts') != -1:
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
diff --git a/buildtools/get-dts-runtime-deps.py b/buildtools/get-dts-runtime-deps.py
new file mode 100755
index 0000000000..5d629dd09d
--- /dev/null
+++ b/buildtools/get-dts-runtime-deps.py
@@ -0,0 +1,72 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script exits with the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime and doc generation dependencies in the DTS pyproject.toml file
+and returns a list of module names used in an import statement that are missing.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+from packaging.version import Version
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {'invoke': '>=1.3', 'paramiko': '>=2.4'}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ return {dep: deps_section[dep].strip('"\'') for dep in deps_section}
+
+
+def get_missing_imports():
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, req_ver in (req_deps | _EXTRA_DEPS).items():
+ try:
+ req_ver = Version(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = Version(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(req_dep.lower().replace('-', '_'))
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = Version(platform.python_version())
+ req_ver = Version(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
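The interpreter version gate implemented by the script above can be
sketched in isolation; this sketch uses plain tuple comparison instead
of packaging.Version, so it is an illustration only, not the script's
actual logic:

```python
import platform


def python_meets_requirement(required: str) -> bool:
    # Compare the running interpreter against a "major.minor"
    # requirement string, e.g. the python entry from pyproject.toml
    # with the comparison characters already stripped.
    req = tuple(int(part) for part in required.split('.'))
    cur = tuple(int(part) for part in platform.python_version_tuple()[:len(req)])
    return cur >= req


print(python_meets_requirement('3.0'))  # → True on any Python 3
```

In the script itself the same comparison is done with
packaging.Version, which also handles pre-release and local version
segments that a bare tuple comparison would not.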
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..6b938d767c 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_runtime_deps = py3 + files('get-dts-runtime-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/dts/custom.css b/doc/api/dts/custom.css
new file mode 120000
index 0000000000..3c9480c4a0
--- /dev/null
+++ b/doc/api/dts/custom.css
@@ -0,0 +1 @@
+../../guides/custom.css
\ No newline at end of file
diff --git a/doc/api/dts/meson.build b/doc/api/dts/meson.build
new file mode 100644
index 0000000000..b4b6f9d269
--- /dev/null
+++ b/doc/api/dts/meson.build
@@ -0,0 +1,29 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_runtime_deps, check: false).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+extra_sphinx_args = ['-E', '-c', join_paths(doc_source_dir, 'guides')]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+
+dts_doc_targets += dts_api_html
+dts_doc_target_names += 'DTS_API_HTML'
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..788129336b 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,18 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+dts_doc_targets = []
+dts_doc_target_names = []
+subdir('dts')
+
+if dts_doc_targets.length() == 0
+ dts_message = 'No DTS docs targets found'
+else
+ dts_message = 'Building DTS docs:'
+endif
+run_target('dts-doc', command: [echo, dts_message, dts_doc_target_names],
+ depends: dts_doc_targets)
+
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -40,6 +52,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
if get_option('werror')
cdata.set('WARN_AS_ERROR', 'YES')
endif
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
# configure HTML Doxygen run
html_cdata = configuration_data()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..d7f3030838 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -58,6 +58,48 @@
("tools/devbind", "dpdk-devbind",
"check device status and bind/unbind them from drivers", "", 8)]
+# DTS API docs additional configuration
+if environ.get('DTS_BUILD'):
+ extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
+ # Napoleon enables the Google format of Python docstrings.
+ napoleon_numpy_docstring = False
+ napoleon_attr_annotations = True
+ napoleon_preprocess_types = True
+
+ # Autodoc pulls documentation from code.
+ autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+ }
+ autodoc_class_signature = 'separated'
+ autodoc_typehints = 'both'
+ autodoc_typehints_format = 'short'
+ autodoc_typehints_description_target = 'documented'
+
+ # Intersphinx allows linking to external projects, such as Python docs.
+ intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+ # DTS docstring options.
+ add_module_names = False
+ toc_object_entries = True
+ toc_object_entries_show_parents = 'hide'
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-runtime-deps').get_missing_imports()
+
# ####### :numref: fallback ########
# The following hook functions add some simple handling for the :numref:
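The conf.py hunk above loads the dependency script with
importlib.import_module because its file name, get-dts-runtime-deps,
contains hyphens and so cannot appear in a plain ``import`` statement.
The same technique, shown here with a stdlib stand-in module rather
than the buildtools script:

```python
import importlib

# importlib.import_module takes the module name as a string, so names
# that are not valid Python identifiers (e.g. hyphenated file names on
# sys.path) can still be imported. 'json' stands in for
# 'get-dts-runtime-deps' here.
json_mod = importlib.import_module('json')
print(json_mod.dumps({'dts': True}))  # → {"dts": true}
```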
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..18cc7908cf 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to the DTS API doc sources in ``doc/api/dts``.
+
+The code must also be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build dts-doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/doc/meson.build b/doc/meson.build
index 6f74706aa2..1e0cfa4127 100644
--- a/doc/meson.build
+++ b/doc/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_source_dir = meson.current_source_dir()
doc_targets = []
doc_target_names = []
subdir('api')
--
2.34.1
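The `get-dts-runtime-deps` module imported in the conf.py hunk earlier in this patch exposes a `get_missing_imports()` function. Its real implementation lives in buildtools; a minimal sketch of the idea (the package list and function body here are illustrative assumptions, not the actual DPDK script) might be:

```python
import importlib

def get_missing_imports(required=("paramiko", "scapy", "fabric")):
    """Collect the modules from `required` that fail to import.

    The returned list is suitable for Sphinx's autodoc_mock_imports
    option, which tells autodoc to replace these imports with mocks
    so that docstring extraction works without the real packages.
    """
    missing = []
    for name in required:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing
```

With something like this in place, `autodoc_mock_imports = get_missing_imports()` lets Sphinx import the DTS modules even when their runtime dependencies are not installed.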
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v16 5/5] dts: add API doc generation
2024-08-08 8:54 ` [PATCH v16 5/5] dts: add API doc generation Juraj Linkeš
@ 2024-08-09 19:04 ` Jeremy Spewock
2024-08-12 8:08 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-09 19:04 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
Looks good to me, I just had a few comments about copyright lines but
the actual functionality seems like it's there to me:
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
On Thu, Aug 8, 2024 at 4:55 AM Juraj Linkeš <juraj.linkes@pantheon.tech> wrote:
>
> The tool used to generate DTS API docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style with one
> DTS-specific configuration (so that the DPDK docs are unchanged) that
> modifies how the sidebar displays the content. There's other Sphinx
> configuration related to Python docstrings which doesn't affect DPDK doc
> build. All new configuration is in a conditional block, applied only
> when DTS API docs are built to not interfere with DPDK doc build.
>
> Sphinx generates the documentation from Python docstrings. The docstring
> format is the Google format [0] which requires the sphinx.ext.napoleon
> extension. The other extension, sphinx.ext.intersphinx, enables linking
> to objects in external documentation, such as the Python documentation.
>
> There is one requirement for building DTS docs - the same Python version
> as DTS or higher, because Sphinx's autodoc extension imports the code.
>
> The dependencies needed to import the code don't have to be satisfied,
> as the autodoc extension allows us to mock the imports. The missing
> packages are taken from the DTS pyproject.toml file.
>
> And finally, the DTS API docs can be accessed from the DPDK API doxygen
> page.
>
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
<snip>
> --- /dev/null
> +++ b/doc/api/dts/meson.build
> @@ -0,0 +1,29 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
Should this be 2023 or updated now that we're in 2024? Probably
doesn't matter too much either way.
> +
> +sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
> +if not sphinx.found()
> + subdir_done()
> +endif
> +
> +python_ver_satisfied = run_command(get_dts_runtime_deps, check: false).returncode()
> +if python_ver_satisfied != 0
> + subdir_done()
> +endif
> +
> +extra_sphinx_args = ['-E', '-c', join_paths(doc_source_dir, 'guides')]
> +if get_option('werror')
> + extra_sphinx_args += '-W'
> +endif
> +
> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
> +dts_api_html = custom_target('dts_api_html',
> + output: 'html',
> + command: [sphinx_wrapper, sphinx, meson.project_version(),
> + meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
> + build_by_default: get_option('enable_docs'),
> + install: get_option('enable_docs'),
> + install_dir: htmldir)
> +
> +dts_doc_targets += dts_api_html
> +dts_doc_target_names += 'DTS_API_HTML'
> diff --git a/doc/api/meson.build b/doc/api/meson.build
> index 5b50692df9..788129336b 100644
> --- a/doc/api/meson.build
> +++ b/doc/api/meson.build
> @@ -1,6 +1,18 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
>
Should you add your copyright to the top of this file now that you've
also modified it?
> +dts_doc_targets = []
> +dts_doc_target_names = []
> +subdir('dts')
> +
> +if dts_doc_targets.length() == 0
> + dts_message = 'No DTS docs targets found'
> +else
> + dts_message = 'Building DTS docs:'
> +endif
> +run_target('dts-doc', command: [echo, dts_message, dts_doc_target_names],
> + depends: dts_doc_targets)
> +
> doxygen = find_program('doxygen', required: get_option('enable_docs'))
>
> if not doxygen.found()
> @@ -40,6 +52,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
> if get_option('werror')
> cdata.set('WARN_AS_ERROR', 'YES')
> endif
> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
>
> # configure HTML Doxygen run
> html_cdata = configuration_data()
> diff --git a/doc/guides/conf.py b/doc/guides/conf.py
> index 0f7ff5282d..d7f3030838 100644
> --- a/doc/guides/conf.py
<snip>
>
* Re: [PATCH v16 5/5] dts: add API doc generation
2024-08-09 19:04 ` Jeremy Spewock
@ 2024-08-12 8:08 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-12 8:08 UTC (permalink / raw)
To: Jeremy Spewock
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
> <snip>
>> --- /dev/null
>> +++ b/doc/api/dts/meson.build
>> @@ -0,0 +1,29 @@
>> +# SPDX-License-Identifier: BSD-3-Clause
>> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
>
> Should this be 2023 or updated now that we're in 2024? Probably
> doesn't matter too much either way.
>
I've talked to Thomas about copyrights and in essence, the copyright is
going to be valid for a long time, so basically there's no need.
>> diff --git a/doc/api/meson.build b/doc/api/meson.build
>> index 5b50692df9..788129336b 100644
>> --- a/doc/api/meson.build
>> +++ b/doc/api/meson.build
>> @@ -1,6 +1,18 @@
>> # SPDX-License-Identifier: BSD-3-Clause
>> # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
>>
>
> Should you add your copyright to the top of this file now that you've
> also modified it?
>
Possibly, but I actually want to move almost all (or maybe all of it, if it
works) of this one level deeper, so that would make it too small of a
change to warrant the inclusion of your copyright, I think.
>> +dts_doc_targets = []
>> +dts_doc_target_names = []
>> +subdir('dts')
>> +
>> +if dts_doc_targets.length() == 0
>> + dts_message = 'No DTS docs targets found'
>> +else
>> + dts_message = 'Building DTS docs:'
>> +endif
>> +run_target('dts-doc', command: [echo, dts_message, dts_doc_target_names],
>> + depends: dts_doc_targets)
>> +
>> doxygen = find_program('doxygen', required: get_option('enable_docs'))
>>
>> if not doxygen.found()
>> @@ -40,6 +52,7 @@ cdata.set('WARN_AS_ERROR', 'NO')
>> if get_option('werror')
>> cdata.set('WARN_AS_ERROR', 'YES')
>> endif
>> +cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
>>
>> # configure HTML Doxygen run
>> html_cdata = configuration_data()
>> diff --git a/doc/guides/conf.py b/doc/guides/conf.py
>> index 0f7ff5282d..d7f3030838 100644
>> --- a/doc/guides/conf.py
> <snip>
>>
* [PATCH v17 0/5] API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (16 preceding siblings ...)
2024-08-08 8:54 ` [PATCH v16 0/5] API docs generation Juraj Linkeš
@ 2024-08-14 15:05 ` Juraj Linkeš
2024-08-14 15:05 ` [PATCH v17 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-20 13:18 ` [PATCH v18 0/5] API docs generation Juraj Linkeš
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-14 15:05 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> doc
Python 3.10 or newer is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
v10:
Fix dts doc generation issue: Only copy the custom css file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need to
be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
v13:
Fixed build error:
TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
v14:
Fixed install error:
ERROR: File 'dts/doc/html' could not be found
This required me to put the built docs into dts/doc which is outside the
DPDK API doc dir, resulting in linking between DPDK and DTS API docs not
working properly. I addressed this by adding a symlink to the build dir.
This way the link works after installing the docs and the symlink is
just one extra file in the build dir.
v15:
Moved DTS API sources to doc/api/dts. This simplifies a lot of things in
the build, but mainly makes a lot of sense. Now the source, build and
install paths are the same so there isn't any need for any symlinks or
other workarounds.
Also added a symlink to the custom.css file so that it works with
call-sphinx-build.py without any modifications.
v16:
Renamed the dependency Python file to get-dts-runtime-deps.py and modified
it to only get runtime dependencies. We don't need to check docs
dependencies (Sphinx) as we don't need to mock those.
Also moved all new Sphinx configuration into the DTS if branch to make
sure it won't ever affect the DPDK doc build.
v17:
Removed the dts-doc build target to mirror the functionality of using
-Denable_docs=true.
Moved DTS-specific meson build code to doc/api/dts/meson.build.
Added comments to get_missing_imports() and the top level docstring of
get-dts-runtime-deps.py to explain the function is there to be imported.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: replace the or operator in third party types
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 84 +++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/custom.css | 1 +
doc/api/dts/framework.config.rst | 12 +
doc/api/dts/framework.config.types.rst | 6 +
doc/api/dts/framework.exception.rst | 6 +
doc/api/dts/framework.logger.rst | 6 +
doc/api/dts/framework.params.eal.rst | 6 +
doc/api/dts/framework.params.rst | 14 +
doc/api/dts/framework.params.testpmd.rst | 6 +
doc/api/dts/framework.params.types.rst | 6 +
doc/api/dts/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
doc/api/dts/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
doc/api/dts/framework.runner.rst | 6 +
doc/api/dts/framework.settings.rst | 6 +
doc/api/dts/framework.test_result.rst | 6 +
doc/api/dts/framework.test_suite.rst | 6 +
doc/api/dts/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
doc/api/dts/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
doc/api/dts/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
doc/api/dts/framework.testbed_model.rst | 26 +
.../dts/framework.testbed_model.sut_node.rst | 6 +
.../dts/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
doc/api/dts/framework.utils.rst | 6 +
doc/api/dts/index.rst | 43 ++
doc/api/dts/meson.build | 31 ++
doc/api/meson.build | 6 +-
doc/guides/conf.py | 44 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/tools/dts.rst | 39 +-
doc/meson.build | 1 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
.../interactive_remote_session.py | 3 +-
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
58 files changed, 1061 insertions(+), 23 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
create mode 100644 doc/api/dts/meson.build
--
2.34.1
* [PATCH v17 1/5] dts: update params and parser docstrings
2024-08-14 15:05 ` [PATCH v17 0/5] API docs generation Juraj Linkeš
@ 2024-08-14 15:05 ` Juraj Linkeš
2024-08-14 18:50 ` Jeremy Spewock
2024-08-14 15:05 ` [PATCH v17 2/5] dts: replace the or operator in third party types Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-14 15:05 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* Re: [PATCH v17 1/5] dts: update params and parser docstrings
2024-08-14 15:05 ` [PATCH v17 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-14 18:50 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-14 18:50 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Wed, Aug 14, 2024 at 11:05 AM Juraj Linkeš
<juraj.linkes@pantheon.tech> wrote:
>
> Address a few errors reported by Sphinx when generating documentation:
> framework/params/__init__.py:docstring of framework.params.modify_str:3:
> WARNING: Inline interpreted text or phrase reference start-string
> without end-string.
> framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
> WARNING: Definition list ends without a blank line; unexpected
> unindent.
> framework/params/types.py:docstring of framework.params.types:8:
> WARNING: Inline strong start-string without end-string.
> framework/params/types.py:docstring of framework.params.types:9:
> WARNING: Inline strong start-string without end-string.
> framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
> Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
> Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
> Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
> ERROR: Unexpected indentation.
> framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
> WARNING: Block quote ends without a blank line; unexpected unindent.
>
> Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
> Fixes: d70159cb62f5 ("dts: add params manipulation module")
> Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
> Fixes: 818fe14e3422 ("dts: add parsing utility module")
> Cc: luca.vizzarro@arm.com
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
> ---
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
* [PATCH v17 2/5] dts: replace the or operator in third party types
2024-08-14 15:05 ` [PATCH v17 0/5] API docs generation Juraj Linkeš
2024-08-14 15:05 ` [PATCH v17 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-14 15:05 ` Juraj Linkeš
2024-08-14 18:50 ` Jeremy Spewock
2024-08-14 15:05 ` [PATCH v17 3/5] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-14 15:05 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
When the DTS dependencies are not installed when building DTS API
documentation, the or operator produces errors when used with types from
those libraries:
autodoc: failed to import module 'remote_session' from module
'framework'; the following exception was raised:
Traceback (most recent call last):
...
TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
The third-party type here is Transport from the paramiko library.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/framework/remote_session/interactive_remote_session.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 97194e6af8..4605ee14b4 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -5,6 +5,7 @@
import socket
import traceback
+from typing import Union
from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore[import-untyped]
from paramiko.ssh_exception import ( # type: ignore[import-untyped]
@@ -52,7 +53,7 @@ class InteractiveRemoteSession:
session: SSHClient
_logger: DTSLogger
_node_config: NodeConfiguration
- _transport: Transport | None
+ _transport: Union[Transport, None]
def __init__(self, node_config: NodeConfiguration, logger: DTSLogger) -> None:
"""Connect to the node during initialization.
--
2.34.1
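The root cause can be illustrated without paramiko or Sphinx's real mock machinery. The stand-in below is an assumption for illustration only: a mocked class behaves, for this purpose, like a callable that is not a usable type, which `typing.Union` tolerates but the PEP 604 `|` operator does not:

```python
from typing import Union, get_args

def fake_transport(*args, **kwargs):
    """Stand-in for a mocked third-party class: callable, but not a type."""
    return None

try:
    fake_transport | None  # PEP 604 syntax needs __or__/__ror__ support
    pep604_works = True
except TypeError:
    pep604_works = False  # this branch is taken: functions define no __or__

# typing.Union only requires its arguments to be callable, so the
# mocked stand-in is accepted and the annotation can be constructed.
union_annotation = Union[fake_transport, None]
```

This is why the patch switches the annotation from `Transport | None` to `Union[Transport, None]`: the latter can still be built when the name is a mock.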
* Re: [PATCH v17 2/5] dts: replace the or operator in third party types
2024-08-14 15:05 ` [PATCH v17 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-14 18:50 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-14 18:50 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Wed, Aug 14, 2024 at 11:05 AM Juraj Linkeš
<juraj.linkes@pantheon.tech> wrote:
>
> When the DTS dependencies are not installed when building DTS API
> documentation, the or operator produces errors when used with types from
> those libraries:
> autodoc: failed to import module 'remote_session' from module
> 'framework'; the following exception was raised:
> Traceback (most recent call last):
> ...
> TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
>
> The third part type here is Transport from the paramiko library.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
* [PATCH v17 3/5] dts: add doc generation dependencies
2024-08-14 15:05 ` [PATCH v17 0/5] API docs generation Juraj Linkeš
2024-08-14 15:05 ` [PATCH v17 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-14 15:05 ` [PATCH v17 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-14 15:05 ` Juraj Linkeš
2024-08-14 18:50 ` Jeremy Spewock
2024-08-14 15:05 ` [PATCH v17 4/5] dts: add API doc sources Juraj Linkeš
2024-08-14 15:05 ` [PATCH v17 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-14 15:05 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including Python version, should be satisfied. This is not
a hard requirement, as imports from dependencies may be mocked in the
autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v17 3/5] dts: add doc generation dependencies
2024-08-14 15:05 ` [PATCH v17 3/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-14 18:50 ` Jeremy Spewock
0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-14 18:50 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Wed, Aug 14, 2024 at 11:05 AM Juraj Linkeš
<juraj.linkes@pantheon.tech> wrote:
>
> Sphinx imports every Python module (through the autodoc extension)
> when generating documentation from docstrings, meaning all DTS
> dependencies, including Python version, should be satisfied. This is not
> a hard requirement, as imports from dependencies may be mocked in the
> autodoc_mock_imports autodoc option.
> In case DTS developers want to use a Sphinx installation from their
> virtualenv, we provide an optional Poetry group for doc generation. The
> pyelftools package is there so that meson picks up the correct Python
> installation, as pyelftools is required by the build system.
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
^ permalink raw reply [flat|nested] 393+ messages in thread
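The autodoc mocking mentioned in the patch description above can be sketched as a conf.py fragment. This is only an illustration of the autodoc_mock_imports mechanism; the module names listed are hypothetical examples, not the actual DTS dependency list or the project's real conf.py:

```python
# Sketch of a Sphinx conf.py fragment: with autodoc_mock_imports set,
# Sphinx can import modules for docstring extraction even when the
# listed third-party dependencies are not installed in the doc-build
# environment. The names below are illustrative assumptions only.
extensions = ["sphinx.ext.autodoc"]
autodoc_mock_imports = ["paramiko", "scapy"]
```

With this in place, `import paramiko` inside a documented module resolves to a mock object during the doc build instead of raising ImportError.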
* [PATCH v17 4/5] dts: add API doc sources
2024-08-14 15:05 ` [PATCH v17 0/5] API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-14 15:05 ` [PATCH v17 3/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-14 15:05 ` Juraj Linkeš
2024-08-14 15:05 ` [PATCH v17 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-14 15:05 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as sorting the order of
modules or changing the headers of the modules.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
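For reference, a starting point resembling these sources can be produced with an invocation along the following lines (a sketch; the exact paths and options used for the original generation are assumptions, not recorded in this patch):

```shell
# Generate per-module .rst stubs for the DTS framework package.
# --output-dir: where the .rst files are written
# --no-toc: skip the auto-generated modules.rst table of contents,
#           since the patch provides a hand-tuned index.rst instead
sphinx-apidoc --output-dir doc/api/dts --no-toc dts/framework
```

The generated stubs were then edited by hand, as the commit message notes, to reorder toctrees and rewrite module titles.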
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/framework.config.rst | 12 ++++++
doc/api/dts/framework.config.types.rst | 6 +++
doc/api/dts/framework.exception.rst | 6 +++
doc/api/dts/framework.logger.rst | 6 +++
doc/api/dts/framework.params.eal.rst | 6 +++
doc/api/dts/framework.params.rst | 14 ++++++
doc/api/dts/framework.params.testpmd.rst | 6 +++
doc/api/dts/framework.params.types.rst | 6 +++
doc/api/dts/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
doc/api/dts/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
doc/api/dts/framework.runner.rst | 6 +++
doc/api/dts/framework.settings.rst | 6 +++
doc/api/dts/framework.test_result.rst | 6 +++
doc/api/dts/framework.test_suite.rst | 6 +++
doc/api/dts/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
doc/api/dts/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
doc/api/dts/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
doc/api/dts/framework.testbed_model.rst | 26 +++++++++++
.../dts/framework.testbed_model.sut_node.rst | 6 +++
.../dts/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
doc/api/dts/framework.utils.rst | 6 +++
doc/api/dts/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
new file mode 120000
index 0000000000..5978642d76
--- /dev/null
+++ b/doc/api/dts/conf_yaml_schema.json
@@ -0,0 +1 @@
+../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/doc/api/dts/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/doc/api/dts/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.exception.rst b/doc/api/dts/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/doc/api/dts/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.logger.rst b/doc/api/dts/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/doc/api/dts/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.eal.rst b/doc/api/dts/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/doc/api/dts/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.rst b/doc/api/dts/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/doc/api/dts/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/doc/api/dts/framework.params.testpmd.rst b/doc/api/dts/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/doc/api/dts/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.types.rst b/doc/api/dts/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/doc/api/dts/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.parser.rst b/doc/api/dts/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/doc/api/dts/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.dpdk_shell.rst b/doc/api/dts/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_remote_session.rst b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_shell.rst b/doc/api/dts/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.python_shell.rst b/doc/api/dts/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.remote_session.rst b/doc/api/dts/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.rst b/doc/api/dts/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/doc/api/dts/framework.remote_session.ssh_session.rst b/doc/api/dts/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.testpmd_shell.rst b/doc/api/dts/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.runner.rst b/doc/api/dts/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/doc/api/dts/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.settings.rst b/doc/api/dts/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/doc/api/dts/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_result.rst b/doc/api/dts/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/doc/api/dts/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_suite.rst b/doc/api/dts/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/doc/api/dts/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.cpu.rst b/doc/api/dts/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.linux_session.rst b/doc/api/dts/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.node.rst b/doc/api/dts/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.os_session.rst b/doc/api/dts/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.port.rst b/doc/api/dts/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.posix_session.rst b/doc/api/dts/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.rst b/doc/api/dts/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/doc/api/dts/framework.testbed_model.sut_node.rst b/doc/api/dts/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.tg_node.rst b/doc/api/dts/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.virtual_device.rst b/doc/api/dts/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.utils.rst b/doc/api/dts/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/doc/api/dts/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/index.rst b/doc/api/dts/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/doc/api/dts/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v17 5/5] dts: add API doc generation
2024-08-14 15:05 ` [PATCH v17 0/5] API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-14 15:05 ` [PATCH v17 4/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-14 15:05 ` Juraj Linkeš
2024-08-14 18:50 ` Jeremy Spewock
` (2 more replies)
4 siblings, 3 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-14 15:05 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The tool used to generate DTS API docs is Sphinx, which is already in
use in DPDK. The same configuration is used to preserve style with one
DTS-specific configuration (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content. There's other Sphinx
configuration related to Python docstrings which doesn't affect DPDK doc
build. All new configuration is in a conditional block, applied only
when DTS API docs are built to not interfere with DPDK doc build.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
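As an illustration, a minimal Google-format docstring that sphinx.ext.napoleon can parse might look like this (the function and its contents are hypothetical, not taken from the DTS code):

```python
def connect(host: str, retries: int = 3) -> bool:
    """Open a session to the remote host.

    The Google format uses indented, named sections that
    sphinx.ext.napoleon converts to reStructuredText fields.

    Args:
        host: The hostname or IP address to connect to.
        retries: How many times to retry before giving up.

    Returns:
        True if the session was established.

    Raises:
        ValueError: If ``host`` is empty.
    """
    if not host:
        raise ValueError('host must not be empty')
    return retries >= 0
```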
There is one requirement for building DTS docs - the same Python version
as DTS or higher, because Sphinx's autodoc extension imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The missing
packages are taken from the DTS pyproject.toml file.
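Conceptually, the mocking autodoc performs is similar to registering stub modules in sys.modules; a rough stdlib-only sketch of the idea (not the actual Sphinx implementation, and 'some_missing_dep' is a made-up name):

```python
import sys
from unittest.mock import MagicMock

# Registering a stub under the missing package's name lets a later
# 'import some_missing_dep' succeed even though the real package is
# not installed.
sys.modules['some_missing_dep'] = MagicMock()

import some_missing_dep  # would normally raise ModuleNotFoundError

# Attribute access on the stub just returns more mocks, which is
# enough for Sphinx to import and introspect the documented code.
session = some_missing_dep.Session()
```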
And finally, the DTS API docs can be accessed from the DPDK API doxygen
page.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 84 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/custom.css | 1 +
doc/api/dts/meson.build | 31 +++++++++
doc/api/meson.build | 6 +-
doc/guides/conf.py | 44 +++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 ++
doc/guides/tools/dts.rst | 39 ++++++++++-
doc/meson.build | 1 +
13 files changed, 217 insertions(+), 3 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..154e9f907b 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,8 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+if 'dts' in src:
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
diff --git a/buildtools/get-dts-runtime-deps.py b/buildtools/get-dts-runtime-deps.py
new file mode 100755
index 0000000000..68244480a3
--- /dev/null
+++ b/buildtools/get-dts-runtime-deps.py
@@ -0,0 +1,84 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script exits with the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime dependencies in the DTS pyproject.toml file
+and returns a list of module names used in an import statement (import packages) that are missing.
+This function is not used when the module is run as a script and is available to be imported.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+from packaging.version import Version
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {'invoke': '>=1.3', 'paramiko': '>=2.4'}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ return {dep: deps_section[dep].strip('"\'') for dep in deps_section}
+
+
+def get_missing_imports():
+ """Get missing DTS import packages from third party libraries.
+
+ Scan the DTS pyproject.toml file for dependencies and find those that are not installed.
+ The dependencies in pyproject.toml are listed by their distribution package names,
+ but the function finds the associated import packages - those used in import statements.
+
+ The function is not used when the module is run as a script. It should be imported.
+
+ Returns:
+ A list of missing import packages.
+ """
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, req_ver in (req_deps | _EXTRA_DEPS).items():
+ try:
+ req_ver = Version(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = Version(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(req_dep.lower().replace('-', '_'))
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = Version(platform.python_version())
+ req_ver = Version(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
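Two string-handling details in the script above are easy to miss; a small stdlib-only sketch of their behavior (packaging.Version performs the real comparison in the script):

```python
# Poetry constraints such as '^3.10' or '>=2.4' carry comparison
# characters that a version parser cannot handle, so the script
# strips them before parsing.
assert '^3.10'.strip('^<>=') == '3.10'
assert '>=2.4'.strip('^<>=') == '2.4'

# Distribution package names are mapped to likely import-package
# names with the same heuristic the script uses (lowercase, dashes
# to underscores) - a heuristic, not a general rule.
def to_import_name(dist_name: str) -> str:
    return dist_name.lower().replace('-', '_')

assert to_import_name('types-PyYAML') == 'types_pyyaml'
```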
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..6b938d767c 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_runtime_deps = py3 + files('get-dts-runtime-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/dts/custom.css b/doc/api/dts/custom.css
new file mode 120000
index 0000000000..3c9480c4a0
--- /dev/null
+++ b/doc/api/dts/custom.css
@@ -0,0 +1 @@
+../../guides/custom.css
\ No newline at end of file
diff --git a/doc/api/dts/meson.build b/doc/api/dts/meson.build
new file mode 100644
index 0000000000..f338eb69bf
--- /dev/null
+++ b/doc/api/dts/meson.build
@@ -0,0 +1,31 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_runtime_deps, check: false).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
+
+extra_sphinx_args = ['-E', '-c', join_paths(doc_source_dir, 'guides')]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..71b861e42b 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,11 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+# initialize common Doxygen configuration
+cdata = configuration_data()
+
+subdir('dts')
+
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -30,7 +35,6 @@ example = custom_target('examples.dox',
build_by_default: get_option('enable_docs'))
# set up common Doxygen configuration
-cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..d7f3030838 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -58,6 +58,48 @@
("tools/devbind", "dpdk-devbind",
"check device status and bind/unbind them from drivers", "", 8)]
+# DTS API docs additional configuration
+if environ.get('DTS_BUILD'):
+ extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
+ # Napoleon enables the Google format of Python docstrings.
+ napoleon_numpy_docstring = False
+ napoleon_attr_annotations = True
+ napoleon_preprocess_types = True
+
+ # Autodoc pulls documentation from code.
+ autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+ }
+ autodoc_class_signature = 'separated'
+ autodoc_typehints = 'both'
+ autodoc_typehints_format = 'short'
+ autodoc_typehints_description_target = 'documented'
+
+ # Intersphinx allows linking to external projects, such as Python docs.
+ intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+ # DTS docstring options.
+ add_module_names = False
+ toc_object_entries = True
+ toc_object_entries_show_parents = 'hide'
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-runtime-deps').get_missing_imports()
+
# ####### :numref: fallback ########
# The following hook functions add some simple handling for the :numref:
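Note the importlib.import_module call near the end of the conf.py hunk: the script's file name contains dashes, so a plain import statement cannot load it, but import_module accepts any string. A self-contained sketch of the same pattern (the temporary module name is illustrative):

```python
import importlib
import sys
import tempfile
from pathlib import Path

# A module whose file name contains dashes cannot be loaded with a
# plain 'import' statement, but importlib.import_module will find it
# on sys.path, just as conf.py does for 'get-dts-runtime-deps'.
tmp_dir = tempfile.mkdtemp()
Path(tmp_dir, 'my-hyphenated-module.py').write_text('ANSWER = 42\n')
sys.path.append(tmp_dir)

mod = importlib.import_module('my-hyphenated-module')
```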
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..9e8929f567 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS API doc sources in ``doc/api/dts``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/doc/meson.build b/doc/meson.build
index 6f74706aa2..1e0cfa4127 100644
--- a/doc/meson.build
+++ b/doc/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_source_dir = meson.current_source_dir()
doc_targets = []
doc_target_names = []
subdir('api')
--
2.34.1
* Re: [PATCH v17 5/5] dts: add API doc generation
2024-08-14 15:05 ` [PATCH v17 5/5] dts: add API doc generation Juraj Linkeš
@ 2024-08-14 18:50 ` Jeremy Spewock
2024-08-19 14:37 ` Dean Marx
2024-08-19 17:49 ` Dean Marx
2 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2024-08-14 18:50 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Wed, Aug 14, 2024 at 11:05 AM Juraj Linkeš
<juraj.linkes@pantheon.tech> wrote:
>
> The tool used to generate DTS API docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style with one
> DTS-specific configuration (so that the DPDK docs are unchanged) that
> modifies how the sidebar displays the content. There's other Sphinx
> configuration related to Python docstrings which doesn't affect DPDK doc
> build. All new configuration is in a conditional block, applied only
> when DTS API docs are built to not interfere with DPDK doc build.
>
> Sphinx generates the documentation from Python docstrings. The docstring
> format is the Google format [0] which requires the sphinx.ext.napoleon
> extension. The other extension, sphinx.ext.intersphinx, enables linking
> to objects in external documentations, such as the Python documentation.
>
> There is one requirement for building DTS docs - the same Python version
> as DTS or higher, because Sphinx's autodoc extension imports the code.
>
> The dependencies needed to import the code don't have to be satisfied,
> as the autodoc extension allows us to mock the imports. The missing
> packages are taken from the DTS pyproject.toml file.
>
> And finally, the DTS API docs can be accessed from the DPDK API doxygen
> page.
>
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
* Re: [PATCH v17 5/5] dts: add API doc generation
2024-08-14 15:05 ` [PATCH v17 5/5] dts: add API doc generation Juraj Linkeš
2024-08-14 18:50 ` Jeremy Spewock
@ 2024-08-19 14:37 ` Dean Marx
2024-08-19 17:53 ` Dean Marx
2024-08-19 17:49 ` Dean Marx
2 siblings, 1 reply; 393+ messages in thread
From: Dean Marx @ 2024-08-19 14:37 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
I ran into some dependency issues while testing that I figured I'd mention
here. My build failed while running meson setup with the -Denable_docs=true
option since I didn't have the sphinx-build module installed, then my
compilation failed while running ninja -C because I didn't have a package
called tomli installed. I ran a final time where compilation failed again
because my system couldn't find the yaml package. It seems like nobody else
ran into this so I'm a little confused if it's something on my end, but I
tried running the optional poetry install --with docs mentioned in the
cover letter and that didn't seem to work either. I was also able to
build and compile without -Denable_docs. Thought I'd bring it up because
compilation takes a fairly long time, and if a user runs into this I could
see it being frustrating.
Reviewed-by: Dean Marx <dmarx@iol.unh.edu>
* Re: [PATCH v17 5/5] dts: add API doc generation
2024-08-19 14:37 ` Dean Marx
@ 2024-08-19 17:53 ` Dean Marx
2024-08-20 8:31 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Dean Marx @ 2024-08-19 17:53 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Mon, Aug 19, 2024 at 10:37 AM Dean Marx <dmarx@iol.unh.edu> wrote:
> I ran into some dependency issues while testing that I figured I'd mention
> here. My build failed while running meson setup with the -Denable_docs=true
> option since I didn't have the sphinx-build module installed, then my
> compilation failed while running ninja -C because I didn't have a package
> called tomli installed. I ran a final time where compilation failed again
> because my system couldn't find the yaml package. It seems like nobody else
> ran into this so I'm a little confused if it's something on my end, but I
> tried running the optional poetry install --with docs mentioned in the
> cover letter and that didn't seem to work either. I was also able to
> build and compile without -Denable_docs. Thought I'd bring it up because
> compilation takes a fairly long time, and if a user runs into this I could
> see it being frustrating.
>
> Reviewed-by: Dean Marx <dmarx@iol.unh.edu>
>
Just worked this out with Jeremy, I was running the poetry install --with
docs in the DPDK directory instead of the DTS subdirectory. However, while
that fixes almost everything, the yaml module is never imported and was
throwing errors for me until I installed pyyaml manually, so this might
have been missed in the dependency list
* Re: [PATCH v17 5/5] dts: add API doc generation
2024-08-19 17:53 ` Dean Marx
@ 2024-08-20 8:31 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-20 8:31 UTC (permalink / raw)
To: Dean Marx
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On 19. 8. 2024 19:53, Dean Marx wrote:
> On Mon, Aug 19, 2024 at 10:37 AM Dean Marx <dmarx@iol.unh.edu
> <mailto:dmarx@iol.unh.edu>> wrote:
>
> I ran into some dependency issues while testing that I figured I'd
> mention here. My build failed while running meson setup with the
> -Denable_docs=true option since I didn't have the sphinx-build
> module installed,
This one is on your end, sphinx is a dependency that must be installed.
> then my compilation failed while running ninja -C
> because I didn't have a package called tomli installed. I ran a
> final time where compilation failed again because my system couldn't
> find the yaml package.
This, on the other hand, isn't. The build should work without non-sphinx
dependencies. I'll address these two in the next version. Thanks for
catching this.
> It seems like nobody else ran into this so
> I'm a little confused if it's something on my end, but I tried
> running the optional poetry install --with docs mentioned in the
> cover letter and that didn't seem to work either. I was also able to
> build and compile without -Denable_docs. Thought I'd bring it up
> because compilation takes a fairly long time, and if a user runs
> into this I could see it being frustrating.
>
> Reviewed-by: Dean Marx <dmarx@iol.unh.edu <mailto:dmarx@iol.unh.edu>>
>
>
> Just worked this out with Jeremy, I was running the poetry install
> --with docs in the DPDK directory instead of the DTS subdirectory.
> However, while that fixes almost everything, the yaml module is never
> imported
Does this happen after running poetry install? Pyyaml is in poetry
dependencies, so this shouldn't be a problem.
But in any case, both builds (either with -Denable_docs=true or with the
ninja build target ("doc")) should work without installing DTS runtime
dependencies (the doc build dependencies are still needed). I'm going to
try building in a fresh environment to test this more thoroughly.
> and was throwing errors for me until I installed pyyaml
> manually, so this might have been missed in the dependency list
* Re: [PATCH v17 5/5] dts: add API doc generation
2024-08-14 15:05 ` [PATCH v17 5/5] dts: add API doc generation Juraj Linkeš
2024-08-14 18:50 ` Jeremy Spewock
2024-08-19 14:37 ` Dean Marx
@ 2024-08-19 17:49 ` Dean Marx
2 siblings, 0 replies; 393+ messages in thread
From: Dean Marx @ 2024-08-19 17:49 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte, dev
On Wed, Aug 14, 2024 at 11:05 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> The tool used to generate DTS API docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style with one
> DTS-specific configuration (so that the DPDK docs are unchanged) that
> modifies how the sidebar displays the content. There's other Sphinx
> configuration related to Python docstrings which doesn't affect DPDK doc
> build. All new configuration is in a conditional block, applied only
> when DTS API docs are built to not interfere with DPDK doc build.
>
> Sphinx generates the documentation from Python docstrings. The docstring
> format is the Google format [0] which requires the sphinx.ext.napoleon
> extension. The other extension, sphinx.ext.intersphinx, enables linking
> to objects in external documentations, such as the Python documentation.
>
> There is one requirement for building DTS docs - the same Python version
> as DTS or higher, because Sphinx's autodoc extension imports the code.
>
> The dependencies needed to import the code don't have to be satisfied,
> as the autodoc extension allows us to mock the imports. The missing
> packages are taken from the DTS pyproject.toml file.
>
> And finally, the DTS API docs can be accessed from the DPDK API doxygen
> page.
>
> [0]
> https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
>
Tested-by: Dean Marx <dmarx@iol.unh.edu>
* [PATCH v18 0/5] API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (17 preceding siblings ...)
2024-08-14 15:05 ` [PATCH v17 0/5] API docs generation Juraj Linkeš
@ 2024-08-20 13:18 ` Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 1/5] dts: update params and parser docstrings Juraj Linkeš
` (4 more replies)
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
19 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-20 13:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
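The shape of those hand-written .rst files follows the usual sphinx-apidoc
output; a minimal sketch of one module page (the directive options shown are
illustrative assumptions, not the exact contents of the patch) could look like:

```rst
framework.logger module
=======================

.. automodule:: framework.logger
   :members:
   :undoc-members:
   :show-inheritance:
```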
v10:
Fix dts doc generation issue: Only copy the custom css file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need
to be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
v13:
Fixed build error:
TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
v14:
Fixed install error:
ERROR: File 'dts/doc/html' could not be found
This required me to put the built docs into dts/doc which is outside the
DPDK API doc dir, resulting in linking between DPDK and DTS api docs not
working properly. I addressed this by adding a symlink to the build dir.
This way the link works after installing the docs and the symlink is
just one extra file in the build dir.
v15:
Moved DTS API sources to doc/api/dts. This simplifies a lot of things in
the build, but mainly makes a lot of sense. Now the source, build and
install paths are the same so there isn't any need for any symlinks or
other workarounds.
Also added a symlink to the custom.css file so that it works with
call-sphinx-build.py without any modifications.
v16:
Renamed the dependency Python file to get-dts-runtime-deps.py and modified
it to only get runtime dependencies. We don't need to check docs
dependencies (Sphinx) as we don't need to mock those.
Also moved all new Sphinx configuration into the DTS if branch to make
sure it won't ever affect the DPDK doc build.
v17:
Removed the dts-doc build target to mirror the functionality of using
-Denable_docs=true.
Moved DTS-specific meson build code to doc/api/dts/meson.build.
Added comments to get_missing_imports() and the top level docstring of
get-dts-runtime-deps.py to explain the function is there to be imported.
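The cover letter doesn't include the script body, but the idea behind
get_missing_imports() can be sketched as follows. This is a minimal
illustration of the approach described above, not the actual buildtools
code:

```python
import importlib.util


def get_missing_imports(packages: list[str]) -> list[str]:
    """Return the packages that cannot be imported locally.

    A package whose spec cannot be found must be mocked (via
    autodoc_mock_imports) for Sphinx's autodoc to import the DTS
    modules successfully.
    """
    return [pkg for pkg in packages if importlib.util.find_spec(pkg) is None]


# The result would then feed autodoc_mock_imports in conf.py, e.g.:
# autodoc_mock_imports = get_missing_imports(["paramiko", "yaml"])
```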
v18:
Added PyYAML to get-dts-runtime-deps.py.
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: replace the or operator in third party types
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 95 ++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/custom.css | 1 +
doc/api/dts/framework.config.rst | 12 +
doc/api/dts/framework.config.types.rst | 6 +
doc/api/dts/framework.exception.rst | 6 +
doc/api/dts/framework.logger.rst | 6 +
doc/api/dts/framework.params.eal.rst | 6 +
doc/api/dts/framework.params.rst | 14 +
doc/api/dts/framework.params.testpmd.rst | 6 +
doc/api/dts/framework.params.types.rst | 6 +
doc/api/dts/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
doc/api/dts/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
doc/api/dts/framework.runner.rst | 6 +
doc/api/dts/framework.settings.rst | 6 +
doc/api/dts/framework.test_result.rst | 6 +
doc/api/dts/framework.test_suite.rst | 6 +
doc/api/dts/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
doc/api/dts/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
doc/api/dts/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
doc/api/dts/framework.testbed_model.rst | 26 +
.../dts/framework.testbed_model.sut_node.rst | 6 +
.../dts/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
doc/api/dts/framework.utils.rst | 6 +
doc/api/dts/index.rst | 43 ++
doc/api/dts/meson.build | 31 ++
doc/api/meson.build | 6 +-
doc/guides/conf.py | 44 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/tools/dts.rst | 39 +-
doc/meson.build | 1 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
.../interactive_remote_session.py | 3 +-
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
58 files changed, 1072 insertions(+), 23 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
create mode 100644 doc/api/dts/meson.build
--
2.34.1
* [PATCH v18 1/5] dts: update params and parser docstrings
2024-08-20 13:18 ` [PATCH v18 0/5] API docs generation Juraj Linkeš
@ 2024-08-20 13:18 ` Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 2/5] dts: replace the or operator in third party types Juraj Linkeš
` (3 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-20 13:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
* [PATCH v18 2/5] dts: replace the or operator in third party types
2024-08-20 13:18 ` [PATCH v18 0/5] API docs generation Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-20 13:18 ` Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 3/5] dts: add doc generation dependencies Juraj Linkeš
` (2 subsequent siblings)
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-20 13:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
When the DTS dependencies are not installed when building DTS API
documentation, the or operator produces errors when used with types from
those libraries:
autodoc: failed to import module 'remote_session' from module
'framework'; the following exception was raised:
Traceback (most recent call last):
...
TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
The third-party type here is Transport from the paramiko library.
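The failure mode can be reproduced without Sphinx. The class below is a
minimal stand-in for an autodoc mock object (callable, but not a real
type); it is an illustrative assumption, not the actual
sphinx.ext.autodoc.mock implementation:

```python
from typing import Union


class _FakeMock:
    """Stand-in for an autodoc mock: callable, but not a type."""

    def __call__(self, *args, **kwargs):
        return self


Transport = _FakeMock()

# PEP 604 syntax evaluates Transport | None at import time, which needs
# Transport.__or__ (or None.__ror__) -- neither exists here:
try:
    Transport | None
except TypeError as err:
    print(err)

# typing.Union only requires its arguments to be callable, so it accepts
# the mock and the annotation evaluates cleanly:
annotation = Union[Transport, None]
```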
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
dts/framework/remote_session/interactive_remote_session.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 97194e6af8..4605ee14b4 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -5,6 +5,7 @@
import socket
import traceback
+from typing import Union
from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore[import-untyped]
from paramiko.ssh_exception import ( # type: ignore[import-untyped]
@@ -52,7 +53,7 @@ class InteractiveRemoteSession:
session: SSHClient
_logger: DTSLogger
_node_config: NodeConfiguration
- _transport: Transport | None
+ _transport: Union[Transport, None]
def __init__(self, node_config: NodeConfiguration, logger: DTSLogger) -> None:
"""Connect to the node during initialization.
--
2.34.1
* [PATCH v18 3/5] dts: add doc generation dependencies
2024-08-20 13:18 ` [PATCH v18 0/5] API docs generation Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-20 13:18 ` Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 4/5] dts: add API doc sources Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-20 13:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including the required Python version, would need to be
satisfied. This is not a hard requirement, as imports from dependencies
may be mocked via the autodoc_mock_imports autodoc option.
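In conf.py terms this is a one-line option; the package names below are
illustrative assumptions, as the actual list is computed by the dependency
script rather than hard-coded:

```python
# conf.py fragment (sketch): packages autodoc should mock instead of import
autodoc_mock_imports = ["paramiko", "scapy", "yaml"]
```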
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
* [PATCH v18 4/5] dts: add API doc sources
2024-08-20 13:18 ` [PATCH v18 0/5] API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-20 13:18 ` [PATCH v18 3/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-20 13:18 ` Juraj Linkeš
2024-08-20 13:18 ` [PATCH v18 5/5] dts: add API doc generation Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-20 13:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as sorting the modules
or changing their headers.
The sources included in this patch were in fact generated by said
utility, but modified to improve the look of the documentation. The
improvements are mainly in toctree definitions and the titles of the
modules/packages. These were made with specific Sphinx config options in
mind.
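For illustration, an unmodified sphinx-apidoc page for a module looks roughly like the following (a sketch; the exact output depends on the sphinx-apidoc version and flags), which the hand-tuned sources in this patch refine with custom titles and hidden toctrees:

```rst
framework.config module
=======================

.. automodule:: framework.config
   :members:
   :undoc-members:
   :show-inheritance:
```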
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/framework.config.rst | 12 ++++++
doc/api/dts/framework.config.types.rst | 6 +++
doc/api/dts/framework.exception.rst | 6 +++
doc/api/dts/framework.logger.rst | 6 +++
doc/api/dts/framework.params.eal.rst | 6 +++
doc/api/dts/framework.params.rst | 14 ++++++
doc/api/dts/framework.params.testpmd.rst | 6 +++
doc/api/dts/framework.params.types.rst | 6 +++
doc/api/dts/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
doc/api/dts/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
doc/api/dts/framework.runner.rst | 6 +++
doc/api/dts/framework.settings.rst | 6 +++
doc/api/dts/framework.test_result.rst | 6 +++
doc/api/dts/framework.test_suite.rst | 6 +++
doc/api/dts/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
doc/api/dts/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
doc/api/dts/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
doc/api/dts/framework.testbed_model.rst | 26 +++++++++++
.../dts/framework.testbed_model.sut_node.rst | 6 +++
.../dts/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
doc/api/dts/framework.utils.rst | 6 +++
doc/api/dts/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
new file mode 120000
index 0000000000..5978642d76
--- /dev/null
+++ b/doc/api/dts/conf_yaml_schema.json
@@ -0,0 +1 @@
+../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/doc/api/dts/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/doc/api/dts/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.exception.rst b/doc/api/dts/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/doc/api/dts/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.logger.rst b/doc/api/dts/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/doc/api/dts/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.eal.rst b/doc/api/dts/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/doc/api/dts/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.rst b/doc/api/dts/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/doc/api/dts/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/doc/api/dts/framework.params.testpmd.rst b/doc/api/dts/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/doc/api/dts/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.types.rst b/doc/api/dts/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/doc/api/dts/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.parser.rst b/doc/api/dts/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/doc/api/dts/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.dpdk_shell.rst b/doc/api/dts/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_remote_session.rst b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_shell.rst b/doc/api/dts/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.python_shell.rst b/doc/api/dts/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.remote_session.rst b/doc/api/dts/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.rst b/doc/api/dts/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/doc/api/dts/framework.remote_session.ssh_session.rst b/doc/api/dts/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.testpmd_shell.rst b/doc/api/dts/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.runner.rst b/doc/api/dts/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/doc/api/dts/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.settings.rst b/doc/api/dts/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/doc/api/dts/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_result.rst b/doc/api/dts/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/doc/api/dts/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_suite.rst b/doc/api/dts/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/doc/api/dts/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.cpu.rst b/doc/api/dts/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.linux_session.rst b/doc/api/dts/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.node.rst b/doc/api/dts/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.os_session.rst b/doc/api/dts/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.port.rst b/doc/api/dts/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.posix_session.rst b/doc/api/dts/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.rst b/doc/api/dts/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/doc/api/dts/framework.testbed_model.sut_node.rst b/doc/api/dts/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.tg_node.rst b/doc/api/dts/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.virtual_device.rst b/doc/api/dts/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.utils.rst b/doc/api/dts/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/doc/api/dts/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/index.rst b/doc/api/dts/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/doc/api/dts/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v18 5/5] dts: add API doc generation
2024-08-20 13:18 ` [PATCH v18 0/5] API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-20 13:18 ` [PATCH v18 4/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-20 13:18 ` Juraj Linkeš
4 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-20 13:18 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
paul.szczepanek, Luca.Vizzarro, npratte
Cc: dev, Juraj Linkeš, Dean Marx
The tool used to generate the DTS API docs is Sphinx, which is already
in use in DPDK. The same configuration is used to preserve style, with
one DTS-specific option (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content. There is further Sphinx
configuration related to Python docstrings which doesn't affect the
DPDK doc build. All new configuration is in a conditional block,
applied only when the DTS API docs are built, so as not to interfere
with the DPDK doc build.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0] which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There is one requirement for building the DTS docs: the same Python
version as DTS requires, or higher, because Sphinx's autodoc extension
imports the code.
The dependencies needed to import the code don't have to be satisfied,
as the autodoc extension allows us to mock the imports. The missing
packages are taken from the DTS pyproject.toml file.
Finally, the DTS API docs can be accessed from the DPDK API Doxygen
page.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
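The Python version gate in buildtools/get-dts-runtime-deps.py boils down to stripping poetry's comparison characters from the specifier and comparing with packaging.version. A minimal sketch of the idea (the helper name and the '^3.10' specifier are illustrative, not taken from the patch):

```python
from packaging.version import Version

def meets_requirement(installed: str, required_spec: str) -> bool:
    # Strip the comparison characters poetry uses in specifiers ('^3.10',
    # '>=2.4'), mirroring _VERSION_COMPARISON_CHARS in the script.
    required = Version(required_spec.strip('^<>='))
    return Version(installed) >= required

print(meets_requirement('3.11.4', '^3.10'))  # True: 3.11.4 >= 3.10
print(meets_requirement('3.8.10', '^3.10'))  # False: the doc build is skipped
```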
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Reviewed-by: Dean Marx <dmarx@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 95 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/custom.css | 1 +
doc/api/dts/meson.build | 31 ++++++++
doc/api/meson.build | 6 +-
doc/guides/conf.py | 44 ++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/tools/dts.rst | 39 +++++++++-
doc/meson.build | 1 +
13 files changed, 228 insertions(+), 3 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..154e9f907b 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,8 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+if 'dts' in src:
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
diff --git a/buildtools/get-dts-runtime-deps.py b/buildtools/get-dts-runtime-deps.py
new file mode 100755
index 0000000000..6f4d3def29
--- /dev/null
+++ b/buildtools/get-dts-runtime-deps.py
@@ -0,0 +1,95 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script exits with the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime dependencies in the DTS pyproject.toml file
+and returns a list of module names used in an import statement (import packages) that are missing.
+This function is not used when the module is run as a script and is available to be imported.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+from packaging.version import Version
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {
+ 'invoke': {'version': '>=1.3'},
+ 'paramiko': {'version': '>=2.4'},
+ 'PyYAML': {'version': '^6.0', 'import_package': 'yaml'}
+}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ return {dep: {'version': deps_section[dep].strip('"\'')} for dep in deps_section}
+
+
+def get_missing_imports():
+ """Get missing DTS import packages from third party libraries.
+
+ Scan the DTS pyproject.toml file for dependencies and find those that are not installed.
+ The dependencies in pyproject.toml are listed by their distribution package names,
+ but the function finds the associated import packages - those used in import statements.
+
+ The function is not used when the module is run as a script. It should be imported.
+
+ Returns:
+ A list of missing import packages.
+ """
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, dep_data in (req_deps | _EXTRA_DEPS).items():
+ req_ver = dep_data['version']
+ try:
+ import_package = dep_data['import_package']
+ except KeyError:
+ import_package = req_dep
+ import_package = import_package.lower().replace('-', '_')
+
+ try:
+ req_ver = Version(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = Version(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(import_package)
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = Version(platform.python_version())
+ req_ver = Version(python_version.strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..6b938d767c 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_runtime_deps = py3 + files('get-dts-runtime-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/dts/custom.css b/doc/api/dts/custom.css
new file mode 120000
index 0000000000..3c9480c4a0
--- /dev/null
+++ b/doc/api/dts/custom.css
@@ -0,0 +1 @@
+../../guides/custom.css
\ No newline at end of file
diff --git a/doc/api/dts/meson.build b/doc/api/dts/meson.build
new file mode 100644
index 0000000000..f338eb69bf
--- /dev/null
+++ b/doc/api/dts/meson.build
@@ -0,0 +1,31 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_runtime_deps, check: false).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
+
+extra_sphinx_args = ['-E', '-c', join_paths(doc_source_dir, 'guides')]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..71b861e42b 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,11 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+# initialize common Doxygen configuration
+cdata = configuration_data()
+
+subdir('dts')
+
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -30,7 +35,6 @@ example = custom_target('examples.dox',
build_by_default: get_option('enable_docs'))
# set up common Doxygen configuration
-cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..d7f3030838 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -58,6 +58,48 @@
("tools/devbind", "dpdk-devbind",
"check device status and bind/unbind them from drivers", "", 8)]
+# DTS API docs additional configuration
+if environ.get('DTS_BUILD'):
+ extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
+ # Napoleon enables the Google format of Python docstrings.
+ napoleon_numpy_docstring = False
+ napoleon_attr_annotations = True
+ napoleon_preprocess_types = True
+
+ # Autodoc pulls documentation from code.
+ autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+ }
+ autodoc_class_signature = 'separated'
+ autodoc_typehints = 'both'
+ autodoc_typehints_format = 'short'
+ autodoc_typehints_description_target = 'documented'
+
+ # Intersphinx allows linking to external projects, such as Python docs.
+ intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+ # DTS docstring options.
+ add_module_names = False
+ toc_object_entries = True
+ toc_object_entries_show_parents = 'hide'
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-runtime-deps').get_missing_imports()
+
# ####### :numref: fallback ########
# The following hook functions add some simple handling for the :numref:
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..9e8929f567 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS api doc sources in ``doc/api/dts``.
+
+Speaking of which, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/doc/meson.build b/doc/meson.build
index 6f74706aa2..1e0cfa4127 100644
--- a/doc/meson.build
+++ b/doc/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_source_dir = meson.current_source_dir()
doc_targets = []
doc_target_names = []
subdir('api')
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v19 0/5] DTS API docs generation
2023-11-15 13:36 ` [PATCH v1 0/2] dts: api docs generation Juraj Linkeš
` (18 preceding siblings ...)
2024-08-20 13:18 ` [PATCH v18 0/5] API docs generation Juraj Linkeš
@ 2024-08-21 15:02 ` Juraj Linkeš
2024-08-21 15:02 ` [PATCH v19 1/5] dts: update params and parser docstrings Juraj Linkeš
` (5 more replies)
19 siblings, 6 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-21 15:02 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman
Cc: dev, Juraj Linkeš
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration of the sidebar present in an if block.
DTS dependencies do not need to be installed, but there is the option to
install doc build dependencies with Poetry:
poetry install --with docs
The build itself may be run with:
meson setup <meson_build_dir> -Denable_docs=true
ninja -C <meson_build_dir>
The above will do a full DPDK build with docs. To build just docs:
meson setup <meson_build_dir>
ninja -C <meson_build_dir> doc
Python 3.10 is required to build the DTS API docs.
The patchset contains the .rst sources which Sphinx uses to generate the
html pages. These were first generated with the sphinx-apidoc utility
and modified to provide a better look. The documentation just doesn't
look that good without the modifications and there aren't enough
configuration options to achieve that without manual changes to the .rst
files. This introduces extra maintenance which involves adding new .rst
files when a new Python module is added or changing the .rst structure
if the Python directory/file structure is changed (moved, renamed
files). This small maintenance burden is outweighed by the flexibility
afforded by the ability to make manual changes to the .rst files.
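For illustration, a per-module stub produced this way is little more than a heading plus an automodule directive; the exact contents below are an assumed shape, not a verbatim file from the patchset:

```rst
framework.utils
===============

.. automodule:: framework.utils
   :members:
   :show-inheritance:
```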
v10:
Fix dts doc generation issue: Only copy the custom rss file if it exists.
v11:
Added the config option autodoc_mock_imports, which eliminates the need
for DTS dependencies. Added a script that finds out which imports need to
be added to autodoc_mock_imports. The script also checks the required
Python version for building DTS docs.
Removed tags from the two affected patches which will need to be
reviewed again.
v12:
Added paramiko to the required dependencies of get-dts-deps.py.
v13:
Fixed build error:
TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
v14:
Fixed install error:
ERROR: File 'dts/doc/html' could not be found
This required me to put the built docs into dts/doc which is outside the
DPDK API doc dir, resulting in linking between DPDK and DTS api docs not
working properly. I addressed this by adding a symlink to the build dir.
This way the link works after installing the docs and the symlink is
just one extra file in the build dir.
v15:
Moved DTS API sources to doc/api/dts. This simplifies a lot of things in
the build, but mainly makes a lot of sense. Now the source, build and
install paths are the same so there isn't any need for any symlinks or
other workarounds.
Also added a symlink to the custom.css file so that it works with
call-sphinx-build.py without any modifications.
v16:
Renamed the dependency Python file to get-dts-runtime-deps.py and modified
it to only get runtime dependencies. We don't need to check docs
dependencies (Sphinx) as we don't need to mock those.
Also moved all new Sphinx configuration into the DTS if branch to make
sure it won't ever affect the DPDK doc build.
v17:
Removed the dts-doc build target to mirror the functionality of using
-Denable_docs=true.
Moved DTS-specific meson build code to doc/api/dts/meson.build.
Added comments to get_missing_imports() and the top-level docstring of
get-dts-runtime-deps.py to explain that the function is there to be imported.
v18:
Added PyYAML to get-dts-runtime-deps.py.
v19:
Fixed a regression in get-dts-runtime-deps.py introduced in v18:
AttributeError: 'dict' object has no attribute 'strip'
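The idea behind get_missing_imports(), as described in the v11 and v16 notes, can be sketched as follows; the function body and the dependency list here are illustrative assumptions, not the actual contents of get-dts-runtime-deps.py:

```python
# Probe each DTS runtime dependency and report the absent ones, so that
# conf.py can feed the result to Sphinx's autodoc_mock_imports option.
# The dependency list is illustrative, not the real one.
import importlib.util

RUNTIME_DEPS = ["paramiko", "yaml", "scapy"]


def get_missing_imports(deps: list[str] = RUNTIME_DEPS) -> list[str]:
    """Return the top-level packages that are not importable here."""
    return [name for name in deps if importlib.util.find_spec(name) is None]
```

With this shape, the dependencies never need to be installed on the doc-building machine; only the names of the absent ones are collected.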
Juraj Linkeš (5):
dts: update params and parser docstrings
dts: replace the or operator in third party types
dts: add doc generation dependencies
dts: add API doc sources
dts: add API doc generation
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 95 ++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/custom.css | 1 +
doc/api/dts/framework.config.rst | 12 +
doc/api/dts/framework.config.types.rst | 6 +
doc/api/dts/framework.exception.rst | 6 +
doc/api/dts/framework.logger.rst | 6 +
doc/api/dts/framework.params.eal.rst | 6 +
doc/api/dts/framework.params.rst | 14 +
doc/api/dts/framework.params.testpmd.rst | 6 +
doc/api/dts/framework.params.types.rst | 6 +
doc/api/dts/framework.parser.rst | 6 +
.../framework.remote_session.dpdk_shell.rst | 6 +
...ote_session.interactive_remote_session.rst | 6 +
...ework.remote_session.interactive_shell.rst | 6 +
.../framework.remote_session.python_shell.rst | 6 +
...ramework.remote_session.remote_session.rst | 6 +
doc/api/dts/framework.remote_session.rst | 18 +
.../framework.remote_session.ssh_session.rst | 6 +
...framework.remote_session.testpmd_shell.rst | 6 +
doc/api/dts/framework.runner.rst | 6 +
doc/api/dts/framework.settings.rst | 6 +
doc/api/dts/framework.test_result.rst | 6 +
doc/api/dts/framework.test_suite.rst | 6 +
doc/api/dts/framework.testbed_model.cpu.rst | 6 +
.../framework.testbed_model.linux_session.rst | 6 +
doc/api/dts/framework.testbed_model.node.rst | 6 +
.../framework.testbed_model.os_session.rst | 6 +
doc/api/dts/framework.testbed_model.port.rst | 6 +
.../framework.testbed_model.posix_session.rst | 6 +
doc/api/dts/framework.testbed_model.rst | 26 +
.../dts/framework.testbed_model.sut_node.rst | 6 +
.../dts/framework.testbed_model.tg_node.rst | 6 +
..._generator.capturing_traffic_generator.rst | 6 +
...mework.testbed_model.traffic_generator.rst | 14 +
....testbed_model.traffic_generator.scapy.rst | 6 +
...el.traffic_generator.traffic_generator.rst | 6 +
...framework.testbed_model.virtual_device.rst | 6 +
doc/api/dts/framework.utils.rst | 6 +
doc/api/dts/index.rst | 43 ++
doc/api/dts/meson.build | 31 ++
doc/api/meson.build | 6 +-
doc/guides/conf.py | 44 +-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/tools/dts.rst | 39 +-
doc/meson.build | 1 +
dts/framework/params/__init__.py | 4 +-
dts/framework/params/eal.py | 7 +-
dts/framework/params/types.py | 3 +-
dts/framework/parser.py | 4 +-
.../interactive_remote_session.py | 3 +-
dts/poetry.lock | 521 +++++++++++++++++-
dts/pyproject.toml | 8 +
58 files changed, 1072 insertions(+), 23 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
create mode 100644 doc/api/dts/meson.build
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v19 1/5] dts: update params and parser docstrings
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
@ 2024-08-21 15:02 ` Juraj Linkeš
2024-08-21 15:02 ` [PATCH v19 2/5] dts: replace the or operator in third party types Juraj Linkeš
` (4 subsequent siblings)
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-21 15:02 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman
Cc: dev, Juraj Linkeš, luca.vizzarro
Address a few errors reported by Sphinx when generating documentation:
framework/params/__init__.py:docstring of framework.params.modify_str:3:
WARNING: Inline interpreted text or phrase reference start-string
without end-string.
framework/params/eal.py:docstring of framework.params.eal.EalParams:35:
WARNING: Definition list ends without a blank line; unexpected
unindent.
framework/params/types.py:docstring of framework.params.types:8:
WARNING: Inline strong start-string without end-string.
framework/params/types.py:docstring of framework.params.types:9:
WARNING: Inline strong start-string without end-string.
framework/parser.py:docstring of framework.parser.TextParser:33: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:43: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser:49: ERROR:
Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:8:
ERROR: Unexpected indentation.
framework/parser.py:docstring of framework.parser.TextParser.wrap:9:
WARNING: Block quote ends without a blank line; unexpected unindent.
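The first warning above is typical of the class of problems fixed here: the docstring contains a backslash that reStructuredText needs (``:attr:`FnPtr`\s``), but in a plain string the ``\s`` triggers an invalid-escape warning and confuses Sphinx's parsing of the inline role. A raw docstring makes the backslash explicit; a minimal sketch, not the actual DTS code:

```python
def compose():
    r"""Apply the given callables left to right.

    The :attr:`FnPtr`\s fed to the decorator are executed in argument-list
    order. The ``r`` prefix keeps the backslash that reST needs to stop the
    trailing ``s`` from being read as part of the :attr: role.
    """
```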
Fixes: 87ba4cdc0dbb ("dts: use Unpack for type checking and hinting")
Fixes: d70159cb62f5 ("dts: add params manipulation module")
Fixes: 967fc62b0a43 ("dts: refactor EAL parameters class")
Fixes: 818fe14e3422 ("dts: add parsing utility module")
Cc: luca.vizzarro@arm.com
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
dts/framework/params/__init__.py | 4 ++--
dts/framework/params/eal.py | 7 +++++--
dts/framework/params/types.py | 3 ++-
dts/framework/parser.py | 4 ++--
4 files changed, 11 insertions(+), 7 deletions(-)
diff --git a/dts/framework/params/__init__.py b/dts/framework/params/__init__.py
index 5a6fd93053..1ae227d7b4 100644
--- a/dts/framework/params/__init__.py
+++ b/dts/framework/params/__init__.py
@@ -53,9 +53,9 @@ def reduced_fn(value):
def modify_str(*funcs: FnPtr) -> Callable[[T], T]:
- """Class decorator modifying the ``__str__`` method with a function created from its arguments.
+ r"""Class decorator modifying the ``__str__`` method with a function created from its arguments.
- The :attr:`FnPtr`s fed to the decorator are executed from left to right in the arguments list
+ The :attr:`FnPtr`\s fed to the decorator are executed from left to right in the arguments list
order.
Args:
diff --git a/dts/framework/params/eal.py b/dts/framework/params/eal.py
index 8d7766fefc..cf1594353a 100644
--- a/dts/framework/params/eal.py
+++ b/dts/framework/params/eal.py
@@ -26,13 +26,16 @@ class EalParams(Params):
prefix: Set the file prefix string with which to start DPDK, e.g.: ``prefix="vf"``.
no_pci: Switch to disable PCI bus, e.g.: ``no_pci=True``.
vdevs: Virtual devices, e.g.::
+
vdevs=[
VirtualDevice('net_ring0'),
VirtualDevice('net_ring1')
]
+
ports: The list of ports to allow.
- other_eal_param: user defined DPDK EAL parameters, e.g.:
- ``other_eal_param='--single-file-segments'``
+ other_eal_param: user defined DPDK EAL parameters, e.g.::
+
+ ``other_eal_param='--single-file-segments'``
"""
lcore_list: LogicalCoreList | None = field(default=None, metadata=Params.short("l"))
diff --git a/dts/framework/params/types.py b/dts/framework/params/types.py
index e668f658d8..d77c4625fb 100644
--- a/dts/framework/params/types.py
+++ b/dts/framework/params/types.py
@@ -6,7 +6,8 @@
TypedDicts can be used in conjunction with Unpack and kwargs for type hinting on function calls.
Example:
- ..code:: python
+ .. code:: python
+
def create_testpmd(**kwargs: Unpack[TestPmdParamsDict]):
params = TestPmdParams(**kwargs)
"""
diff --git a/dts/framework/parser.py b/dts/framework/parser.py
index 741dfff821..7254c75b71 100644
--- a/dts/framework/parser.py
+++ b/dts/framework/parser.py
@@ -46,7 +46,7 @@ class TextParser(ABC):
Example:
The following example makes use of and demonstrates every parser function available:
- ..code:: python
+ .. code:: python
from dataclasses import dataclass, field
from enum import Enum
@@ -90,7 +90,7 @@ def wrap(parser_fn: ParserFn, wrapper_fn: Callable) -> ParserFn:
"""Makes a wrapped parser function.
`parser_fn` is called and if a non-None value is returned, `wrapper_function` is called with
- it. Otherwise the function returns early with None. In pseudo-code:
+ it. Otherwise the function returns early with None. In pseudo-code::
intermediate_value := parser_fn(input)
if intermediary_value is None then
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v19 2/5] dts: replace the or operator in third party types
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
2024-08-21 15:02 ` [PATCH v19 1/5] dts: update params and parser docstrings Juraj Linkeš
@ 2024-08-21 15:02 ` Juraj Linkeš
2024-09-02 10:56 ` Luca Vizzarro
2024-08-21 15:02 ` [PATCH v19 3/5] dts: add doc generation dependencies Juraj Linkeš
` (3 subsequent siblings)
5 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-21 15:02 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman
Cc: dev, Juraj Linkeš
When the DTS dependencies are not installed when building DTS API
documentation, the or operator produces errors when used with types from
those libraries:
autodoc: failed to import module 'remote_session' from module
'framework'; the following exception was raised:
Traceback (most recent call last):
...
TypeError: unsupported operand type(s) for |: 'Transport' and 'NoneType'
The third-party type here is Transport from the paramiko library.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
dts/framework/remote_session/interactive_remote_session.py | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/dts/framework/remote_session/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
index 97194e6af8..4605ee14b4 100644
--- a/dts/framework/remote_session/interactive_remote_session.py
+++ b/dts/framework/remote_session/interactive_remote_session.py
@@ -5,6 +5,7 @@
import socket
import traceback
+from typing import Union
from paramiko import AutoAddPolicy, SSHClient, Transport # type: ignore[import-untyped]
from paramiko.ssh_exception import ( # type: ignore[import-untyped]
@@ -52,7 +53,7 @@ class InteractiveRemoteSession:
session: SSHClient
_logger: DTSLogger
_node_config: NodeConfiguration
- _transport: Transport | None
+ _transport: Union[Transport, None]
def __init__(self, node_config: NodeConfiguration, logger: DTSLogger) -> None:
"""Connect to the node during initialization.
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v19 3/5] dts: add doc generation dependencies
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
2024-08-21 15:02 ` [PATCH v19 1/5] dts: update params and parser docstrings Juraj Linkeš
2024-08-21 15:02 ` [PATCH v19 2/5] dts: replace the or operator in third party types Juraj Linkeš
@ 2024-08-21 15:02 ` Juraj Linkeš
2024-09-02 10:56 ` Luca Vizzarro
2024-08-21 15:02 ` [PATCH v19 4/5] dts: add API doc sources Juraj Linkeš
` (2 subsequent siblings)
5 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-21 15:02 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman
Cc: dev, Juraj Linkeš
Sphinx imports every Python module (through the autodoc extension)
when generating documentation from docstrings, meaning all DTS
dependencies, including the Python version, should be satisfied. This is not
a hard requirement, as imports from dependencies may be mocked in the
autodoc_mock_imports autodoc option.
In case DTS developers want to use a Sphinx installation from their
virtualenv, we provide an optional Poetry group for doc generation. The
pyelftools package is there so that meson picks up the correct Python
installation, as pyelftools is required by the build system.
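In conf.py terms, the mocking amounts to listing the absent top-level packages so that autodoc can import every DTS module without them; a sketch with assumed package names:

```python
# Sphinx conf.py fragment (sketch): packages listed here are replaced with
# mock objects during import, so the docs build without DTS dependencies.
# In the patchset the list is computed, not hardcoded; names are illustrative.
autodoc_mock_imports = ["paramiko", "scapy", "yaml"]
```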
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
---
dts/poetry.lock | 521 +++++++++++++++++++++++++++++++++++++++++++--
dts/pyproject.toml | 8 +
2 files changed, 517 insertions(+), 12 deletions(-)
diff --git a/dts/poetry.lock b/dts/poetry.lock
index 5f8fa03933..2dd8bad498 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
# This file is automatically @generated by Poetry 1.8.2 and should not be changed by hand.
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+ {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
[[package]]
name = "attrs"
version = "23.1.0"
@@ -18,6 +29,23 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
+[[package]]
+name = "babel"
+version = "2.13.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Babel-2.13.1-py3-none-any.whl", hash = "sha256:7077a4984b02b6727ac10f1f7294484f737443d7e2e66c5e4380e41a3ae0b4ed"},
+ {file = "Babel-2.13.1.tar.gz", hash = "sha256:33e0952d7dd6374af8dbf6768cc4ddf3ccfefc244f9986d4074704f2fbd18900"},
+]
+
+[package.dependencies]
+setuptools = {version = "*", markers = "python_version >= \"3.12\""}
+
+[package.extras]
+dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"]
+
[[package]]
name = "bcrypt"
version = "4.0.1"
@@ -86,6 +114,17 @@ d = ["aiohttp (>=3.7.4)"]
jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
uvloop = ["uvloop (>=0.15.2)"]
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+ {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
[[package]]
name = "cffi"
version = "1.15.1"
@@ -162,6 +201,105 @@ files = [
[package.dependencies]
pycparser = "*"
+[[package]]
+name = "charset-normalizer"
+version = "3.3.2"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+ {file = "charset-normalizer-3.3.2.tar.gz", hash = "sha256:f30c3cb33b24454a82faecaf01b19c18562b1e89558fb6c56de4d9118a032fd5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:25baf083bf6f6b341f4121c2f3c548875ee6f5339300e08be3f2b2ba1721cdd3"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:06435b539f889b1f6f4ac1758871aae42dc3a8c0e24ac9e60c2384973ad73027"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9063e24fdb1e498ab71cb7419e24622516c4a04476b17a2dab57e8baa30d6e03"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6897af51655e3691ff853668779c7bad41579facacf5fd7253b0133308cf000d"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d3193f4a680c64b4b6a9115943538edb896edc190f0b222e73761716519268e"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cd70574b12bb8a4d2aaa0094515df2463cb429d8536cfb6c7ce983246983e5a6"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8465322196c8b4d7ab6d1e049e4c5cb460d0394da4a27d23cc242fbf0034b6b5"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a9a8e9031d613fd2009c182b69c7b2c1ef8239a0efb1df3f7c8da66d5dd3d537"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:beb58fe5cdb101e3a055192ac291b7a21e3b7ef4f67fa1d74e331a7f2124341c"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:e06ed3eb3218bc64786f7db41917d4e686cc4856944f53d5bdf83a6884432e12"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:2e81c7b9c8979ce92ed306c249d46894776a909505d8f5a4ba55b14206e3222f"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:572c3763a264ba47b3cf708a44ce965d98555f618ca42c926a9c1616d8f34269"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fd1abc0d89e30cc4e02e4064dc67fcc51bd941eb395c502aac3ec19fab46b519"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win32.whl", hash = "sha256:3d47fa203a7bd9c5b6cee4736ee84ca03b8ef23193c0d1ca99b5089f72645c73"},
+ {file = "charset_normalizer-3.3.2-cp310-cp310-win_amd64.whl", hash = "sha256:10955842570876604d404661fbccbc9c7e684caf432c09c715ec38fbae45ae09"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:802fe99cca7457642125a8a88a084cef28ff0cf9407060f7b93dca5aa25480db"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:573f6eac48f4769d667c4442081b1794f52919e7edada77495aaed9236d13a96"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:549a3a73da901d5bc3ce8d24e0600d1fa85524c10287f6004fbab87672bf3e1e"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f27273b60488abe721a075bcca6d7f3964f9f6f067c8c4c605743023d7d3944f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ceae2f17a9c33cb48e3263960dc5fc8005351ee19db217e9b1bb15d28c02574"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:65f6f63034100ead094b8744b3b97965785388f308a64cf8d7c34f2f2e5be0c4"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:753f10e867343b4511128c6ed8c82f7bec3bd026875576dfd88483c5c73b2fd8"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4a78b2b446bd7c934f5dcedc588903fb2f5eec172f3d29e52a9096a43722adfc"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e537484df0d8f426ce2afb2d0f8e1c3d0b114b83f8850e5f2fbea0e797bd82ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:eb6904c354526e758fda7167b33005998fb68c46fbc10e013ca97f21ca5c8887"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:deb6be0ac38ece9ba87dea880e438f25ca3eddfac8b002a2ec3d9183a454e8ae"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:4ab2fe47fae9e0f9dee8c04187ce5d09f48eabe611be8259444906793ab7cbce"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:80402cd6ee291dcb72644d6eac93785fe2c8b9cb30893c1af5b8fdd753b9d40f"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win32.whl", hash = "sha256:7cd13a2e3ddeed6913a65e66e94b51d80a041145a026c27e6bb76c31a853c6ab"},
+ {file = "charset_normalizer-3.3.2-cp311-cp311-win_amd64.whl", hash = "sha256:663946639d296df6a2bb2aa51b60a2454ca1cb29835324c640dafb5ff2131a77"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:0b2b64d2bb6d3fb9112bafa732def486049e63de9618b5843bcdd081d8144cd8"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:ddbb2551d7e0102e7252db79ba445cdab71b26640817ab1e3e3648dad515003b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:55086ee1064215781fff39a1af09518bc9255b50d6333f2e4c74ca09fac6a8f6"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8f4a014bc36d3c57402e2977dada34f9c12300af536839dc38c0beab8878f38a"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a10af20b82360ab00827f916a6058451b723b4e65030c5a18577c8b2de5b3389"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8d756e44e94489e49571086ef83b2bb8ce311e730092d2c34ca8f7d925cb20aa"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:90d558489962fd4918143277a773316e56c72da56ec7aa3dc3dbbe20fdfed15b"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ac7ffc7ad6d040517be39eb591cac5ff87416c2537df6ba3cba3bae290c0fed"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:7ed9e526742851e8d5cc9e6cf41427dfc6068d4f5a3bb03659444b4cabf6bc26"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:8bdb58ff7ba23002a4c5808d608e4e6c687175724f54a5dade5fa8c67b604e4d"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_ppc64le.whl", hash = "sha256:6b3251890fff30ee142c44144871185dbe13b11bab478a88887a639655be1068"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_s390x.whl", hash = "sha256:b4a23f61ce87adf89be746c8a8974fe1c823c891d8f86eb218bb957c924bb143"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:efcb3f6676480691518c177e3b465bcddf57cea040302f9f4e6e191af91174d4"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win32.whl", hash = "sha256:d965bba47ddeec8cd560687584e88cf699fd28f192ceb452d1d7ee807c5597b7"},
+ {file = "charset_normalizer-3.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:96b02a3dc4381e5494fad39be677abcb5e6634bf7b4fa83a6dd3112607547001"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:95f2a5796329323b8f0512e09dbb7a1860c46a39da62ecb2324f116fa8fdc85c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c002b4ffc0be611f0d9da932eb0f704fe2602a9a949d1f738e4c34c75b0863d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a981a536974bbc7a512cf44ed14938cf01030a99e9b3a06dd59578882f06f985"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3287761bc4ee9e33561a7e058c72ac0938c4f57fe49a09eae428fd88aafe7bb6"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:42cb296636fcc8b0644486d15c12376cb9fa75443e00fb25de0b8602e64c1714"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0a55554a2fa0d408816b3b5cedf0045f4b8e1a6065aec45849de2d6f3f8e9786"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:c083af607d2515612056a31f0a8d9e0fcb5876b7bfc0abad3ecd275bc4ebc2d5"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:87d1351268731db79e0f8e745d92493ee2841c974128ef629dc518b937d9194c"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:bd8f7df7d12c2db9fab40bdd87a7c09b1530128315d047a086fa3ae3435cb3a8"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:c180f51afb394e165eafe4ac2936a14bee3eb10debc9d9e4db8958fe36afe711"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:8c622a5fe39a48f78944a87d4fb8a53ee07344641b0562c540d840748571b811"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win32.whl", hash = "sha256:db364eca23f876da6f9e16c9da0df51aa4f104a972735574842618b8c6d999d4"},
+ {file = "charset_normalizer-3.3.2-cp37-cp37m-win_amd64.whl", hash = "sha256:86216b5cee4b06df986d214f664305142d9c76df9b6512be2738aa72a2048f99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:6463effa3186ea09411d50efc7d85360b38d5f09b870c48e4600f63af490e56a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:6c4caeef8fa63d06bd437cd4bdcf3ffefe6738fb1b25951440d80dc7df8c03ac"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:37e55c8e51c236f95b033f6fb391d7d7970ba5fe7ff453dad675e88cf303377a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb69256e180cb6c8a894fee62b3afebae785babc1ee98b81cdf68bbca1987f33"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ae5f4161f18c61806f411a13b0310bea87f987c7d2ecdbdaad0e94eb2e404238"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b2b0a0c0517616b6869869f8c581d4eb2dd83a4d79e0ebcb7d373ef9956aeb0a"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:45485e01ff4d3630ec0d9617310448a8702f70e9c01906b0d0118bdf9d124cf2"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eb00ed941194665c332bf8e078baf037d6c35d7c4f3102ea2d4f16ca94a26dc8"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2127566c664442652f024c837091890cb1942c30937add288223dc895793f898"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a50aebfa173e157099939b17f18600f72f84eed3049e743b68ad15bd69b6bf99"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:4d0d1650369165a14e14e1e47b372cfcb31d6ab44e6e33cb2d4e57265290044d"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:923c0c831b7cfcb071580d3f46c4baf50f174be571576556269530f4bbd79d04"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:06a81e93cd441c56a9b65d8e1d043daeb97a3d0856d177d5c90ba85acb3db087"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win32.whl", hash = "sha256:6ef1d82a3af9d3eecdba2321dc1b3c238245d890843e040e41e470ffa64c3e25"},
+ {file = "charset_normalizer-3.3.2-cp38-cp38-win_amd64.whl", hash = "sha256:eb8821e09e916165e160797a6c17edda0679379a4be5c716c260e836e122f54b"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:c235ebd9baae02f1b77bcea61bce332cb4331dc3617d254df3323aa01ab47bd4"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5b4c145409bef602a690e7cfad0a15a55c13320ff7a3ad7ca59c13bb8ba4d45d"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:68d1f8a9e9e37c1223b656399be5d6b448dea850bed7d0f87a8311f1ff3dabb0"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22afcb9f253dac0696b5a4be4a1c0f8762f8239e21b99680099abd9b2b1b2269"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e27ad930a842b4c5eb8ac0016b0a54f5aebbe679340c26101df33424142c143c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f79682fbe303db92bc2b1136016a38a42e835d932bab5b3b1bfcfbf0640e519"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b261ccdec7821281dade748d088bb6e9b69e6d15b30652b74cbbac25e280b796"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:122c7fa62b130ed55f8f285bfd56d5f4b4a5b503609d181f9ad85e55c89f4185"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d0eccceffcb53201b5bfebb52600a5fb483a20b61da9dbc885f8b103cbe7598c"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:9f96df6923e21816da7e0ad3fd47dd8f94b2a5ce594e00677c0013018b813458"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:7f04c839ed0b6b98b1a7501a002144b76c18fb1c1850c8b98d458ac269e26ed2"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:34d1c8da1e78d2e001f363791c98a272bb734000fcef47a491c1e3b0505657a8"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ff8fa367d09b717b2a17a052544193ad76cd49979c805768879cb63d9ca50561"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win32.whl", hash = "sha256:aed38f6e4fb3f5d6bf81bfa990a07806be9d83cf7bacef998ab1a9bd660a581f"},
+ {file = "charset_normalizer-3.3.2-cp39-cp39-win_amd64.whl", hash = "sha256:b01b88d45a6fcb69667cd6d2f7a9aeb4bf53760d7fc536bf679ec94fe9f3ff3d"},
+ {file = "charset_normalizer-3.3.2-py3-none-any.whl", hash = "sha256:3e4d1f6587322d2788836a99c69062fbb091331ec940e02d12d179c1d53e25fc"},
+]
+
[[package]]
name = "click"
version = "8.1.6"
@@ -232,6 +370,17 @@ ssh = ["bcrypt (>=3.1.5)"]
test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
test-randomorder = ["pytest-randomly"]
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+ {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+ {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
[[package]]
name = "fabric"
version = "2.7.1"
@@ -252,6 +401,28 @@ pathlib2 = "*"
pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
testing = ["mock (>=2.0.0,<3.0)"]
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+ {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+ {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+ {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
[[package]]
name = "invoke"
version = "1.7.3"
@@ -280,6 +451,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
plugins = ["setuptools"]
requirements-deprecated-finder = ["pip-api", "pipreqs"]
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+ {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
[[package]]
name = "jsonpatch"
version = "1.33"
@@ -340,6 +528,65 @@ files = [
[package.dependencies]
referencing = ">=0.28.0"
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+ {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+ {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+ {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+ {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+ {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+ {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
[[package]]
name = "mccabe"
version = "0.7.0"
@@ -409,6 +656,17 @@ files = [
{file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
]
+[[package]]
+name = "packaging"
+version = "23.2"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "packaging-23.2-py3-none-any.whl", hash = "sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7"},
+ {file = "packaging-23.2.tar.gz", hash = "sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5"},
+]
+
[[package]]
name = "paramiko"
version = "3.2.0"
@@ -509,6 +767,17 @@ snowballstemmer = "*"
[package.extras]
toml = ["toml"]
+[[package]]
+name = "pyelftools"
+version = "0.31"
+description = "Library for analyzing ELF files and DWARF debugging information"
+optional = false
+python-versions = "*"
+files = [
+ {file = "pyelftools-0.31-py3-none-any.whl", hash = "sha256:f52de7b3c7e8c64c8abc04a79a1cf37ac5fb0b8a49809827130b858944840607"},
+ {file = "pyelftools-0.31.tar.gz", hash = "sha256:c774416b10310156879443b81187d182d8d9ee499660380e645918b50bc88f99"},
+]
+
[[package]]
name = "pyflakes"
version = "2.5.0"
@@ -520,6 +789,20 @@ files = [
{file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
]
+[[package]]
+name = "pygments"
+version = "2.16.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "Pygments-2.16.1-py3-none-any.whl", hash = "sha256:13fc09fa63bc8d8671a6d247e1eb303c4b343eaee81d861f3404db2935653692"},
+ {file = "Pygments-2.16.1.tar.gz", hash = "sha256:1daff0494820c69bc8941e407aa20f577374ee88364ee10a98fdbe0aece96e29"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
[[package]]
name = "pylama"
version = "8.4.1"
@@ -585,7 +868,6 @@ files = [
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:69b023b2b4daa7548bcfbd4aa3da05b3a74b772db9e23b982788168117739938"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e0b275a9ecc9c0c0c07b4b90ba548307583c125f54d5b6946cfee6360c733d"},
{file = "PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ba336e390cd8e4d1739f42dfe9bb83a3cc2e80f567d8805e11b46f4a943f5515"},
- {file = "PyYAML-6.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:326c013efe8048858a6d312ddd31d56e468118ad4cdeda36c719bf5bb6192290"},
{file = "PyYAML-6.0.1-cp310-cp310-win32.whl", hash = "sha256:bd4af7373a854424dabd882decdc5579653d7868b8fb26dc7d0e99f823aa5924"},
{file = "PyYAML-6.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:fd1592b3fdf65fff2ad0004b5e363300ef59ced41c2e6b3a99d4089fa8c5435d"},
{file = "PyYAML-6.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6965a7bc3cf88e5a1c3bd2e0b5c22f8d677dc88a455344035f03399034eb3007"},
@@ -593,16 +875,8 @@ files = [
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:42f8152b8dbc4fe7d96729ec2b99c7097d656dc1213a3229ca5383f973a5ed6d"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:062582fca9fabdd2c8b54a3ef1c978d786e0f6b3a1510e0ac93ef59e0ddae2bc"},
{file = "PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d2b04aac4d386b172d5b9692e2d2da8de7bfb6c387fa4f801fbf6fb2e6ba4673"},
- {file = "PyYAML-6.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:e7d73685e87afe9f3b36c799222440d6cf362062f78be1013661b00c5c6f678b"},
{file = "PyYAML-6.0.1-cp311-cp311-win32.whl", hash = "sha256:1635fd110e8d85d55237ab316b5b011de701ea0f29d07611174a1b42f1444741"},
{file = "PyYAML-6.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:bf07ee2fef7014951eeb99f56f39c9bb4af143d8aa3c21b1677805985307da34"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:855fb52b0dc35af121542a76b9a84f8d1cd886ea97c84703eaa6d88e37a2ad28"},
- {file = "PyYAML-6.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40df9b996c2b73138957fe23a16a4f0ba614f4c0efce1e9406a184b6d07fa3a9"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a08c6f0fe150303c1c6b71ebcd7213c2858041a7e01975da3a99aed1e7a378ef"},
- {file = "PyYAML-6.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c22bec3fbe2524cde73d7ada88f6566758a8f7227bfbf93a408a9d86bcc12a0"},
- {file = "PyYAML-6.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8d4e9c88387b0f5c7d5f281e55304de64cf7f9c0021a3525bd3b1c542da3b0e4"},
- {file = "PyYAML-6.0.1-cp312-cp312-win32.whl", hash = "sha256:d483d2cdf104e7c9fa60c544d92981f12ad66a457afae824d146093b8c294c54"},
- {file = "PyYAML-6.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:0d3304d8c0adc42be59c5f8a4d9e3d7379e6955ad754aa9d6ab7a398b59dd1df"},
{file = "PyYAML-6.0.1-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:50550eb667afee136e9a77d6dc71ae76a44df8b3e51e41b77f6de2932bfe0f47"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fe35611261b29bd1de0070f0b2f47cb6ff71fa6595c077e42bd0c419fa27b98"},
{file = "PyYAML-6.0.1-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:704219a11b772aea0d8ecd7058d0082713c3562b4e271b849ad7dc4a5c90c13c"},
@@ -619,7 +893,6 @@ files = [
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a0cd17c15d3bb3fa06978b4e8958dcdc6e0174ccea823003a106c7d4d7899ac5"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:28c119d996beec18c05208a8bd78cbe4007878c6dd15091efb73a30e90539696"},
{file = "PyYAML-6.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e07cbde391ba96ab58e532ff4803f79c4129397514e1413a7dc761ccd755735"},
- {file = "PyYAML-6.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:49a183be227561de579b4a36efbb21b3eab9651dd81b1858589f796549873dd6"},
{file = "PyYAML-6.0.1-cp38-cp38-win32.whl", hash = "sha256:184c5108a2aca3c5b3d3bf9395d50893a7ab82a38004c8f61c258d4428e80206"},
{file = "PyYAML-6.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:1e2722cc9fbb45d9b87631ac70924c11d3a401b2d7f410cc0e3bbf249f2dca62"},
{file = "PyYAML-6.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9eb6caa9a297fc2c2fb8862bc5370d0303ddba53ba97e71f08023b6cd73d16a8"},
@@ -627,7 +900,6 @@ files = [
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5773183b6446b2c99bb77e77595dd486303b4faab2b086e7b17bc6bef28865f6"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b786eecbdf8499b9ca1d697215862083bd6d2a99965554781d0d8d1ad31e13a0"},
{file = "PyYAML-6.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc1bf2925a1ecd43da378f4db9e4f799775d6367bdb94671027b73b393a7c42c"},
- {file = "PyYAML-6.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:04ac92ad1925b2cff1db0cfebffb6ffc43457495c9b3c39d3fcae417d7125dc5"},
{file = "PyYAML-6.0.1-cp39-cp39-win32.whl", hash = "sha256:faca3bdcf85b2fc05d06ff3fbc1f83e1391b3e724afa3feba7d13eeab355484c"},
{file = "PyYAML-6.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:510c9deebc5c0225e8c96813043e62b680ba2f9c50a08d3724c7f28a747d1486"},
{file = "PyYAML-6.0.1.tar.gz", hash = "sha256:bfdf460b1736c775f2ba9f6a92bca30bc2095067b8a9d77876d1fad6cc3b4a43"},
@@ -648,6 +920,27 @@ files = [
attrs = ">=22.2.0"
rpds-py = ">=0.7.0"
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+ {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
[[package]]
name = "rpds-py"
version = "0.9.2"
@@ -769,6 +1062,22 @@ basic = ["ipython"]
complete = ["cryptography (>=2.0)", "ipython", "matplotlib", "pyx"]
docs = ["sphinx (>=3.0.0)", "sphinx_rtd_theme (>=0.4.3)", "tox (>=3.0.0)"]
+[[package]]
+name = "setuptools"
+version = "68.2.2"
+description = "Easily download, build, install, upgrade, and uninstall Python packages"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "setuptools-68.2.2-py3-none-any.whl", hash = "sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a"},
+ {file = "setuptools-68.2.2.tar.gz", hash = "sha256:4ac1475276d2f1c48684874089fefcd83bd7162ddaafb81fac866ba0db282a87"},
+]
+
+[package.extras]
+docs = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-hoverxref (<2)", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier"]
+testing = ["build[virtualenv]", "filelock (>=3.4.0)", "flake8-2020", "ini2toml[lite] (>=0.9)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "pip (>=19.1)", "pytest (>=6)", "pytest-black (>=0.3.7)", "pytest-checkdocs (>=2.4)", "pytest-cov", "pytest-enabler (>=2.2)", "pytest-mypy (>=0.9.1)", "pytest-perf", "pytest-ruff", "pytest-timeout", "pytest-xdist", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel"]
+testing-integration = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "packaging (>=23.1)", "pytest", "pytest-enabler", "pytest-xdist", "tomli", "virtualenv (>=13.0.0)", "wheel"]
+
[[package]]
name = "six"
version = "1.16.0"
@@ -791,6 +1100,177 @@ files = [
{file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
]
+[[package]]
+name = "sphinx"
+version = "7.0.0"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+ {file = "Sphinx-7.0.0.tar.gz", hash = "sha256:283c44aa28922bb4223777b44ac0d59af50a279ac7690dfe945bb2b9575dc41b"},
+ {file = "sphinx-7.0.0-py3-none-any.whl", hash = "sha256:3cfc1c6756ef1b132687b813ec6ea2214cb7a7e5d1dcb2772006cb895a0fa469"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "2.0.0"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = ">=3.6"
+files = [
+ {file = "sphinx_rtd_theme-2.0.0-py2.py3-none-any.whl", hash = "sha256:ec93d0856dc280cf3aee9a4c9807c60e027c7f7b461b77aeffed682e68f0e586"},
+ {file = "sphinx_rtd_theme-2.0.0.tar.gz", hash = "sha256:bd5d7b80622406762073a04ef8fadc5f9151261563d47027de09910ce03afe6b"},
+]
+
+[package.dependencies]
+docutils = "<0.21"
+sphinx = ">=5,<8"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.7"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_applehelp-1.0.7-py3-none-any.whl", hash = "sha256:094c4d56209d1734e7d252f6e0b3ccc090bd52ee56807a5d9315b19c122ab15d"},
+ {file = "sphinxcontrib_applehelp-1.0.7.tar.gz", hash = "sha256:39fdc8d762d33b01a7d8f026a3b7d71563ea3b72787d5f00ad8465bd9d6dfbfa"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.5"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_devhelp-1.0.5-py3-none-any.whl", hash = "sha256:fe8009aed765188f08fcaadbb3ea0d90ce8ae2d76710b7e29ea7d047177dae2f"},
+ {file = "sphinxcontrib_devhelp-1.0.5.tar.gz", hash = "sha256:63b41e0d38207ca40ebbeabcf4d8e51f76c03e78cd61abe118cf4435c73d4212"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.4"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_htmlhelp-2.0.4-py3-none-any.whl", hash = "sha256:8001661c077a73c29beaf4a79968d0726103c5605e27db92b9ebed8bab1359e9"},
+ {file = "sphinxcontrib_htmlhelp-2.0.4.tar.gz", hash = "sha256:6c26a118a05b76000738429b724a0568dbde5b72391a688577da08f11891092a"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+ {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+ {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+ {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+ {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.6"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_qthelp-1.0.6-py3-none-any.whl", hash = "sha256:bf76886ee7470b934e363da7a954ea2825650013d367728588732c7350f49ea4"},
+ {file = "sphinxcontrib_qthelp-1.0.6.tar.gz", hash = "sha256:62b9d1a186ab7f5ee3356d906f648cacb7a6bdb94d201ee7adf26db55092982d"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.9"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)"
+optional = false
+python-versions = ">=3.9"
+files = [
+ {file = "sphinxcontrib_serializinghtml-1.1.9-py3-none-any.whl", hash = "sha256:9b36e503703ff04f20e9675771df105e58aa029cfcbc23b8ed716019b7416ae1"},
+ {file = "sphinxcontrib_serializinghtml-1.1.9.tar.gz", hash = "sha256:0c64ff898339e1fac29abd2bf5f11078f3ec413cfe9c046d3120d7ca65530b54"},
+]
+
+[package.dependencies]
+Sphinx = ">=5"
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
[[package]]
name = "toml"
version = "0.10.2"
@@ -835,6 +1315,23 @@ files = [
{file = "typing_extensions-4.11.0.tar.gz", hash = "sha256:83f085bd5ca59c80295fc2a82ab5dac679cbe02b9f33f7d83af68e241bea51b0"},
]
+[[package]]
+name = "urllib3"
+version = "2.0.7"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+ {file = "urllib3-2.0.7-py3-none-any.whl", hash = "sha256:fdb6d215c776278489906c2f8916e6e7d4f5a9b602ccbcfdf7f016fc8da0596e"},
+ {file = "urllib3-2.0.7.tar.gz", hash = "sha256:c97dfde1f7bd43a71c8d2a58e369e9b2bf692d1334ea9f9cae55add7d0dd0f84"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
[[package]]
name = "warlock"
version = "2.0.1"
@@ -853,4 +1350,4 @@ jsonschema = ">=4,<5"
[metadata]
lock-version = "2.0"
python-versions = "^3.10"
-content-hash = "4af4dd49c59e5bd6ed99e8c19c6756aaf00125339d26cfad2ef98551dc765f8b"
+content-hash = "6db17f96cb31fb463b0b0a31dff9c02aa72641e0bffd8a610033fe2324006c43"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 0b9b09805a..38281f0e39 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -35,6 +35,14 @@ pylama = "^8.4.1"
pyflakes = "^2.5.0"
toml = "^0.10.2"
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<=7"
+sphinx-rtd-theme = ">=1.2.2"
+pyelftools = "^0.31"
+
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* [PATCH v19 4/5] dts: add API doc sources
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
` (2 preceding siblings ...)
2024-08-21 15:02 ` [PATCH v19 3/5] dts: add doc generation dependencies Juraj Linkeš
@ 2024-08-21 15:02 ` Juraj Linkeš
2024-08-21 15:02 ` [PATCH v19 5/5] dts: add API doc generation Juraj Linkeš
2024-09-18 7:38 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-21 15:02 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman
Cc: dev, Juraj Linkeš, Luca Vizzarro
These sources could be generated with the sphinx-apidoc utility, but
that doesn't give us enough flexibility, such as controlling the order
of modules or changing the module headers.
The sources included in this patch were in fact generated by said
utility, but then modified to improve the look of the documentation.
The improvements are mainly in the toctree definitions and in the
titles of the modules/packages, and were made with specific Sphinx
config options in mind.
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Tested-by: Luca Vizzarro <luca.vizzarro@arm.com>
---
doc/api/dts/conf_yaml_schema.json | 1 +
doc/api/dts/framework.config.rst | 12 ++++++
doc/api/dts/framework.config.types.rst | 6 +++
doc/api/dts/framework.exception.rst | 6 +++
doc/api/dts/framework.logger.rst | 6 +++
doc/api/dts/framework.params.eal.rst | 6 +++
doc/api/dts/framework.params.rst | 14 ++++++
doc/api/dts/framework.params.testpmd.rst | 6 +++
doc/api/dts/framework.params.types.rst | 6 +++
doc/api/dts/framework.parser.rst | 6 +++
.../framework.remote_session.dpdk_shell.rst | 6 +++
...ote_session.interactive_remote_session.rst | 6 +++
...ework.remote_session.interactive_shell.rst | 6 +++
.../framework.remote_session.python_shell.rst | 6 +++
...ramework.remote_session.remote_session.rst | 6 +++
doc/api/dts/framework.remote_session.rst | 18 ++++++++
.../framework.remote_session.ssh_session.rst | 6 +++
...framework.remote_session.testpmd_shell.rst | 6 +++
doc/api/dts/framework.runner.rst | 6 +++
doc/api/dts/framework.settings.rst | 6 +++
doc/api/dts/framework.test_result.rst | 6 +++
doc/api/dts/framework.test_suite.rst | 6 +++
doc/api/dts/framework.testbed_model.cpu.rst | 6 +++
.../framework.testbed_model.linux_session.rst | 6 +++
doc/api/dts/framework.testbed_model.node.rst | 6 +++
.../framework.testbed_model.os_session.rst | 6 +++
doc/api/dts/framework.testbed_model.port.rst | 6 +++
.../framework.testbed_model.posix_session.rst | 6 +++
doc/api/dts/framework.testbed_model.rst | 26 +++++++++++
.../dts/framework.testbed_model.sut_node.rst | 6 +++
.../dts/framework.testbed_model.tg_node.rst | 6 +++
..._generator.capturing_traffic_generator.rst | 6 +++
...mework.testbed_model.traffic_generator.rst | 14 ++++++
....testbed_model.traffic_generator.scapy.rst | 6 +++
...el.traffic_generator.traffic_generator.rst | 6 +++
...framework.testbed_model.virtual_device.rst | 6 +++
doc/api/dts/framework.utils.rst | 6 +++
doc/api/dts/index.rst | 43 +++++++++++++++++++
38 files changed, 314 insertions(+)
create mode 120000 doc/api/dts/conf_yaml_schema.json
create mode 100644 doc/api/dts/framework.config.rst
create mode 100644 doc/api/dts/framework.config.types.rst
create mode 100644 doc/api/dts/framework.exception.rst
create mode 100644 doc/api/dts/framework.logger.rst
create mode 100644 doc/api/dts/framework.params.eal.rst
create mode 100644 doc/api/dts/framework.params.rst
create mode 100644 doc/api/dts/framework.params.testpmd.rst
create mode 100644 doc/api/dts/framework.params.types.rst
create mode 100644 doc/api/dts/framework.parser.rst
create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.rst
create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
create mode 100644 doc/api/dts/framework.runner.rst
create mode 100644 doc/api/dts/framework.settings.rst
create mode 100644 doc/api/dts/framework.test_result.rst
create mode 100644 doc/api/dts/framework.test_suite.rst
create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.node.rst
create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.port.rst
create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
create mode 100644 doc/api/dts/framework.testbed_model.rst
create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
create mode 100644 doc/api/dts/framework.utils.rst
create mode 100644 doc/api/dts/index.rst
diff --git a/doc/api/dts/conf_yaml_schema.json b/doc/api/dts/conf_yaml_schema.json
new file mode 120000
index 0000000000..5978642d76
--- /dev/null
+++ b/doc/api/dts/conf_yaml_schema.json
@@ -0,0 +1 @@
+../../../dts/framework/config/conf_yaml_schema.json
\ No newline at end of file
diff --git a/doc/api/dts/framework.config.rst b/doc/api/dts/framework.config.rst
new file mode 100644
index 0000000000..f765ef0e32
--- /dev/null
+++ b/doc/api/dts/framework.config.rst
@@ -0,0 +1,12 @@
+config - Configuration Package
+==============================
+
+.. automodule:: framework.config
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.config.types
diff --git a/doc/api/dts/framework.config.types.rst b/doc/api/dts/framework.config.types.rst
new file mode 100644
index 0000000000..ed52bf5d3e
--- /dev/null
+++ b/doc/api/dts/framework.config.types.rst
@@ -0,0 +1,6 @@
+config.types - Configuration Types
+==================================
+
+.. automodule:: framework.config.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.exception.rst b/doc/api/dts/framework.exception.rst
new file mode 100644
index 0000000000..ad58bd15de
--- /dev/null
+++ b/doc/api/dts/framework.exception.rst
@@ -0,0 +1,6 @@
+exception - Exceptions
+======================
+
+.. automodule:: framework.exception
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.logger.rst b/doc/api/dts/framework.logger.rst
new file mode 100644
index 0000000000..3c25b34819
--- /dev/null
+++ b/doc/api/dts/framework.logger.rst
@@ -0,0 +1,6 @@
+logger - Logging Facility
+=========================
+
+.. automodule:: framework.logger
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.eal.rst b/doc/api/dts/framework.params.eal.rst
new file mode 100644
index 0000000000..3908f6d471
--- /dev/null
+++ b/doc/api/dts/framework.params.eal.rst
@@ -0,0 +1,6 @@
+eal - EAL Parameters Modelling
+==============================
+
+.. automodule:: framework.params.eal
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.rst b/doc/api/dts/framework.params.rst
new file mode 100644
index 0000000000..a273b6378a
--- /dev/null
+++ b/doc/api/dts/framework.params.rst
@@ -0,0 +1,14 @@
+params - Command Line Parameters Modelling
+==========================================
+
+.. automodule:: framework.params
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.params.eal
+ framework.params.testpmd
+ framework.params.types
diff --git a/doc/api/dts/framework.params.testpmd.rst b/doc/api/dts/framework.params.testpmd.rst
new file mode 100644
index 0000000000..5f25ed5528
--- /dev/null
+++ b/doc/api/dts/framework.params.testpmd.rst
@@ -0,0 +1,6 @@
+testpmd - TestPMD Parameters Modelling
+======================================
+
+.. automodule:: framework.params.testpmd
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.params.types.rst b/doc/api/dts/framework.params.types.rst
new file mode 100644
index 0000000000..9c68a7fab8
--- /dev/null
+++ b/doc/api/dts/framework.params.types.rst
@@ -0,0 +1,6 @@
+params.types - Parameters Modelling Types
+=========================================
+
+.. automodule:: framework.params.types
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.parser.rst b/doc/api/dts/framework.parser.rst
new file mode 100644
index 0000000000..a5e3264f35
--- /dev/null
+++ b/doc/api/dts/framework.parser.rst
@@ -0,0 +1,6 @@
+parser - Text Parsing Utilities
+===============================
+
+.. automodule:: framework.parser
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.dpdk_shell.rst b/doc/api/dts/framework.remote_session.dpdk_shell.rst
new file mode 100644
index 0000000000..4402eba4fd
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.dpdk_shell.rst
@@ -0,0 +1,6 @@
+dpdk\_shell - DPDK Interactive Remote Shell
+===========================================
+
+.. automodule:: framework.remote_session.dpdk_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_remote_session.rst b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
new file mode 100644
index 0000000000..35dc5c4b03
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_remote_session.rst
@@ -0,0 +1,6 @@
+interactive\_remote\_session - SSH Interactive Remote Session
+=============================================================
+
+.. automodule:: framework.remote_session.interactive_remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.interactive_shell.rst b/doc/api/dts/framework.remote_session.interactive_shell.rst
new file mode 100644
index 0000000000..8a59db7b6e
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.interactive_shell.rst
@@ -0,0 +1,6 @@
+interactive\_shell - Base Interactive Remote Shell
+==================================================
+
+.. automodule:: framework.remote_session.interactive_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.python_shell.rst b/doc/api/dts/framework.remote_session.python_shell.rst
new file mode 100644
index 0000000000..a8ec06f281
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.python_shell.rst
@@ -0,0 +1,6 @@
+python\_shell - Python Interactive Remote Shell
+===============================================
+
+.. automodule:: framework.remote_session.python_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.remote_session.rst b/doc/api/dts/framework.remote_session.remote_session.rst
new file mode 100644
index 0000000000..58b0960d07
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.remote_session.rst
@@ -0,0 +1,6 @@
+remote\_session - Remote Session ABC
+====================================
+
+.. automodule:: framework.remote_session.remote_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.rst b/doc/api/dts/framework.remote_session.rst
new file mode 100644
index 0000000000..4e755b1fe3
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.rst
@@ -0,0 +1,18 @@
+remote\_session - Node Connections Package
+==========================================
+
+.. automodule:: framework.remote_session
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.remote_session.remote_session
+ framework.remote_session.ssh_session
+ framework.remote_session.interactive_remote_session
+ framework.remote_session.interactive_shell
+ framework.remote_session.dpdk_shell
+ framework.remote_session.testpmd_shell
+ framework.remote_session.python_shell
diff --git a/doc/api/dts/framework.remote_session.ssh_session.rst b/doc/api/dts/framework.remote_session.ssh_session.rst
new file mode 100644
index 0000000000..05b019bc7c
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.ssh_session.rst
@@ -0,0 +1,6 @@
+ssh\_session - SSH Remote Session
+=================================
+
+.. automodule:: framework.remote_session.ssh_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.remote_session.testpmd_shell.rst b/doc/api/dts/framework.remote_session.testpmd_shell.rst
new file mode 100644
index 0000000000..14510afb2b
--- /dev/null
+++ b/doc/api/dts/framework.remote_session.testpmd_shell.rst
@@ -0,0 +1,6 @@
+testpmd\_shell - Testpmd Interactive Remote Shell
+=================================================
+
+.. automodule:: framework.remote_session.testpmd_shell
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.runner.rst b/doc/api/dts/framework.runner.rst
new file mode 100644
index 0000000000..a1708f0002
--- /dev/null
+++ b/doc/api/dts/framework.runner.rst
@@ -0,0 +1,6 @@
+runner - Testbed Setup and Test Suite Runner
+============================================
+
+.. automodule:: framework.runner
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.settings.rst b/doc/api/dts/framework.settings.rst
new file mode 100644
index 0000000000..96bf194923
--- /dev/null
+++ b/doc/api/dts/framework.settings.rst
@@ -0,0 +1,6 @@
+settings - Command Line Arguments and Environment Variables
+===========================================================
+
+.. automodule:: framework.settings
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_result.rst b/doc/api/dts/framework.test_result.rst
new file mode 100644
index 0000000000..527357a04a
--- /dev/null
+++ b/doc/api/dts/framework.test_result.rst
@@ -0,0 +1,6 @@
+test\_result - Test Results Records
+===================================
+
+.. automodule:: framework.test_result
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.test_suite.rst b/doc/api/dts/framework.test_suite.rst
new file mode 100644
index 0000000000..96f893e465
--- /dev/null
+++ b/doc/api/dts/framework.test_suite.rst
@@ -0,0 +1,6 @@
+test\_suite - Common Test Suite Features
+========================================
+
+.. automodule:: framework.test_suite
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.cpu.rst b/doc/api/dts/framework.testbed_model.cpu.rst
new file mode 100644
index 0000000000..dd2baf09fb
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.cpu.rst
@@ -0,0 +1,6 @@
+cpu - CPU Representation and Utilities
+======================================
+
+.. automodule:: framework.testbed_model.cpu
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.linux_session.rst b/doc/api/dts/framework.testbed_model.linux_session.rst
new file mode 100644
index 0000000000..141f3f49e3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.linux_session.rst
@@ -0,0 +1,6 @@
+linux\_session - Linux Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.linux_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.node.rst b/doc/api/dts/framework.testbed_model.node.rst
new file mode 100644
index 0000000000..2133dd604b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.node.rst
@@ -0,0 +1,6 @@
+node - Base Node
+================
+
+.. automodule:: framework.testbed_model.node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.os_session.rst b/doc/api/dts/framework.testbed_model.os_session.rst
new file mode 100644
index 0000000000..f3574e939a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.os_session.rst
@@ -0,0 +1,6 @@
+os\_session - OS-aware Remote Session ABC
+=========================================
+
+.. automodule:: framework.testbed_model.os_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.port.rst b/doc/api/dts/framework.testbed_model.port.rst
new file mode 100644
index 0000000000..17bd391e63
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.port.rst
@@ -0,0 +1,6 @@
+port - NIC Port Representation
+==============================
+
+.. automodule:: framework.testbed_model.port
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.posix_session.rst b/doc/api/dts/framework.testbed_model.posix_session.rst
new file mode 100644
index 0000000000..308c051ae5
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.posix_session.rst
@@ -0,0 +1,6 @@
+posix\_session - Posix Remote Session
+=====================================
+
+.. automodule:: framework.testbed_model.posix_session
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.rst b/doc/api/dts/framework.testbed_model.rst
new file mode 100644
index 0000000000..4b024e47e6
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.rst
@@ -0,0 +1,26 @@
+testbed\_model - Testbed Modelling Package
+==========================================
+
+.. automodule:: framework.testbed_model
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 2
+
+ framework.testbed_model.traffic_generator
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.os_session
+ framework.testbed_model.linux_session
+ framework.testbed_model.posix_session
+ framework.testbed_model.node
+ framework.testbed_model.sut_node
+ framework.testbed_model.tg_node
+ framework.testbed_model.cpu
+ framework.testbed_model.port
+ framework.testbed_model.virtual_device
diff --git a/doc/api/dts/framework.testbed_model.sut_node.rst b/doc/api/dts/framework.testbed_model.sut_node.rst
new file mode 100644
index 0000000000..7e12b6c87e
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.sut_node.rst
@@ -0,0 +1,6 @@
+sut\_node - System Under Test Node
+==================================
+
+.. automodule:: framework.testbed_model.sut_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.tg_node.rst b/doc/api/dts/framework.testbed_model.tg_node.rst
new file mode 100644
index 0000000000..e56db8e782
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.tg_node.rst
@@ -0,0 +1,6 @@
+tg\_node - Traffic Generator Node
+=================================
+
+.. automodule:: framework.testbed_model.tg_node
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
new file mode 100644
index 0000000000..06c087155a
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
@@ -0,0 +1,6 @@
+capturing\_traffic\_generator - Base Capturing TG ABC
+=====================================================
+
+.. automodule:: framework.testbed_model.traffic_generator.capturing_traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.rst
new file mode 100644
index 0000000000..18b6f1b98b
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.rst
@@ -0,0 +1,14 @@
+traffic\_generator Subpackage
+=============================
+
+.. automodule:: framework.testbed_model.traffic_generator
+ :members:
+ :show-inheritance:
+
+.. toctree::
+ :hidden:
+ :maxdepth: 1
+
+ framework.testbed_model.traffic_generator.traffic_generator
+ framework.testbed_model.traffic_generator.capturing_traffic_generator
+ framework.testbed_model.traffic_generator.scapy
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
new file mode 100644
index 0000000000..7062914ec3
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
@@ -0,0 +1,6 @@
+scapy - Capturing Traffic Generator
+===================================
+
+.. automodule:: framework.testbed_model.traffic_generator.scapy
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
new file mode 100644
index 0000000000..e366d7f222
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
@@ -0,0 +1,6 @@
+traffic\_generator - Base TG ABC
+================================
+
+.. automodule:: framework.testbed_model.traffic_generator.traffic_generator
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.testbed_model.virtual_device.rst b/doc/api/dts/framework.testbed_model.virtual_device.rst
new file mode 100644
index 0000000000..38e6c1d0bc
--- /dev/null
+++ b/doc/api/dts/framework.testbed_model.virtual_device.rst
@@ -0,0 +1,6 @@
+virtual\_device - Virtual Devices
+=================================
+
+.. automodule:: framework.testbed_model.virtual_device
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/framework.utils.rst b/doc/api/dts/framework.utils.rst
new file mode 100644
index 0000000000..0e7bb80666
--- /dev/null
+++ b/doc/api/dts/framework.utils.rst
@@ -0,0 +1,6 @@
+utils - Various Utilities
+=========================
+
+.. automodule:: framework.utils
+ :members:
+ :show-inheritance:
diff --git a/doc/api/dts/index.rst b/doc/api/dts/index.rst
new file mode 100644
index 0000000000..e83fa33e7d
--- /dev/null
+++ b/doc/api/dts/index.rst
@@ -0,0 +1,43 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's API documentation!
+===============================================
+
+.. automodule:: framework
+ :members:
+ :show-inheritance:
+
+Packages
+--------
+
+.. toctree::
+ :includehidden:
+ :maxdepth: 1
+
+ framework.testbed_model
+ framework.remote_session
+ framework.params
+ framework.config
+
+Modules
+-------
+
+.. toctree::
+ :maxdepth: 1
+
+ framework.runner
+ framework.test_suite
+ framework.test_result
+ framework.settings
+ framework.logger
+ framework.parser
+ framework.utils
+ framework.exception
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
--
2.34.1
* [PATCH v19 5/5] dts: add API doc generation
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
` (3 preceding siblings ...)
2024-08-21 15:02 ` [PATCH v19 4/5] dts: add API doc sources Juraj Linkeš
@ 2024-08-21 15:02 ` Juraj Linkeš
2024-08-21 15:24 ` Dean Marx
` (2 more replies)
2024-09-18 7:38 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
5 siblings, 3 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-08-21 15:02 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman
Cc: dev, Juraj Linkeš, Bruce Richardson
The tool used to generate the DTS API docs is Sphinx, which is already
in use in DPDK. The same configuration is used to preserve style, with
one DTS-specific option (so that the DPDK docs are unchanged) that
modifies how the sidebar displays the content. There is also other
Sphinx configuration related to Python docstrings, which doesn't affect
the DPDK doc build. All new configuration is in a conditional block,
applied only when the DTS API docs are built, so as not to interfere
with the DPDK doc build.
Sphinx generates the documentation from Python docstrings. The docstring
format is the Google format [0], which requires the sphinx.ext.napoleon
extension. The other extension, sphinx.ext.intersphinx, enables linking
to objects in external documentation, such as the Python documentation.
There is one requirement for building the DTS docs: the same Python
version as DTS or higher, because Sphinx's autodoc extension imports
the code. The dependencies needed to import the code don't have to be
satisfied, as the autodoc extension allows us to mock the imports. The
list of missing packages to mock is taken from the DTS pyproject.toml
file.
And finally, the DTS API docs can be accessed from the DPDK API doxygen
page.
[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
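The conditional configuration described above can be sketched as a block in the shared Sphinx conf.py, keyed on the DTS_BUILD environment variable that call-sphinx-build.py exports in this patch. This is a hedged sketch only; the option values and the mocked package names are illustrative assumptions, and the actual diff below is authoritative:

```python
# Sketch of DTS-only additions to doc/guides/conf.py (illustrative
# assumptions; see the actual patch for the committed configuration).
import os

if os.environ.get('DTS_BUILD') == 'y':
    # Extensions used only for the DTS API docs, not the DPDK guides.
    extensions = [
        'sphinx.ext.napoleon',     # parse Google-format docstrings
        'sphinx.ext.autodoc',      # generate docs by importing the code
        'sphinx.ext.intersphinx',  # link to external documentation
    ]
    napoleon_google_docstring = True
    intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
    # autodoc imports the DTS code, so third-party runtime dependencies
    # that may be absent are mocked (hypothetical list; in the patch the
    # names come from dts/pyproject.toml via get-dts-runtime-deps.py).
    autodoc_mock_imports = ['paramiko', 'scapy', 'yaml']
```

Since this is a configuration fragment read by sphinx-build rather than a standalone script, it only takes effect when Sphinx loads conf.py during a DTS API doc build.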
Cc: Bruce Richardson <bruce.richardson@intel.com>
Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
Reviewed-by: Jeremy Spewock <jspewock@iol.unh.edu>
Reviewed-by: Dean Marx <dmarx@iol.unh.edu>
---
buildtools/call-sphinx-build.py | 2 +
buildtools/get-dts-runtime-deps.py | 95 +++++++++++++++++++++++
buildtools/meson.build | 1 +
doc/api/doxy-api-index.md | 3 +
doc/api/doxy-api.conf.in | 2 +
doc/api/dts/custom.css | 1 +
doc/api/dts/meson.build | 31 ++++++++
doc/api/meson.build | 6 +-
doc/guides/conf.py | 44 ++++++++++-
doc/guides/contributing/documentation.rst | 2 +
doc/guides/contributing/patches.rst | 4 +
doc/guides/tools/dts.rst | 39 +++++++++-
doc/meson.build | 1 +
13 files changed, 228 insertions(+), 3 deletions(-)
create mode 100755 buildtools/get-dts-runtime-deps.py
create mode 120000 doc/api/dts/custom.css
create mode 100644 doc/api/dts/meson.build
diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 623e7363ee..154e9f907b 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -15,6 +15,8 @@
# set the version in environment for sphinx to pick up
os.environ['DPDK_VERSION'] = version
+if 'dts' in src:
+ os.environ['DTS_BUILD'] = "y"
sphinx_cmd = [sphinx] + extra_args
diff --git a/buildtools/get-dts-runtime-deps.py b/buildtools/get-dts-runtime-deps.py
new file mode 100755
index 0000000000..1636a6dbf5
--- /dev/null
+++ b/buildtools/get-dts-runtime-deps.py
@@ -0,0 +1,95 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 PANTHEON.tech s.r.o.
+#
+
+"""Utilities for DTS dependencies.
+
+The module can be used as an executable script,
+which verifies that the running Python version meets the version requirement of DTS.
+The script exits with the standard exit codes in this mode (0 is success, 1 is failure).
+
+The module also contains a function, get_missing_imports,
+which looks for runtime dependencies in the DTS pyproject.toml file
+and returns a list of module names used in an import statement (import packages) that are missing.
+This function is not used when the module is run as a script and is available to be imported.
+"""
+
+import configparser
+import importlib.metadata
+import importlib.util
+import os.path
+import platform
+
+from packaging.version import Version
+
+_VERSION_COMPARISON_CHARS = '^<>='
+_EXTRA_DEPS = {
+ 'invoke': {'version': '>=1.3'},
+ 'paramiko': {'version': '>=2.4'},
+ 'PyYAML': {'version': '^6.0', 'import_package': 'yaml'}
+}
+_DPDK_ROOT = os.path.dirname(os.path.dirname(__file__))
+_DTS_DEP_FILE_PATH = os.path.join(_DPDK_ROOT, 'dts', 'pyproject.toml')
+
+
+def _get_dependencies(cfg_file_path):
+ cfg = configparser.ConfigParser()
+ with open(cfg_file_path) as f:
+ dts_deps_file_str = f.read()
+ dts_deps_file_str = dts_deps_file_str.replace("\n]", "]")
+ cfg.read_string(dts_deps_file_str)
+
+ deps_section = cfg['tool.poetry.dependencies']
+ return {dep: {'version': deps_section[dep].strip('"\'')} for dep in deps_section}
+
+
+def get_missing_imports():
+ """Get missing DTS import packages from third party libraries.
+
+ Scan the DTS pyproject.toml file for dependencies and find those that are not installed.
+ The dependencies in pyproject.toml are listed by their distribution package names,
+ but the function finds the associated import packages - those used in import statements.
+
+ The function is not used when the module is run as a script. It should be imported.
+
+ Returns:
+ A list of missing import packages.
+ """
+ missing_imports = []
+ req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
+ req_deps.pop('python')
+
+ for req_dep, dep_data in (req_deps | _EXTRA_DEPS).items():
+ req_ver = dep_data['version']
+ try:
+ import_package = dep_data['import_package']
+ except KeyError:
+ import_package = req_dep
+ import_package = import_package.lower().replace('-', '_')
+
+ try:
+ req_ver = Version(req_ver.strip(_VERSION_COMPARISON_CHARS))
+ found_dep_ver = Version(importlib.metadata.version(req_dep))
+ if found_dep_ver < req_ver:
+ print(
+ f'The version "{found_dep_ver}" of package "{req_dep}" '
+ f'is lower than required "{req_ver}".'
+ )
+ except importlib.metadata.PackageNotFoundError:
+ print(f'Package "{req_dep}" not found.')
+ missing_imports.append(import_package)
+
+ return missing_imports
+
+
+if __name__ == '__main__':
+ python_version = _get_dependencies(_DTS_DEP_FILE_PATH).pop('python')
+ if python_version:
+ sys_ver = Version(platform.python_version())
+ req_ver = Version(python_version['version'].strip(_VERSION_COMPARISON_CHARS))
+ if sys_ver < req_ver:
+ print(
+ f'The available Python version "{sys_ver}" is lower than required "{req_ver}".'
+ )
+ exit(1)
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..6b938d767c 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,7 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
get_test_suites_cmd = py3 + files('get-test-suites.py')
has_hugepages_cmd = py3 + files('has-hugepages.py')
cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+get_dts_runtime_deps = py3 + files('get-dts-runtime-deps.py')
# install any build tools that end-users might want also
install_data([
diff --git a/doc/api/doxy-api-index.md b/doc/api/doxy-api-index.md
index f9f0300126..ab223bcdf7 100644
--- a/doc/api/doxy-api-index.md
+++ b/doc/api/doxy-api-index.md
@@ -245,3 +245,6 @@ The public API headers are grouped by topics:
[experimental APIs](@ref rte_compat.h),
[ABI versioning](@ref rte_function_versioning.h),
[version](@ref rte_version.h)
+
+- **tests**:
+ [**DTS**](@dts_api_main_page)
diff --git a/doc/api/doxy-api.conf.in b/doc/api/doxy-api.conf.in
index a8823c046f..c94f02d411 100644
--- a/doc/api/doxy-api.conf.in
+++ b/doc/api/doxy-api.conf.in
@@ -124,6 +124,8 @@ SEARCHENGINE = YES
SORT_MEMBER_DOCS = NO
SOURCE_BROWSER = YES
+ALIASES = "dts_api_main_page=@DTS_API_MAIN_PAGE@"
+
EXAMPLE_PATH = @TOPDIR@/examples
EXAMPLE_PATTERNS = *.c
EXAMPLE_RECURSIVE = YES
diff --git a/doc/api/dts/custom.css b/doc/api/dts/custom.css
new file mode 120000
index 0000000000..3c9480c4a0
--- /dev/null
+++ b/doc/api/dts/custom.css
@@ -0,0 +1 @@
+../../guides/custom.css
\ No newline at end of file
diff --git a/doc/api/dts/meson.build b/doc/api/dts/meson.build
new file mode 100644
index 0000000000..f338eb69bf
--- /dev/null
+++ b/doc/api/dts/meson.build
@@ -0,0 +1,31 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
+if not sphinx.found()
+ subdir_done()
+endif
+
+python_ver_satisfied = run_command(get_dts_runtime_deps, check: false).returncode()
+if python_ver_satisfied != 0
+ subdir_done()
+endif
+
+cdata.set('DTS_API_MAIN_PAGE', join_paths('..', 'dts', 'html', 'index.html'))
+
+extra_sphinx_args = ['-E', '-c', join_paths(doc_source_dir, 'guides')]
+if get_option('werror')
+ extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
+dts_api_html = custom_target('dts_api_html',
+ output: 'html',
+ command: [sphinx_wrapper, sphinx, meson.project_version(),
+ meson.current_source_dir(), meson.current_build_dir(), extra_sphinx_args],
+ build_by_default: get_option('enable_docs'),
+ install: get_option('enable_docs'),
+ install_dir: htmldir)
+
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 5b50692df9..71b861e42b 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,11 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+# initialize common Doxygen configuration
+cdata = configuration_data()
+
+subdir('dts')
+
doxygen = find_program('doxygen', required: get_option('enable_docs'))
if not doxygen.found()
@@ -30,7 +35,6 @@ example = custom_target('examples.dox',
build_by_default: get_option('enable_docs'))
# set up common Doxygen configuration
-cdata = configuration_data()
cdata.set('VERSION', meson.project_version())
cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..d7f3030838 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -10,7 +10,7 @@
from os.path import basename
from os.path import dirname
from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
import configparser
@@ -58,6 +58,48 @@
("tools/devbind", "dpdk-devbind",
"check device status and bind/unbind them from drivers", "", 8)]
+# DTS API docs additional configuration
+if environ.get('DTS_BUILD'):
+ extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
+ # Napoleon enables the Google format of Python doscstrings.
+ napoleon_numpy_docstring = False
+ napoleon_attr_annotations = True
+ napoleon_preprocess_types = True
+
+ # Autodoc pulls documentation from code.
+ autodoc_default_options = {
+ 'members': True,
+ 'member-order': 'bysource',
+ 'show-inheritance': True,
+ }
+ autodoc_class_signature = 'separated'
+ autodoc_typehints = 'both'
+ autodoc_typehints_format = 'short'
+ autodoc_typehints_description_target = 'documented'
+
+ # Intersphinx allows linking to external projects, such as Python docs.
+ intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
+
+ # DTS docstring options.
+ add_module_names = False
+ toc_object_entries = True
+ toc_object_entries_show_parents = 'hide'
+ # DTS Sidebar config.
+ html_theme_options = {
+ 'collapse_navigation': False,
+ 'navigation_depth': -1, # unlimited depth
+ }
+
+ # Add path to DTS sources so that Sphinx can find them.
+ dpdk_root = dirname(dirname(dirname(__file__)))
+ path.append(path_join(dpdk_root, 'dts'))
+
+ # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
+ path.append(path_join(dpdk_root, 'buildtools'))
+ import importlib
+ # Ignore missing imports from DTS dependencies.
+ autodoc_mock_imports = importlib.import_module('get-dts-runtime-deps').get_missing_imports()
+
# ####### :numref: fallback ########
# The following hook functions add some simple handling for the :numref:
diff --git a/doc/guides/contributing/documentation.rst b/doc/guides/contributing/documentation.rst
index 68454ae0d5..7b287ce631 100644
--- a/doc/guides/contributing/documentation.rst
+++ b/doc/guides/contributing/documentation.rst
@@ -133,6 +133,8 @@ added to by the developer.
Building the Documentation
--------------------------
+.. _doc_dependencies:
+
Dependencies
~~~~~~~~~~~~
diff --git a/doc/guides/contributing/patches.rst b/doc/guides/contributing/patches.rst
index 04c66bebc4..6629928bee 100644
--- a/doc/guides/contributing/patches.rst
+++ b/doc/guides/contributing/patches.rst
@@ -499,6 +499,10 @@ The script usage is::
For both of the above scripts, the -n option is used to specify a number of commits from HEAD,
and the -r option allows the user specify a ``git log`` range.
+Additionally, when contributing to the DTS tool, patches should also be checked using
+the ``dts-check-format.sh`` script in the ``devtools`` directory of the DPDK repo.
+To run the script, extra :ref:`Python dependencies <dts_deps>` are needed.
+
.. _contrib_check_compilation:
Checking Compilation
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 515b15e4d8..9e8929f567 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -54,6 +54,7 @@ DTS uses Poetry as its Python dependency management.
Python build/development and runtime environments are the same and DTS development environment,
DTS runtime environment or just plain DTS environment are used interchangeably.
+.. _dts_deps:
Setting up DTS environment
~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -291,8 +292,15 @@ When adding code to the DTS framework, pay attention to the rest of the code
and try not to divert much from it.
The :ref:`DTS developer tools <dts_dev_tools>` will issue warnings
when some of the basics are not met.
+You should also build the :ref:`API documentation <building_api_docs>`
+to address any issues found during the build.
-The code must be properly documented with docstrings.
+The API documentation, which is a helpful reference when developing, may be accessed
+in the code directly or generated with the :ref:`API docs build steps <building_api_docs>`.
+When adding new files or modifying the directory structure,
+the corresponding changes must be made to DTS api doc sources in ``doc/api/dts``.
+
+In addition, the code must be properly documented with docstrings.
The style must conform to the `Google style
<https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
See an example of the style `here
@@ -427,6 +435,35 @@ the DTS code check and format script.
Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+.. _building_api_docs:
+
+Building DTS API docs
+---------------------
+
+The documentation is built using the standard DPDK build system.
+See :doc:`../linux_gsg/build_dpdk` for more details on compiling DPDK with meson.
+
+The :ref:`doc build dependencies <doc_dependencies>` may be installed with Poetry:
+
+.. code-block:: console
+
+ poetry install --no-root --only docs
+ poetry install --no-root --with docs # an alternative that will also install DTS dependencies
+ poetry shell
+
+After executing the meson command, build the documentation with:
+
+.. code-block:: console
+
+ ninja -C build doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+ Make sure to fix any Sphinx warnings when adding or updating docstrings.
+
+
Configuration Schema
--------------------
diff --git a/doc/meson.build b/doc/meson.build
index 6f74706aa2..1e0cfa4127 100644
--- a/doc/meson.build
+++ b/doc/meson.build
@@ -1,6 +1,7 @@
# SPDX-License-Identifier: BSD-3-Clause
# Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
+doc_source_dir = meson.current_source_dir()
doc_targets = []
doc_target_names = []
subdir('api')
--
2.34.1
^ permalink raw reply [flat|nested] 393+ messages in thread
* Re: [PATCH v19 5/5] dts: add API doc generation
2024-08-21 15:02 ` [PATCH v19 5/5] dts: add API doc generation Juraj Linkeš
@ 2024-08-21 15:24 ` Dean Marx
2024-09-02 10:57 ` Luca Vizzarro
2024-09-12 20:09 ` Thomas Monjalon
2 siblings, 0 replies; 393+ messages in thread
From: Dean Marx @ 2024-08-21 15:24 UTC (permalink / raw)
To: Juraj Linkeš
Cc: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, alex.chapman, dev, Bruce Richardson
On Wed, Aug 21, 2024 at 11:03 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:
> The tool used to generate DTS API docs is Sphinx, which is already in
> use in DPDK. The same configuration is used to preserve style with one
> DTS-specific configuration (so that the DPDK docs are unchanged) that
> modifies how the sidebar displays the content. There's other Sphinx
> configuration related to Python docstrings which doesn't affect DPDK doc
> build. All new configuration is in a conditional block, applied only
> when DTS API docs are built to not interfere with DPDK doc build.
>
> Sphinx generates the documentation from Python docstrings. The docstring
> format is the Google format [0] which requires the sphinx.ext.napoleon
> extension. The other extension, sphinx.ext.intersphinx, enables linking
> to objects in external documentations, such as the Python documentation.
>
> There is one requirement for building DTS docs - the same Python version
> as DTS or higher, because Sphinx's autodoc extension imports the code.
>
> The dependencies needed to import the code don't have to be satisfied,
> as the autodoc extension allows us to mock the imports. The missing
> packages are taken from the DTS pyproject.toml file.
>
> And finally, the DTS API docs can be accessed from the DPDK API doxygen
> page.
>
Tested-by: Dean Marx <dmarx@iol.unh.edu>
* Re: [PATCH v19 5/5] dts: add API doc generation
2024-08-21 15:02 ` [PATCH v19 5/5] dts: add API doc generation Juraj Linkeš
2024-08-21 15:24 ` Dean Marx
@ 2024-09-02 10:57 ` Luca Vizzarro
2024-09-12 20:09 ` Thomas Monjalon
2 siblings, 0 replies; 393+ messages in thread
From: Luca Vizzarro @ 2024-09-02 10:57 UTC (permalink / raw)
To: Juraj Linkeš,
thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
npratte, dmarx, alex.chapman
Cc: dev, Bruce Richardson
Reviewed-by: Luca Vizzarro <luca.vizzarro@arm.com>
* Re: [PATCH v19 5/5] dts: add API doc generation
2024-08-21 15:02 ` [PATCH v19 5/5] dts: add API doc generation Juraj Linkeš
2024-08-21 15:24 ` Dean Marx
2024-09-02 10:57 ` Luca Vizzarro
@ 2024-09-12 20:09 ` Thomas Monjalon
2024-09-16 8:51 ` Juraj Linkeš
2 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-09-12 20:09 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman, dev,
Juraj Linkeš,
Bruce Richardson
21/08/2024 17:02, Juraj Linkeš:
> +if 'dts' in src:
> + os.environ['DTS_BUILD'] = "y"
That's more precisely "DTS doc build".
I think the variable name DTS_BUILD may be confusing.
[...]
> --- /dev/null
> +++ b/buildtools/get-dts-runtime-deps.py
> @@ -0,0 +1,95 @@
> +#!/usr/bin/env python3
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2024 PANTHEON.tech s.r.o.
> +#
extra blank line above can be removed
> +
> +"""Utilities for DTS dependencies.
> +
> +The module can be used as an executable script,
> +which verifies that the running Python version meets the version requirement of DTS.
> +The script exits with the standard exit codes in this mode (0 is success, 1 is failure).
Given it is just doing a check by default,
the script name could be "check-dts-requirements".
> +
> +The module also contains a function, get_missing_imports,
> +which looks for runtime dependencies in the DTS pyproject.toml file
> +and returns a list of module names used in an import statement (import packages) that are missing.
> +This function is not used when the module is run as a script and is available to be imported.
> +"""
[...]
> + req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
> + req_deps.pop('python')
> +
> + for req_dep, dep_data in (req_deps | _EXTRA_DEPS).items():
Please could you explain somewhere why _EXTRA_DEPS is needed?
[...]
> +++ b/doc/api/dts/meson.build
> @@ -0,0 +1,31 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
> +if not sphinx.found()
> + subdir_done()
> +endif
> +
> +python_ver_satisfied = run_command(get_dts_runtime_deps, check: false).returncode()
> +if python_ver_satisfied != 0
> + subdir_done()
> +endif
Looks simple.
So if I have the right Python but some dependencies are missing,
it will still work the same, right?
I feel the need for dependencies should be explained in the script.
[...]
> --- a/doc/api/meson.build
> +++ b/doc/api/meson.build
> @@ -1,6 +1,11 @@
> # SPDX-License-Identifier: BSD-3-Clause
> # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
>
> +# initialize common Doxygen configuration
> +cdata = configuration_data()
> +
> +subdir('dts')
Why inserting DTS first before generating DPDK API doc?
[...]
> # set up common Doxygen configuration
> -cdata = configuration_data()
> cdata.set('VERSION', meson.project_version())
> cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
> cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
> diff --git a/doc/guides/conf.py b/doc/guides/conf.py
> index 0f7ff5282d..d7f3030838 100644
> --- a/doc/guides/conf.py
> +++ b/doc/guides/conf.py
> @@ -10,7 +10,7 @@
> from os.path import basename
> from os.path import dirname
> from os.path import join as path_join
> -from sys import argv, stderr
> +from sys import argv, stderr, path
>
> import configparser
>
> @@ -58,6 +58,48 @@
> ("tools/devbind", "dpdk-devbind",
> "check device status and bind/unbind them from drivers", "", 8)]
>
> +# DTS API docs additional configuration
> +if environ.get('DTS_BUILD'):
> + extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
> + # Napoleon enables the Google format of Python docstrings.
> + napoleon_numpy_docstring = False
> + napoleon_attr_annotations = True
> + napoleon_preprocess_types = True
> +
> + # Autodoc pulls documentation from code.
> + autodoc_default_options = {
> + 'members': True,
> + 'member-order': 'bysource',
> + 'show-inheritance': True,
> + }
> + autodoc_class_signature = 'separated'
> + autodoc_typehints = 'both'
> + autodoc_typehints_format = 'short'
> + autodoc_typehints_description_target = 'documented'
> +
> + # Intersphinx allows linking to external projects, such as Python docs.
> + intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
I'm not sure about the need for this intersphinx.
> +
> + # DTS docstring options.
> + add_module_names = False
> + toc_object_entries = True
> + toc_object_entries_show_parents = 'hide'
> + # DTS Sidebar config.
> + html_theme_options = {
> + 'collapse_navigation': False,
> + 'navigation_depth': -1, # unlimited depth
> + }
> +
> + # Add path to DTS sources so that Sphinx can find them.
> + dpdk_root = dirname(dirname(dirname(__file__)))
> + path.append(path_join(dpdk_root, 'dts'))
> +
> + # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
> + path.append(path_join(dpdk_root, 'buildtools'))
> + import importlib
> + # Ignore missing imports from DTS dependencies.
> + autodoc_mock_imports = importlib.import_module('get-dts-runtime-deps').get_missing_imports()
[...]
> +the corresponding changes must be made to DTS api doc sources in ``doc/api/dts``.
api -> API
Except minor corrections and explanations, it looks good.
You can add my ack to the final version.
Acked-by: Thomas Monjalon <thomas@monjalon.net>
* Re: [PATCH v19 5/5] dts: add API doc generation
2024-09-12 20:09 ` Thomas Monjalon
@ 2024-09-16 8:51 ` Juraj Linkeš
2024-09-16 12:48 ` Thomas Monjalon
0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2024-09-16 8:51 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman, dev,
Bruce Richardson
On 12. 9. 2024 22:09, Thomas Monjalon wrote:
> 21/08/2024 17:02, Juraj Linkeš:
>> +if 'dts' in src:
>> + os.environ['DTS_BUILD'] = "y"
>
> That's more precisely "DTS doc build".
> I think the variable name DTS_BUILD may be confusing.
Ack, I'll rename the variable.
>
> [...]
>> --- /dev/null
>> +++ b/buildtools/get-dts-runtime-deps.py
>> @@ -0,0 +1,95 @@
>> +#!/usr/bin/env python3
>> +# SPDX-License-Identifier: BSD-3-Clause
>> +# Copyright(c) 2024 PANTHEON.tech s.r.o.
>> +#
>
> extra blank line above can be removed
>
Ack.
>> +
>> +"""Utilities for DTS dependencies.
>> +
>> +The module can be used as an executable script,
>> +which verifies that the running Python version meets the version requirement of DTS.
>> +The script exits with the standard exit codes in this mode (0 is success, 1 is failure).
>
> Given it is just doing a check by default,
> the script name could be "check-dts-requirements".
>
Ack.
>> +
>> +The module also contains a function, get_missing_imports,
>> +which looks for runtime dependencies in the DTS pyproject.toml file
>> +and returns a list of module names used in an import statement (import packages) that are missing.
>> +This function is not used when the module is run as a script and is available to be imported.
>> +"""
> [...]
>> + req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
>> + req_deps.pop('python')
>> +
>> + for req_dep, dep_data in (req_deps | _EXTRA_DEPS).items():
>
> Please could you explain somewhere why _EXTRA_DEPS is needed?
I'll add this comment above the variable:
# The names of packages used in import statements may be different from
# distribution package names.
# We get distribution package names from pyproject.toml.
# _EXTRA_DEPS adds those import names which don't match their
# distribution package name.
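As a hedged illustration of what that comment describes (the package names below are only examples, and this is a simplified sketch of the patch's name-normalization logic, not DTS's exact code):

```python
# Distribution package names come from pyproject.toml; import package names
# are what appears in `import` statements. _EXTRA_DEPS records the cases
# where the two differ (e.g. the PyYAML distribution is imported as `yaml`).
_EXTRA_DEPS = {'PyYAML': {'import_package': 'yaml'}}


def import_name(dist_name, extra=_EXTRA_DEPS):
    """Derive an import package name from a distribution package name."""
    name = extra.get(dist_name, {}).get('import_package', dist_name)
    # PEP 503-style normalization: lowercase, hyphens become underscores.
    return name.lower().replace('-', '_')


print(import_name('scapy'))         # scapy
print(import_name('types-PyYAML'))  # types_pyyaml
print(import_name('PyYAML'))        # yaml (taken from _EXTRA_DEPS)
```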
>
> [...]
>> +++ b/doc/api/dts/meson.build
>> @@ -0,0 +1,31 @@
>> +# SPDX-License-Identifier: BSD-3-Clause
>> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
>> +
>> +sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
>> +if not sphinx.found()
>> + subdir_done()
>> +endif
>> +
>> +python_ver_satisfied = run_command(get_dts_runtime_deps, check: false).returncode()
>> +if python_ver_satisfied != 0
>> + subdir_done()
>> +endif
>
> Looks simple.
> So if I have the right Python but some dependencies are missing,
> it will still work the same, right?
Yes.
> I feel the need for dependencies should be explained in the script.
From my point of view, the script gets the dependencies and it's up to
the caller how they use the list of dependencies.
The caller is conf.py and there's a bit of an explanation:
# Get missing DTS dependencies. Add path to buildtools to find the
# get_missing_imports function.
And then:
# Ignore missing imports from DTS dependencies.
So basically get the dependencies so we know what to ignore.
But I could add something to the script if this is not enough.
>
> [...]
>> --- a/doc/api/meson.build
>> +++ b/doc/api/meson.build
>> @@ -1,6 +1,11 @@
>> # SPDX-License-Identifier: BSD-3-Clause
>> # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
>>
>> +# initialize common Doxygen configuration
>> +cdata = configuration_data()
>> +
>> +subdir('dts')
>
> Why inserting DTS first before generating DPDK API doc?
>
I wanted to put it before subdir_done(). Maybe we could put
subdir('dts') in the else branch and also at the end of the meson.build
file. That could be better.
> [...]
>> # set up common Doxygen configuration
>> -cdata = configuration_data()
>> cdata.set('VERSION', meson.project_version())
>> cdata.set('API_EXAMPLES', join_paths(dpdk_build_root, 'doc', 'api', 'examples.dox'))
>> cdata.set('OUTPUT', join_paths(dpdk_build_root, 'doc', 'api'))
>> diff --git a/doc/guides/conf.py b/doc/guides/conf.py
>> index 0f7ff5282d..d7f3030838 100644
>> --- a/doc/guides/conf.py
>> +++ b/doc/guides/conf.py
>> @@ -10,7 +10,7 @@
>> from os.path import basename
>> from os.path import dirname
>> from os.path import join as path_join
>> -from sys import argv, stderr
>> +from sys import argv, stderr, path
>>
>> import configparser
>>
>> @@ -58,6 +58,48 @@
>> ("tools/devbind", "dpdk-devbind",
>> "check device status and bind/unbind them from drivers", "", 8)]
>>
>> +# DTS API docs additional configuration
>> +if environ.get('DTS_BUILD'):
>> + extensions = ['sphinx.ext.napoleon', 'sphinx.ext.autodoc', 'sphinx.ext.intersphinx']
>> + # Napoleon enables the Google format of Python docstrings.
>> + napoleon_numpy_docstring = False
>> + napoleon_attr_annotations = True
>> + napoleon_preprocess_types = True
>> +
>> + # Autodoc pulls documentation from code.
>> + autodoc_default_options = {
>> + 'members': True,
>> + 'member-order': 'bysource',
>> + 'show-inheritance': True,
>> + }
>> + autodoc_class_signature = 'separated'
>> + autodoc_typehints = 'both'
>> + autodoc_typehints_format = 'short'
>> + autodoc_typehints_description_target = 'documented'
>> +
>> + # Intersphinx allows linking to external projects, such as Python docs.
>> + intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
>
> I'm not sure about the need for this intersphinx.
It's not strictly needed, but it produces better documentation, with
links to Python docs for classes and other things found there.
For example:
:class:`~argparse.Action` in a docstring will link to
https://docs.python.org/3/library/argparse.html#argparse.Action
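A minimal docstring sketch showing such a role in the Google format (the
function here is hypothetical, used only to carry the docstring):

```python
def make_action():
    """Build a custom argparse action (illustrative example only).

    Returns:
        A subclass of :class:`~argparse.Action`; with intersphinx enabled,
        the role above is rendered as a link into the Python standard
        library documentation.
    """
```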
>
>> +
>> + # DTS docstring options.
>> + add_module_names = False
>> + toc_object_entries = True
>> + toc_object_entries_show_parents = 'hide'
>> + # DTS Sidebar config.
>> + html_theme_options = {
>> + 'collapse_navigation': False,
>> + 'navigation_depth': -1, # unlimited depth
>> + }
>> +
>> + # Add path to DTS sources so that Sphinx can find them.
>> + dpdk_root = dirname(dirname(dirname(__file__)))
>> + path.append(path_join(dpdk_root, 'dts'))
>> +
>> + # Get missing DTS dependencies. Add path to buildtools to find the get_missing_imports function.
>> + path.append(path_join(dpdk_root, 'buildtools'))
>> + import importlib
>> + # Ignore missing imports from DTS dependencies.
>> + autodoc_mock_imports = importlib.import_module('get-dts-runtime-deps').get_missing_imports()
> [...]
>> +the corresponding changes must be made to DTS api doc sources in ``doc/api/dts``.
>
> api -> API
>
Ack.
>
> Except minor corrections and explanations, it looks good.
> You can add my ack to the final version.
>
> Acked-by: Thomas Monjalon <thomas@monjalon.net>
>
>
* Re: [PATCH v19 5/5] dts: add API doc generation
2024-09-16 8:51 ` Juraj Linkeš
@ 2024-09-16 12:48 ` Thomas Monjalon
2024-09-17 15:10 ` Juraj Linkeš
0 siblings, 1 reply; 393+ messages in thread
From: Thomas Monjalon @ 2024-09-16 12:48 UTC (permalink / raw)
To: Juraj Linkeš
Cc: Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman, dev,
Bruce Richardson
16/09/2024 10:51, Juraj Linkeš:
> On 12. 9. 2024 22:09, Thomas Monjalon wrote:
> > 21/08/2024 17:02, Juraj Linkeš:
> >> + req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
> >> + req_deps.pop('python')
> >> +
> >> + for req_dep, dep_data in (req_deps | _EXTRA_DEPS).items():
> >
> > Please could you explain somewhere why _EXTRA_DEPS is needed?
>
> I'll add this comment above the variable:
> # The names of packages used in import statements may be different from
> # distribution package names.
> # We get distribution package names from pyproject.toml.
> # _EXTRA_DEPS adds those import names which don't match their
> # distribution package name.
Good
> > I feel the need for dependencies should be explained in the script.
>
> From my point of view, the script gets the dependencies and it's up to
> the caller how they use the list of dependencies.
>
> The caller is conf.py and there's a bit of an explanation:
> # Get missing DTS dependencies. Add path to buildtools to find the
> # get_missing_imports function.
>
> And then:
> # Ignore missing imports from DTS dependencies.
>
> So basically get the dependencies so we know what to ignore.
>
> But I could add something to the script if this is not enough.
The unclear part is how it works without these dependencies.
> >> +# initialize common Doxygen configuration
> >> +cdata = configuration_data()
> >> +
> >> +subdir('dts')
> >
> > Why inserting DTS first before generating DPDK API doc?
>
> I wanted to put it before subdir_done(). Maybe we could put
> subdir('dts') in the else branch and also at the end of the meson.build
> file. That could be better.
Yes
> >> + # Intersphinx allows linking to external projects, such as Python docs.
> >> + intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
> >
> > I'm not sure about the need for this intersphinx.
>
> It's not strictly needed, but it produces better documentation, with
> links to Python docs for classes and other things found there.
>
> For example:
> :class:`~argparse.Action` in a docstring will link to
> https://docs.python.org/3/library/argparse.html#argparse.Action
If you think it helps, I'm fine.
* Re: [PATCH v19 5/5] dts: add API doc generation
2024-09-16 12:48 ` Thomas Monjalon
@ 2024-09-17 15:10 ` Juraj Linkeš
0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-09-17 15:10 UTC (permalink / raw)
To: Thomas Monjalon
Cc: Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman, dev,
Bruce Richardson
On 16. 9. 2024 14:48, Thomas Monjalon wrote:
> 16/09/2024 10:51, Juraj Linkeš:
>> On 12. 9. 2024 22:09, Thomas Monjalon wrote:
>>> 21/08/2024 17:02, Juraj Linkeš:
>>>> + req_deps = _get_dependencies(_DTS_DEP_FILE_PATH)
>>>> + req_deps.pop('python')
>>>> +
>>>> + for req_dep, dep_data in (req_deps | _EXTRA_DEPS).items():
>>>
>>> Please could you explain somewhere why _EXTRA_DEPS is needed?
>>
>> I'll add this comment above the variable:
>> # The names of packages used in import statements may be different from
>> # distribution package names.
>> # We get distribution package names from pyproject.toml.
>> # _EXTRA_DEPS adds those import names which don't match their
>> # distribution package name.
>
> Good
>
Ack.
>
>>> I feel the need for dependencies should be explained in the script.
>>
>> From my point of view, the script gets the dependencies and it's up to
>> the caller how they use the list of dependencies.
>>
>> The caller is conf.py and there's a bit of an explanation:
>> # Get missing DTS dependencies. Add path to buildtools to find the
>> # get_missing_imports function.
>>
>> And then:
>> # Ignore missing imports from DTS dependencies.
>>
>> So basically get the dependencies so we know what to ignore.
>>
>> But I could add something to the script if this is not enough.
>
> The unclear part is how it works without these dependencies.
>
It's a bit unclear to me as well. The docs [0] don't say much and the
only thing I found is that object paths from third party libraries are a
bit different:
without dependencies: fabric.Connection
with dependencies: fabric.connection.Connection
Everything else seems to be the same. Do we want to document this and if
so, where would be the best place? In the script or conf.py?
[0]
https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#confval-autodoc_mock_imports
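A simplified sketch of why both paths resolve when a package is mocked (this
is not Sphinx's actual mock implementation, and 'fabric' is just an example
of an uninstalled dependency):

```python
import sys
import types


class _MockModule(types.ModuleType):
    """Stand-in for a missing package: any attribute access returns the
    mock itself, so arbitrarily deep dotted paths resolve without error."""

    def __getattr__(self, name):
        return self


# Register the mock so `import fabric` succeeds even when fabric is absent,
# similar in spirit to what autodoc_mock_imports does for listed packages.
sys.modules['fabric'] = _MockModule('fabric')

import fabric

# Both the short and the fully qualified path resolve to the same mock
# object, which is why autodoc may record `fabric.Connection` rather than
# `fabric.connection.Connection` when the real package is mocked.
short = fabric.Connection
full = fabric.connection.Connection
print(short is full)  # True
```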
>
>>>> +# initialize common Doxygen configuration
>>>> +cdata = configuration_data()
>>>> +
>>>> +subdir('dts')
>>>
>>> Why inserting DTS first before generating DPDK API doc?
>>
>> I wanted to put it before subdir_done(). Maybe we could put
>> subdir('dts') in the else branch and also at the end of the meson.build
>> file. That could be better.
>
> Yes
>
Ok, I'll make the change.
>
>>>> + # Intersphinx allows linking to external projects, such as Python docs.
>>>> + intersphinx_mapping = {'python': ('https://docs.python.org/3', None)}
>>>
>>> I'm not sure about the need for this intersphinx.
>>
>> It's not strictly needed, but it produces better documentation, with
>> links to Python docs for classes and other things found there.
>>
>> For example:
>> :class:`~argparse.Action` in a docstring will link to
>> https://docs.python.org/3/library/argparse.html#argparse.Action
>
> If you think it helps, I'm fine.
>
Absolutely, it'll make the docs easier to use.
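To show the whole picture, this is roughly what the setup plus a docstring using it looks like (the mapping is the one from the patch; the function and its docstring are illustrative):

```python
# conf.py fragment: enable autodoc and intersphinx.
extensions = ["sphinx.ext.autodoc", "sphinx.ext.intersphinx"]

# Intersphinx allows linking to external projects, such as Python docs.
intersphinx_mapping = {"python": ("https://docs.python.org/3", None)}

def handler(action):
    """Process a parser action.

    Args:
        action: A :class:`~argparse.Action` instance; with intersphinx,
            the reference links straight to the CPython documentation.
    """
```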
>
* Re: [PATCH v19 0/5] DTS API docs generation
2024-08-21 15:02 ` [PATCH v19 0/5] DTS API docs generation Juraj Linkeš
` (4 preceding siblings ...)
2024-08-21 15:02 ` [PATCH v19 5/5] dts: add API doc generation Juraj Linkeš
@ 2024-09-18 7:38 ` Juraj Linkeš
5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2024-09-18 7:38 UTC (permalink / raw)
To: thomas, Honnappa.Nagarahalli, jspewock, probb, paul.szczepanek,
Luca.Vizzarro, npratte, dmarx, alex.chapman
Cc: dev
On 21. 8. 2024 17:02, Juraj Linkeš wrote:
> The generation is done with Sphinx, which DPDK already uses, with
> slightly modified configuration of the sidebar present in an if block.
>
> DTS dependencies do not need to be installed, but there is the option to
> install doc build dependencies with Poetry:
> poetry install --with docs
>
> The build itself may be run with:
> meson setup <meson_build_dir> -Denable_docs=true
> ninja -C <meson_build_dir>
>
> The above will do a full DPDK build with docs. To build just docs:
> meson setup <meson_build_dir>
> ninja -C <meson_build_dir> doc
>
> Python3.10 is required to build the DTS API docs.
>
> The patchset contains the .rst sources which Sphinx uses to generate the
> html pages. These were first generated with the sphinx-apidoc utility
> and modified to provide a better look. The documentation just doesn't
> look that good without the modifications and there aren't enough
> configuration options to achieve that without manual changes to the .rst
> files. This introduces extra maintenance which involves adding new .rst
> files when a new Python module is added or changing the .rst structure
> if the Python directory/file structure is changed (moved, renamed
> files). This small maintenance burden is outweighed by the flexibility
> afforded by the ability to make manual changes to the .rst files.
>
> v10:
> Fix dts doc generation issue: Only copy the custom css file if it exists.
>
> v11:
> Added the config option autodoc_mock_imports, which eliminates the need
> for DTS dependencies. Added a script that finds out which imports need
> to be added to autodoc_mock_imports. The script also checks the required
> Python version for building DTS docs.
> Removed tags from the two affected patches which will need to be
> reviewed again.
>
> v12:
> Added paramiko to the required dependencies of get-dts-deps.py.
>
> v13:
> Fixed build error:
> TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
>
> v14:
> Fixed install error:
> ERROR: File 'dts/doc/html' could not be found
> This required me to put the built docs into dts/doc which is outside the
> DPDK API doc dir, resulting in linking between DPDK and DTS api docs not
> working properly. I addressed this by adding a symlink to the build dir.
> This way the link works after installing the docs and the symlink is
> just one extra file in the build dir.
>
> v15:
> Moved DTS API sources to doc/api/dts. This simplifies a lot of things in
> the build, but mainly makes a lot of sense. Now the source, build and
> install paths are the same so there isn't any need for any symlinks or
> other workarounds.
> Also added a symlink to the custom.css file so that it works with
> call-sphinx-build.py without any modifications.
>
> v16:
> Renamed the dependency Python file to get-dts-runtime-deps.py and modified
> it to only get runtime dependencies. We don't need to check docs
> dependencies (Sphinx) as we don't need to mock those.
> Also moved all new Sphinx configuration into the DTS if branch to make
> sure it won't ever affect the DPDK doc build.
>
> v17:
> Removed the dts-doc build target to mirror the functionality of using
> -Denable_docs=true.
> Moved DTS-specific meson build code to doc/api/dts/meson.build.
> Added comments to get_missing_imports() and the top level docstring of
> get-dts-runtime-deps.py to explain the function is there to be imported.
>
> v18:
> Added PyYAML to get-dts-runtime-deps.py.
>
> v19:
> Fixed a regression in get-dts-runtime-deps.py introduced in v18:
> AttributeError: 'dict' object has no attribute 'strip'
>
> Juraj Linkeš (5):
> dts: update params and parser docstrings
> dts: replace the or operator in third party types
> dts: add doc generation dependencies
> dts: add API doc sources
> dts: add API doc generation
>
> buildtools/call-sphinx-build.py | 2 +
> buildtools/get-dts-runtime-deps.py | 95 ++++
> buildtools/meson.build | 1 +
> doc/api/doxy-api-index.md | 3 +
> doc/api/doxy-api.conf.in | 2 +
> doc/api/dts/conf_yaml_schema.json | 1 +
> doc/api/dts/custom.css | 1 +
> doc/api/dts/framework.config.rst | 12 +
> doc/api/dts/framework.config.types.rst | 6 +
> doc/api/dts/framework.exception.rst | 6 +
> doc/api/dts/framework.logger.rst | 6 +
> doc/api/dts/framework.params.eal.rst | 6 +
> doc/api/dts/framework.params.rst | 14 +
> doc/api/dts/framework.params.testpmd.rst | 6 +
> doc/api/dts/framework.params.types.rst | 6 +
> doc/api/dts/framework.parser.rst | 6 +
> .../framework.remote_session.dpdk_shell.rst | 6 +
> ...ote_session.interactive_remote_session.rst | 6 +
> ...ework.remote_session.interactive_shell.rst | 6 +
> .../framework.remote_session.python_shell.rst | 6 +
> ...ramework.remote_session.remote_session.rst | 6 +
> doc/api/dts/framework.remote_session.rst | 18 +
> .../framework.remote_session.ssh_session.rst | 6 +
> ...framework.remote_session.testpmd_shell.rst | 6 +
> doc/api/dts/framework.runner.rst | 6 +
> doc/api/dts/framework.settings.rst | 6 +
> doc/api/dts/framework.test_result.rst | 6 +
> doc/api/dts/framework.test_suite.rst | 6 +
> doc/api/dts/framework.testbed_model.cpu.rst | 6 +
> .../framework.testbed_model.linux_session.rst | 6 +
> doc/api/dts/framework.testbed_model.node.rst | 6 +
> .../framework.testbed_model.os_session.rst | 6 +
> doc/api/dts/framework.testbed_model.port.rst | 6 +
> .../framework.testbed_model.posix_session.rst | 6 +
> doc/api/dts/framework.testbed_model.rst | 26 +
> .../dts/framework.testbed_model.sut_node.rst | 6 +
> .../dts/framework.testbed_model.tg_node.rst | 6 +
> ..._generator.capturing_traffic_generator.rst | 6 +
> ...mework.testbed_model.traffic_generator.rst | 14 +
> ....testbed_model.traffic_generator.scapy.rst | 6 +
> ...el.traffic_generator.traffic_generator.rst | 6 +
> ...framework.testbed_model.virtual_device.rst | 6 +
> doc/api/dts/framework.utils.rst | 6 +
> doc/api/dts/index.rst | 43 ++
> doc/api/dts/meson.build | 31 ++
> doc/api/meson.build | 6 +-
> doc/guides/conf.py | 44 +-
> doc/guides/contributing/documentation.rst | 2 +
> doc/guides/contributing/patches.rst | 4 +
> doc/guides/tools/dts.rst | 39 +-
> doc/meson.build | 1 +
> dts/framework/params/__init__.py | 4 +-
> dts/framework/params/eal.py | 7 +-
> dts/framework/params/types.py | 3 +-
> dts/framework/parser.py | 4 +-
> .../interactive_remote_session.py | 3 +-
> dts/poetry.lock | 521 +++++++++++++++++-
> dts/pyproject.toml | 8 +
> 58 files changed, 1072 insertions(+), 23 deletions(-)
> create mode 100755 buildtools/get-dts-runtime-deps.py
> create mode 120000 doc/api/dts/conf_yaml_schema.json
> create mode 120000 doc/api/dts/custom.css
> create mode 100644 doc/api/dts/framework.config.rst
> create mode 100644 doc/api/dts/framework.config.types.rst
> create mode 100644 doc/api/dts/framework.exception.rst
> create mode 100644 doc/api/dts/framework.logger.rst
> create mode 100644 doc/api/dts/framework.params.eal.rst
> create mode 100644 doc/api/dts/framework.params.rst
> create mode 100644 doc/api/dts/framework.params.testpmd.rst
> create mode 100644 doc/api/dts/framework.params.types.rst
> create mode 100644 doc/api/dts/framework.parser.rst
> create mode 100644 doc/api/dts/framework.remote_session.dpdk_shell.rst
> create mode 100644 doc/api/dts/framework.remote_session.interactive_remote_session.rst
> create mode 100644 doc/api/dts/framework.remote_session.interactive_shell.rst
> create mode 100644 doc/api/dts/framework.remote_session.python_shell.rst
> create mode 100644 doc/api/dts/framework.remote_session.remote_session.rst
> create mode 100644 doc/api/dts/framework.remote_session.rst
> create mode 100644 doc/api/dts/framework.remote_session.ssh_session.rst
> create mode 100644 doc/api/dts/framework.remote_session.testpmd_shell.rst
> create mode 100644 doc/api/dts/framework.runner.rst
> create mode 100644 doc/api/dts/framework.settings.rst
> create mode 100644 doc/api/dts/framework.test_result.rst
> create mode 100644 doc/api/dts/framework.test_suite.rst
> create mode 100644 doc/api/dts/framework.testbed_model.cpu.rst
> create mode 100644 doc/api/dts/framework.testbed_model.linux_session.rst
> create mode 100644 doc/api/dts/framework.testbed_model.node.rst
> create mode 100644 doc/api/dts/framework.testbed_model.os_session.rst
> create mode 100644 doc/api/dts/framework.testbed_model.port.rst
> create mode 100644 doc/api/dts/framework.testbed_model.posix_session.rst
> create mode 100644 doc/api/dts/framework.testbed_model.rst
> create mode 100644 doc/api/dts/framework.testbed_model.sut_node.rst
> create mode 100644 doc/api/dts/framework.testbed_model.tg_node.rst
> create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.capturing_traffic_generator.rst
> create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.rst
> create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.scapy.rst
> create mode 100644 doc/api/dts/framework.testbed_model.traffic_generator.traffic_generator.rst
> create mode 100644 doc/api/dts/framework.testbed_model.virtual_device.rst
> create mode 100644 doc/api/dts/framework.utils.rst
> create mode 100644 doc/api/dts/index.rst
> create mode 100644 doc/api/dts/meson.build
>
Applied to dpdk-next-dts with slight modifications suggested by Thomas
in v19. Thanks, everyone, for the reviews.
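For anyone curious about the v13 build error and the "replace the or operator in third party types" patch, here is a minimal sketch of the problem (names are illustrative; in DTS the mocked type was paramiko's Transport under autodoc_mock_imports):

```python
from typing import Union

class MockedTransport:
    """Stand-in for a third-party type replaced by an autodoc mock."""

# Evaluating 'None | <mock instance>' at import time raises the error
# seen in v13, because neither operand supports the '|' operator:
# TypeError: unsupported operand type(s) for |: 'NoneType' and 'Transport'
try:
    None | MockedTransport()
except TypeError as err:
    print(type(err).__name__)  # TypeError

# Using typing.Union (or a string annotation) avoids evaluating '|'
# on the mocked object, so the module imports cleanly during the
# doc build:
class Session:
    transport: Union[MockedTransport, None] = None
```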