DPDK patches and discussions
* [RFC PATCH v1 0/4] dts: add dts api docs
@ 2023-03-23 10:40 Juraj Linkeš
  2023-03-23 10:40 ` [RFC PATCH v1 1/4] dts: code adjustments for sphinx Juraj Linkeš
                   ` (5 more replies)
  0 siblings, 6 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-03-23 10:40 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock
  Cc: dev, Juraj Linkeš

Augment the meson build system with DTS API doc generation. The API
docs are generated with Sphinx from the Python docstrings in DTS, which
use the Google docstring format [0].

The Sphinx configuration of the html guides is reused to preserve the
same style.

The build requires the same Python version and dependencies as DTS,
because Sphinx imports the Python modules. Dependencies are installed
using Poetry from the dts directory:

poetry install --with docs

After installing, enter the Poetry shell:

poetry shell

And then run the build:

ninja -C <meson_build_dir> doc

There's only one properly documented module that serves as a
demonstration of the style - framework.testbed_model.node.
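For reference, a minimal sketch of the Google docstring format (the
class and members below are illustrative, not the actual contents of
framework.testbed_model.node):

```python
class Node:
    """A testbed node the framework connects to.

    Attributes:
        name: The name of the node, as defined in the test run
            configuration.
    """

    name: str

    def __init__(self, name: str) -> None:
        """Store the node's metadata.

        Args:
            name: The name identifying the node.

        Raises:
            ValueError: If the name is empty.
        """
        if not name:
            raise ValueError("node name must not be empty")
        self.name = name
```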

I didn't figure out how to separate the DTS docs build from the rest of
the docs, which I think is required because of the different
dependencies. I thought the enable_docs option would do this, so I
added enable_dts_docs, but it doesn't seem to work. Requesting comments
on this.

[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes

Juraj Linkeš (4):
  dts: code adjustments for sphinx
  dts: add doc generation dependencies
  dts: add doc generation
  dts: format docstrings to google format

 doc/api/meson.build                           |   1 +
 doc/guides/conf.py                            |  22 +-
 doc/guides/meson.build                        |   1 +
 doc/guides/tools/dts.rst                      |  29 +
 doc/meson.build                               |   5 -
 dts/doc-index.rst                             |  20 +
 dts/framework/config/__init__.py              |  11 +
 .../{testbed_model/hw => config}/cpu.py       |  13 +
 dts/framework/dts.py                          |   8 +-
 dts/framework/remote_session/__init__.py      |   3 +-
 dts/framework/remote_session/linux_session.py |   2 +-
 dts/framework/remote_session/os_session.py    |  12 +-
 .../remote_session/remote/__init__.py         |  16 -
 .../{remote => }/remote_session.py            |   0
 .../{remote => }/ssh_session.py               |   0
 dts/framework/settings.py                     |  55 +-
 dts/framework/testbed_model/__init__.py       |  10 +-
 dts/framework/testbed_model/hw/__init__.py    |  27 -
 dts/framework/testbed_model/node.py           | 164 ++--
 dts/framework/testbed_model/sut_node.py       |   9 +-
 .../testbed_model/{hw => }/virtual_device.py  |   0
 dts/main.py                                   |   3 +-
 dts/meson.build                               |  50 ++
 dts/poetry.lock                               | 770 ++++++++++++++++--
 dts/pyproject.toml                            |   7 +
 dts/tests/TestSuite_hello_world.py            |   6 +-
 meson.build                                   |   6 +
 meson_options.txt                             |   2 +
 28 files changed, 1027 insertions(+), 225 deletions(-)
 create mode 100644 dts/doc-index.rst
 rename dts/framework/{testbed_model/hw => config}/cpu.py (95%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
 create mode 100644 dts/meson.build

-- 
2.30.2



* [RFC PATCH v1 1/4] dts: code adjustments for sphinx
  2023-03-23 10:40 [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
@ 2023-03-23 10:40 ` Juraj Linkeš
  2023-03-23 10:40 ` [RFC PATCH v1 2/4] dts: add doc generation dependencies Juraj Linkeš
                   ` (4 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-03-23 10:40 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock
  Cc: dev, Juraj Linkeš

sphinx-build only imports the Python modules when building the
documentation; it doesn't run DTS. This requires changes that make the
code importable without running it, meaning:
* argument parsing is properly guarded by the if __name__ == '__main__'
  block,
* the logger used by the DTS runner received the same treatment, so
  that importing the module doesn't create unnecessary log files,
* DTS uses the parsed arguments to construct an object holding global
  variables; the defaults for those globals had to be moved out of
  argument parsing into the dataclass defaults,
* importing the remote_session module from framework resulted in
  circular imports (one module importing another that imports it back);
  this is fixed by more granular imports.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/config/__init__.py              | 11 ++++
 .../{testbed_model/hw => config}/cpu.py       | 13 +++++
 dts/framework/dts.py                          |  8 ++-
 dts/framework/remote_session/__init__.py      |  3 +-
 dts/framework/remote_session/linux_session.py |  2 +-
 dts/framework/remote_session/os_session.py    | 12 +++-
 .../remote_session/remote/__init__.py         | 16 ------
 .../{remote => }/remote_session.py            |  0
 .../{remote => }/ssh_session.py               |  0
 dts/framework/settings.py                     | 55 ++++++++++---------
 dts/framework/testbed_model/__init__.py       | 10 +---
 dts/framework/testbed_model/hw/__init__.py    | 27 ---------
 dts/framework/testbed_model/node.py           | 12 ++--
 dts/framework/testbed_model/sut_node.py       |  9 ++-
 .../testbed_model/{hw => }/virtual_device.py  |  0
 dts/main.py                                   |  3 +-
 dts/tests/TestSuite_hello_world.py            |  6 +-
 17 files changed, 88 insertions(+), 99 deletions(-)
 rename dts/framework/{testbed_model/hw => config}/cpu.py (95%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index ebb0823ff5..293c4cb15b 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -7,6 +7,8 @@
 Yaml config parsing methods
 """
 
+# pylama:ignore=W0611
+
 import json
 import os.path
 import pathlib
@@ -19,6 +21,15 @@
 
 from framework.settings import SETTINGS
 
+from .cpu import (
+    LogicalCore,
+    LogicalCoreCount,
+    LogicalCoreCountFilter,
+    LogicalCoreList,
+    LogicalCoreListFilter,
+    lcore_filter,
+)
+
 
 class StrEnum(Enum):
     @staticmethod
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/config/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/config/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/config/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
             )
 
         return filtered_lcores
+
+
+def lcore_filter(
+    core_list: list[LogicalCore],
+    filter_specifier: LogicalCoreCount | LogicalCoreList,
+    ascending: bool,
+) -> LogicalCoreFilter:
+    if isinstance(filter_specifier, LogicalCoreList):
+        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+    elif isinstance(filter_specifier, LogicalCoreCount):
+        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+    else:
+        raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 0502284580..22a09b7e34 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,7 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
+import logging
 import sys
 
 from .config import CONFIGURATION, BuildTargetConfiguration, ExecutionConfiguration
@@ -12,7 +13,8 @@
 from .testbed_model import SutNode
 from .utils import check_dts_python_version
 
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG | logging.Logger = logging.getLogger("DTSRunner")
 result: DTSResult = DTSResult(dts_logger)
 
 
@@ -24,6 +26,10 @@ def run_all() -> None:
     global dts_logger
     global result
 
+    # create a regular DTS logger and create a new result with it
+    dts_logger = getLogger("DTSRunner")
+    result = DTSResult(dts_logger)
+
     # check the python version of the server that run dts
     check_dts_python_version()
 
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index ee221503df..17ca1459f7 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -17,7 +17,8 @@
 
 from .linux_session import LinuxSession
 from .os_session import OSSession
-from .remote import CommandResult, RemoteSession, SSHSession
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
 
 
 def create_session(
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/remote_session/linux_session.py
index a1e3bc3a92..c8ce5fe6da 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/remote_session/linux_session.py
@@ -2,8 +2,8 @@
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2023 University of New Hampshire
 
+from framework.config import LogicalCore
 from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
 from framework.utils import expand_range
 
 from .posix_session import PosixSession
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/remote_session/os_session.py
index 4c48ae2567..246f0358ea 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/remote_session/os_session.py
@@ -6,13 +6,13 @@
 from collections.abc import Iterable
 from pathlib import PurePath
 
-from framework.config import Architecture, NodeConfiguration
+from framework.config import Architecture, LogicalCore, NodeConfiguration
 from framework.logger import DTSLOG
 from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
 from framework.utils import EnvVarsDict, MesonArgs
 
-from .remote import CommandResult, RemoteSession, create_remote_session
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
 
 
 class OSSession(ABC):
@@ -173,3 +173,9 @@ def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
         if needed and mount the hugepages if needed.
         If force_first_numa is True, configure hugepages just on the first socket.
         """
+
+
+def create_remote_session(
+    node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> RemoteSession:
+    return SSHSession(node_config, name, logger)
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 8a1512210a..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-
-
-def create_remote_session(
-    node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
-    return SSHSession(node_config, name, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 71955f4581..144f9dea62 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
 import argparse
 import os
 from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Any, TypeVar
 
@@ -59,15 +59,18 @@ def __call__(
 
 @dataclass(slots=True, frozen=True)
 class _Settings:
-    config_file_path: str
-    output_dir: str
-    timeout: float
-    verbose: bool
-    skip_setup: bool
-    dpdk_tarball_path: Path
-    compile_timeout: float
-    test_cases: list
-    re_run: int
+    config_file_path: Path = Path(Path(__file__).parent.parent, "conf.yaml")
+    output_dir: str = "output"
+    timeout: float = 15
+    verbose: bool = False
+    skip_setup: bool = False
+    dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+    compile_timeout: float = 1200
+    test_cases: list[str] = field(default_factory=list)
+    re_run: int = 0
+
+
+SETTINGS: _Settings = _Settings()
 
 
 def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +84,8 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--config-file",
         action=_env_arg("DTS_CFG_FILE"),
-        default="conf.yaml",
+        default=SETTINGS.config_file_path,
+        type=Path,
         help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
         "and targets.",
     )
@@ -90,7 +94,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--output-dir",
         "--output",
         action=_env_arg("DTS_OUTPUT_DIR"),
-        default="output",
+        default=SETTINGS.output_dir,
         help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
     )
 
@@ -98,7 +102,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-t",
         "--timeout",
         action=_env_arg("DTS_TIMEOUT"),
-        default=15,
+        default=SETTINGS.timeout,
         type=float,
         help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
         "compiling DPDK.",
@@ -108,7 +112,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-v",
         "--verbose",
         action=_env_arg("DTS_VERBOSE"),
-        default="N",
+        default=SETTINGS.verbose,
         help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
         "to the console.",
     )
@@ -117,7 +121,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-s",
         "--skip-setup",
         action=_env_arg("DTS_SKIP_SETUP"),
-        default="N",
+        default=SETTINGS.skip_setup,
         help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
     )
 
@@ -125,7 +129,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--tarball",
         "--snapshot",
         action=_env_arg("DTS_DPDK_TARBALL"),
-        default="dpdk.tar.xz",
+        default=SETTINGS.dpdk_tarball_path,
         type=Path,
         help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball "
         "which will be used in testing.",
@@ -134,7 +138,7 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--compile-timeout",
         action=_env_arg("DTS_COMPILE_TIMEOUT"),
-        default=1200,
+        default=SETTINGS.compile_timeout,
         type=float,
         help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
     )
@@ -142,8 +146,9 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--test-cases",
         action=_env_arg("DTS_TESTCASES"),
-        default="",
-        help="[DTS_TESTCASES] Comma-separated list of test cases to execute. "
+        nargs="*",
+        default=SETTINGS.test_cases,
+        help="[DTS_TESTCASES] A list of test cases to execute. "
         "Unknown test cases will be silently ignored.",
     )
 
@@ -151,7 +156,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--re-run",
         "--re_run",
         action=_env_arg("DTS_RERUN"),
-        default=0,
+        default=SETTINGS.re_run,
         type=int,
         help="[DTS_RERUN] Re-run each test case the specified amount of times "
         "if a test failure occurs",
@@ -165,10 +170,11 @@ def _check_tarball_path(parsed_args: argparse.Namespace) -> None:
         raise ConfigurationError(f"DPDK tarball '{parsed_args.tarball}' doesn't exist.")
 
 
-def _get_settings() -> _Settings:
+def set_settings() -> None:
+    global SETTINGS
     parsed_args = _get_parser().parse_args()
     _check_tarball_path(parsed_args)
-    return _Settings(
+    SETTINGS = _Settings(
         config_file_path=parsed_args.config_file,
         output_dir=parsed_args.output_dir,
         timeout=parsed_args.timeout,
@@ -176,9 +182,6 @@ def _get_settings() -> _Settings:
         skip_setup=(parsed_args.skip_setup == "Y"),
         dpdk_tarball_path=parsed_args.tarball,
         compile_timeout=parsed_args.compile_timeout,
-        test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+        test_cases=parsed_args.test_cases,
         re_run=parsed_args.re_run,
     )
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index f54a947051..148f81993d 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,14 +9,6 @@
 
 # pylama:ignore=W0611
 
-from .hw import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-    VirtualDevice,
-    lcore_filter,
-)
 from .node import Node
 from .sut_node import SutNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
-    core_list: list[LogicalCore],
-    filter_specifier: LogicalCoreCount | LogicalCoreList,
-    ascending: bool,
-) -> LogicalCoreFilter:
-    if isinstance(filter_specifier, LogicalCoreList):
-        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
-    elif isinstance(filter_specifier, LogicalCoreCount):
-        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
-    else:
-        raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index d48fafe65d..90467981c3 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,19 +12,17 @@
 from framework.config import (
     BuildTargetConfiguration,
     ExecutionConfiguration,
-    NodeConfiguration,
-)
-from framework.logger import DTSLOG, getLogger
-from framework.remote_session import OSSession, create_session
-from framework.settings import SETTINGS
-
-from .hw import (
     LogicalCore,
     LogicalCoreCount,
     LogicalCoreList,
     LogicalCoreListFilter,
+    NodeConfiguration,
     lcore_filter,
 )
+from framework.logger import DTSLOG, getLogger
+from framework.remote_session import create_session
+from framework.remote_session.os_session import OSSession
+from framework.settings import SETTINGS
 
 
 class Node(object):
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 2b2b50d982..6db4a505bb 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -7,13 +7,18 @@
 import time
 from pathlib import PurePath
 
-from framework.config import BuildTargetConfiguration, NodeConfiguration
+from framework.config import (
+    BuildTargetConfiguration,
+    LogicalCoreCount,
+    LogicalCoreList,
+    NodeConfiguration,
+)
 from framework.remote_session import CommandResult, OSSession
 from framework.settings import SETTINGS
 from framework.utils import EnvVarsDict, MesonArgs
 
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
 from .node import Node
+from .virtual_device import VirtualDevice
 
 
 class SutNode(Node):
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..060ff1b19a 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,11 @@
 
 import logging
 
-from framework import dts
+from framework import dts, settings
 
 
 def main() -> None:
+    settings.set_settings()
     dts.run_all()
 
 
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 7e3d95c0cf..96c31a6c8c 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -6,12 +6,8 @@
 No other EAL parameters apart from cores are used.
 """
 
+from framework.config import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
 from framework.test_suite import TestSuite
-from framework.testbed_model import (
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-)
 
 
 class TestHelloWorld(TestSuite):
-- 
2.30.2



* [RFC PATCH v1 2/4] dts: add doc generation dependencies
  2023-03-23 10:40 [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
  2023-03-23 10:40 ` [RFC PATCH v1 1/4] dts: code adjustments for sphinx Juraj Linkeš
@ 2023-03-23 10:40 ` Juraj Linkeš
  2023-03-23 10:40 ` [RFC PATCH v1 3/4] dts: add doc generation Juraj Linkeš
                   ` (3 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-03-23 10:40 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock
  Cc: dev, Juraj Linkeš

Sphinx imports every Python module when generating documentation from
docstrings, meaning all DTS dependencies, including the required Python
version, must be satisfied.
By adding Sphinx to the DTS dependencies we make sure that the proper
Python version and dependencies are used when Sphinx is executed.
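The Poetry side of this can be expressed as an optional dependency
group in dts/pyproject.toml; a rough sketch (the group name and version
pins here are assumptions, the actual patch may differ):

```toml
[tool.poetry.group.docs]
optional = true

[tool.poetry.group.docs.dependencies]
sphinx = "*"
sphinx-rtd-theme = "*"
```

With the group marked optional, `poetry install --with docs` pulls in
the Sphinx dependencies only when building the docs, leaving the
regular DTS install untouched.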

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/poetry.lock    | 770 +++++++++++++++++++++++++++++++++++++++++----
 dts/pyproject.toml |   7 +
 2 files changed, 710 insertions(+), 67 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 0b2a007d4d..500f89dac1 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,24 +1,69 @@
+# This file is automatically @generated by Poetry and should not be changed by hand.
+
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+    {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
 [[package]]
 name = "attrs"
-version = "22.1.0"
+version = "22.2.0"
 description = "Classes Without Boilerplate"
 category = "main"
 optional = false
-python-versions = ">=3.5"
+python-versions = ">=3.6"
+files = [
+    {file = "attrs-22.2.0-py3-none-any.whl", hash = "sha256:29e95c7f6778868dbd49170f98f8818f78f3dc5e0e37c0b1f474e3561b240836"},
+    {file = "attrs-22.2.0.tar.gz", hash = "sha256:c9227bfc2f01993c03f68db37d1d15c9690188323c067c641f1a35ca58185f99"},
+]
 
 [package.extras]
-dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit", "cloudpickle"]
-docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
-tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "cloudpickle"]
-tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "cloudpickle"]
+cov = ["attrs[tests]", "coverage-enable-subprocess", "coverage[toml] (>=5.3)"]
+dev = ["attrs[docs,tests]"]
+docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope.interface"]
+tests = ["attrs[tests-no-zope]", "zope.interface"]
+tests-no-zope = ["cloudpickle", "cloudpickle", "hypothesis", "hypothesis", "mypy (>=0.971,<0.990)", "mypy (>=0.971,<0.990)", "pympler", "pympler", "pytest (>=4.3.0)", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-mypy-plugins", "pytest-xdist[psutil]", "pytest-xdist[psutil]"]
+
+[[package]]
+name = "babel"
+version = "2.12.1"
+description = "Internationalization utilities"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"},
+    {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"},
+]
 
 [[package]]
 name = "black"
-version = "22.10.0"
+version = "22.12.0"
 description = "The uncompromising code formatter."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "black-22.12.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9eedd20838bd5d75b80c9f5487dbcb06836a43833a37846cf1d8c1cc01cef59d"},
+    {file = "black-22.12.0-cp310-cp310-win_amd64.whl", hash = "sha256:159a46a4947f73387b4d83e87ea006dbb2337eab6c879620a3ba52699b1f4351"},
+    {file = "black-22.12.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d30b212bffeb1e252b31dd269dfae69dd17e06d92b87ad26e23890f3efea366f"},
+    {file = "black-22.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:7412e75863aa5c5411886804678b7d083c7c28421210180d67dfd8cf1221e1f4"},
+    {file = "black-22.12.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c116eed0efb9ff870ded8b62fe9f28dd61ef6e9ddd28d83d7d264a38417dcee2"},
+    {file = "black-22.12.0-cp37-cp37m-win_amd64.whl", hash = "sha256:1f58cbe16dfe8c12b7434e50ff889fa479072096d79f0a7f25e4ab8e94cd8350"},
+    {file = "black-22.12.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:77d86c9f3db9b1bf6761244bc0b3572a546f5fe37917a044e02f3166d5aafa7d"},
+    {file = "black-22.12.0-cp38-cp38-win_amd64.whl", hash = "sha256:82d9fe8fee3401e02e79767016b4907820a7dc28d70d137eb397b92ef3cc5bfc"},
+    {file = "black-22.12.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:101c69b23df9b44247bd88e1d7e90154336ac4992502d4197bdac35dd7ee3320"},
+    {file = "black-22.12.0-cp39-cp39-win_amd64.whl", hash = "sha256:559c7a1ba9a006226f09e4916060982fd27334ae1998e7a38b3f33a37f7a2148"},
+    {file = "black-22.12.0-py3-none-any.whl", hash = "sha256:436cc9167dd28040ad90d3b404aec22cedf24a6e4d7de221bec2730ec0c97bcf"},
+    {file = "black-22.12.0.tar.gz", hash = "sha256:229351e5a18ca30f447bf724d007f890f97e13af070bb6ad4c0a441cd7596a2f"},
+]
 
 [package.dependencies]
 click = ">=8.0.0"
@@ -33,6 +78,103 @@ d = ["aiohttp (>=3.7.4)"]
 jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
 uvloop = ["uvloop (>=0.15.2)"]
 
+[[package]]
+name = "certifi"
+version = "2022.12.7"
+description = "Python package for providing Mozilla's CA Bundle."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "certifi-2022.12.7-py3-none-any.whl", hash = "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"},
+    {file = "certifi-2022.12.7.tar.gz", hash = "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3"},
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.1.0"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+category = "dev"
+optional = false
+python-versions = ">=3.7.0"
+files = [
+    {file = "charset-normalizer-3.1.0.tar.gz", hash = "sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-win32.whl", hash = "sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-win32.whl", hash = "sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-win32.whl", hash = "sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-win32.whl", hash = "sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-win32.whl", hash = "sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b"},
+    {file = "charset_normalizer-3.1.0-py3-none-any.whl", hash = "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d"},
+]
+
 [[package]]
 name = "click"
 version = "8.1.3"
@@ -40,6 +182,10 @@ description = "Composable command line interface toolkit"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
+    {file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"},
+]
 
 [package.dependencies]
 colorama = {version = "*", markers = "platform_system == \"Windows\""}
@@ -51,20 +197,82 @@ description = "Cross-platform colored terminal text."
 category = "dev"
 optional = false
 python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
+files = [
+    {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
+    {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
+]
+
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+    {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+    {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+    {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+    {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
 
 [[package]]
 name = "isort"
-version = "5.10.1"
+version = "5.12.0"
 description = "A Python utility / library to sort Python imports."
 category = "dev"
 optional = false
-python-versions = ">=3.6.1,<4.0"
+python-versions = ">=3.8.0"
+files = [
+    {file = "isort-5.12.0-py3-none-any.whl", hash = "sha256:f84c2818376e66cf843d497486ea8fed8700b340f308f076c6fb1229dff318b6"},
+    {file = "isort-5.12.0.tar.gz", hash = "sha256:8bef7dde241278824a6d83f44a544709b065191b95b6e50894bdc722fcba0504"},
+]
 
 [package.extras]
-pipfile_deprecated_finder = ["pipreqs", "requirementslib"]
-requirements_deprecated_finder = ["pipreqs", "pip-api"]
-colors = ["colorama (>=0.4.3,<0.5.0)"]
+colors = ["colorama (>=0.4.3)"]
+pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"]
 plugins = ["setuptools"]
+requirements-deprecated-finder = ["pip-api", "pipreqs"]
+
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+    {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
 
 [[package]]
 name = "jsonpatch"
@@ -73,6 +281,10 @@ description = "Apply JSON-Patches (RFC 6902)"
 category = "main"
 optional = false
 python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "jsonpatch-1.32-py2.py3-none-any.whl", hash = "sha256:26ac385719ac9f54df8a2f0827bb8253aa3ea8ab7b3368457bcdb8c14595a397"},
+    {file = "jsonpatch-1.32.tar.gz", hash = "sha256:b6ddfe6c3db30d81a96aaeceb6baf916094ffa23d7dd5fa2c13e13f8b6e600c2"},
+]
 
 [package.dependencies]
 jsonpointer = ">=1.9"
@@ -84,14 +296,22 @@ description = "Identify specific nodes in a JSON document (RFC 6901)"
 category = "main"
 optional = false
 python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+    {file = "jsonpointer-2.3-py2.py3-none-any.whl", hash = "sha256:51801e558539b4e9cd268638c078c6c5746c9ac96bc38152d443400e4f3793e9"},
+    {file = "jsonpointer-2.3.tar.gz", hash = "sha256:97cba51526c829282218feb99dab1b1e6bdf8efd1c43dc9d57be093c0d69c99a"},
+]
 
 [[package]]
 name = "jsonschema"
-version = "4.17.0"
+version = "4.17.3"
 description = "An implementation of JSON Schema validation for Python"
 category = "main"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "jsonschema-4.17.3-py3-none-any.whl", hash = "sha256:a870ad254da1a8ca84b6a2905cac29d265f805acc57af304784962a2aa6508f6"},
+    {file = "jsonschema-4.17.3.tar.gz", hash = "sha256:0f864437ab8b6076ba6707453ef8f98a6a0d512a80e93f8abdb676f737ecb60d"},
+]
 
 [package.dependencies]
 attrs = ">=17.4.0"
@@ -101,6 +321,66 @@ pyrsistent = ">=0.14.0,<0.17.0 || >0.17.0,<0.17.1 || >0.17.1,<0.17.2 || >0.17.2"
 format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
 format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
 
+[[package]]
+name = "markupsafe"
+version = "2.1.2"
+description = "Safely add untrusted strings to HTML/XML markup."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:665a36ae6f8f20a4676b53224e33d456a6f5a72657d9c83c2aa00765072f31f7"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:340bea174e9761308703ae988e982005aedf427de816d1afe98147668cc03036"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22152d00bf4a9c7c83960521fc558f55a1adbc0631fbb00a9471e097b19d72e1"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28057e985dace2f478e042eaa15606c7efccb700797660629da387eb289b9323"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca244fa73f50a800cf8c3ebf7fd93149ec37f5cb9596aa8873ae2c1d23498601"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d9d971ec1e79906046aa3ca266de79eac42f1dbf3612a05dc9368125952bd1a1"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7e007132af78ea9df29495dbf7b5824cb71648d7133cf7848a2a5dd00d36f9ff"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7313ce6a199651c4ed9d7e4cfb4aa56fe923b1adf9af3b420ee14e6d9a73df65"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-win32.whl", hash = "sha256:c4a549890a45f57f1ebf99c067a4ad0cb423a05544accaf2b065246827ed9603"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:835fb5e38fd89328e9c81067fd642b3593c33e1e17e2fdbf77f5676abb14a156"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2ec4f2d48ae59bbb9d1f9d7efb9236ab81429a764dedca114f5fdabbc3788013"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:608e7073dfa9e38a85d38474c082d4281f4ce276ac0010224eaba11e929dd53a"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65608c35bfb8a76763f37036547f7adfd09270fbdbf96608be2bead319728fcd"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2bfb563d0211ce16b63c7cb9395d2c682a23187f54c3d79bfec33e6705473c6"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:da25303d91526aac3672ee6d49a2f3db2d9502a4a60b55519feb1a4c7714e07d"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:9cad97ab29dfc3f0249b483412c85c8ef4766d96cdf9dcf5a1e3caa3f3661cf1"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:085fd3201e7b12809f9e6e9bc1e5c96a368c8523fad5afb02afe3c051ae4afcc"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1bea30e9bf331f3fef67e0a3877b2288593c98a21ccb2cf29b74c581a4eb3af0"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-win32.whl", hash = "sha256:7df70907e00c970c60b9ef2938d894a9381f38e6b9db73c5be35e59d92e06625"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:e55e40ff0cc8cc5c07996915ad367fa47da6b3fc091fdadca7f5403239c5fec3"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a6e40afa7f45939ca356f348c8e23048e02cb109ced1eb8420961b2f40fb373a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cf877ab4ed6e302ec1d04952ca358b381a882fbd9d1b07cccbfd61783561f98a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63ba06c9941e46fa389d389644e2d8225e0e3e5ebcc4ff1ea8506dce646f8c8a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f1cd098434e83e656abf198f103a8207a8187c0fc110306691a2e94a78d0abb2"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:55f44b440d491028addb3b88f72207d71eeebfb7b5dbf0643f7c023ae1fba619"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:a6f2fcca746e8d5910e18782f976489939d54a91f9411c32051b4aab2bd7c513"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:0b462104ba25f1ac006fdab8b6a01ebbfbce9ed37fd37fd4acd70c67c973e460"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-win32.whl", hash = "sha256:7668b52e102d0ed87cb082380a7e2e1e78737ddecdde129acadb0eccc5423859"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6d6607f98fcf17e534162f0709aaad3ab7a96032723d8ac8750ffe17ae5a0666"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a806db027852538d2ad7555b203300173dd1b77ba116de92da9afbc3a3be3eed"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a4abaec6ca3ad8660690236d11bfe28dfd707778e2442b45addd2f086d6ef094"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f03a532d7dee1bed20bc4884194a16160a2de9ffc6354b3878ec9682bb623c54"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4cf06cdc1dda95223e9d2d3c58d3b178aa5dacb35ee7e3bbac10e4e1faacb419"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:22731d79ed2eb25059ae3df1dfc9cb1546691cc41f4e3130fe6bfbc3ecbbecfa"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f8ffb705ffcf5ddd0e80b65ddf7bed7ee4f5a441ea7d3419e861a12eaf41af58"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8db032bf0ce9022a8e41a22598eefc802314e81b879ae093f36ce9ddf39ab1ba"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2298c859cfc5463f1b64bd55cb3e602528db6fa0f3cfd568d3605c50678f8f03"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-win32.whl", hash = "sha256:50c42830a633fa0cf9e7d27664637532791bfc31c731a87b202d2d8ac40c3ea2"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:bb06feb762bade6bf3c8b844462274db0c76acc95c52abe8dbed28ae3d44a147"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:99625a92da8229df6d44335e6fcc558a5037dd0a760e11d84be2260e6f37002f"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8bca7e26c1dd751236cfb0c6c72d4ad61d986e9a41bbf76cb445f69488b2a2bd"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40627dcf047dadb22cd25ea7ecfe9cbf3bbbad0482ee5920b582f3809c97654f"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40dfd3fefbef579ee058f139733ac336312663c6706d1163b82b3003fb1925c4"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:090376d812fb6ac5f171e5938e82e7f2d7adc2b629101cec0db8b267815c85e2"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2e7821bffe00aa6bd07a23913b7f4e01328c3d5cc0b40b36c0bd81d362faeb65"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:c0a33bc9f02c2b17c3ea382f91b4db0e6cde90b63b296422a939886a7a80de1c"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:b8526c6d437855442cdd3d87eede9c425c4445ea011ca38d937db299382e6fa3"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-win32.whl", hash = "sha256:137678c63c977754abe9086a3ec011e8fd985ab90631145dfb9294ad09c102a7"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:0576fe974b40a400449768941d5d0858cc624e3249dfd1e0c33674e5c7ca7aed"},
+    {file = "MarkupSafe-2.1.2.tar.gz", hash = "sha256:abcabc8c2b26036d62d4c746381a6f7cf60aafcc653198ad678306986b09450d"},
+]
+
 [[package]]
 name = "mccabe"
 version = "0.7.0"
@@ -108,6 +388,10 @@ description = "McCabe checker, plugin for flake8"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"},
+    {file = "mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325"},
+]
 
 [[package]]
 name = "mypy"
@@ -116,6 +400,31 @@ description = "Optional static typing for Python"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "mypy-0.961-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:697540876638ce349b01b6786bc6094ccdaba88af446a9abb967293ce6eaa2b0"},
+    {file = "mypy-0.961-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b117650592e1782819829605a193360a08aa99f1fc23d1d71e1a75a142dc7e15"},
+    {file = "mypy-0.961-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bdd5ca340beffb8c44cb9dc26697628d1b88c6bddf5c2f6eb308c46f269bb6f3"},
+    {file = "mypy-0.961-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3e09f1f983a71d0672bbc97ae33ee3709d10c779beb613febc36805a6e28bb4e"},
+    {file = "mypy-0.961-cp310-cp310-win_amd64.whl", hash = "sha256:e999229b9f3198c0c880d5e269f9f8129c8862451ce53a011326cad38b9ccd24"},
+    {file = "mypy-0.961-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:b24be97351084b11582fef18d79004b3e4db572219deee0212078f7cf6352723"},
+    {file = "mypy-0.961-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f4a21d01fc0ba4e31d82f0fff195682e29f9401a8bdb7173891070eb260aeb3b"},
+    {file = "mypy-0.961-cp36-cp36m-win_amd64.whl", hash = "sha256:439c726a3b3da7ca84a0199a8ab444cd8896d95012c4a6c4a0d808e3147abf5d"},
+    {file = "mypy-0.961-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5a0b53747f713f490affdceef835d8f0cb7285187a6a44c33821b6d1f46ed813"},
+    {file = "mypy-0.961-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:0e9f70df36405c25cc530a86eeda1e0867863d9471fe76d1273c783df3d35c2e"},
+    {file = "mypy-0.961-cp37-cp37m-win_amd64.whl", hash = "sha256:b88f784e9e35dcaa075519096dc947a388319cb86811b6af621e3523980f1c8a"},
+    {file = "mypy-0.961-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:d5aaf1edaa7692490f72bdb9fbd941fbf2e201713523bdb3f4038be0af8846c6"},
+    {file = "mypy-0.961-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9f5f5a74085d9a81a1f9c78081d60a0040c3efb3f28e5c9912b900adf59a16e6"},
+    {file = "mypy-0.961-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f4b794db44168a4fc886e3450201365c9526a522c46ba089b55e1f11c163750d"},
+    {file = "mypy-0.961-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:64759a273d590040a592e0f4186539858c948302c653c2eac840c7a3cd29e51b"},
+    {file = "mypy-0.961-cp38-cp38-win_amd64.whl", hash = "sha256:63e85a03770ebf403291ec50097954cc5caf2a9205c888ce3a61bd3f82e17569"},
+    {file = "mypy-0.961-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5f1332964963d4832a94bebc10f13d3279be3ce8f6c64da563d6ee6e2eeda932"},
+    {file = "mypy-0.961-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:006be38474216b833eca29ff6b73e143386f352e10e9c2fbe76aa8549e5554f5"},
+    {file = "mypy-0.961-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9940e6916ed9371809b35b2154baf1f684acba935cd09928952310fbddaba648"},
+    {file = "mypy-0.961-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:a5ea0875a049de1b63b972456542f04643daf320d27dc592d7c3d9cd5d9bf950"},
+    {file = "mypy-0.961-cp39-cp39-win_amd64.whl", hash = "sha256:1ece702f29270ec6af25db8cf6185c04c02311c6bb21a69f423d40e527b75c56"},
+    {file = "mypy-0.961-py3-none-any.whl", hash = "sha256:03c6cc893e7563e7b2949b969e63f02c000b32502a1b4d1314cabe391aa87d66"},
+    {file = "mypy-0.961.tar.gz", hash = "sha256:f730d56cb924d371c26b8eaddeea3cc07d78ff51c521c6d04899ac6904b75492"},
+]
 
 [package.dependencies]
 mypy-extensions = ">=0.4.3"
@@ -129,19 +438,39 @@ reports = ["lxml"]
 
 [[package]]
 name = "mypy-extensions"
-version = "0.4.3"
-description = "Experimental type system extensions for programs checked with the mypy typechecker."
+version = "1.0.0"
+description = "Type system extensions for programs checked with the mypy type checker."
 category = "dev"
 optional = false
-python-versions = "*"
+python-versions = ">=3.5"
+files = [
+    {file = "mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d"},
+    {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
+]
+
+[[package]]
+name = "packaging"
+version = "23.0"
+description = "Core utilities for Python packages"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "packaging-23.0-py3-none-any.whl", hash = "sha256:714ac14496c3e68c99c29b00845f7a2b85f3bb6f1078fd9f72fd20f0570002b2"},
+    {file = "packaging-23.0.tar.gz", hash = "sha256:b6ad297f8907de0fa2fe1ccbd26fdaf387f5f47c7275fedf8cce89f99446cf97"},
+]
 
 [[package]]
 name = "pathspec"
-version = "0.10.1"
+version = "0.11.1"
 description = "Utility library for gitignore style pattern matching of file paths."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pathspec-0.11.1-py3-none-any.whl", hash = "sha256:d8af70af76652554bd134c22b3e8a1cc46ed7d91edcdd721ef1a0c51a84a5293"},
+    {file = "pathspec-0.11.1.tar.gz", hash = "sha256:2798de800fa92780e33acca925945e9a19a133b715067cf165b8866c15a31687"},
+]
 
 [[package]]
 name = "pexpect"
@@ -150,21 +479,29 @@ description = "Pexpect allows easy control of interactive console applications."
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
+    {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
+]
 
 [package.dependencies]
 ptyprocess = ">=0.5"
 
 [[package]]
 name = "platformdirs"
-version = "2.5.2"
-description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
+version = "3.1.1"
+description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "platformdirs-3.1.1-py3-none-any.whl", hash = "sha256:e5986afb596e4bb5bde29a79ac9061aa955b94fca2399b7aaac4090860920dd8"},
+    {file = "platformdirs-3.1.1.tar.gz", hash = "sha256:024996549ee88ec1a9aa99ff7f8fc819bb59e2c3477b410d90a16d32d6e707aa"},
+]
 
 [package.extras]
-docs = ["furo (>=2021.7.5b38)", "proselint (>=0.10.2)", "sphinx-autodoc-typehints (>=1.12)", "sphinx (>=4)"]
-test = ["appdirs (==1.4.4)", "pytest-cov (>=2.7)", "pytest-mock (>=3.6)", "pytest (>=6)"]
+docs = ["furo (>=2022.12.7)", "proselint (>=0.13)", "sphinx (>=6.1.3)", "sphinx-autodoc-typehints (>=1.22,!=1.23.4)"]
+test = ["appdirs (==1.4.4)", "covdefaults (>=2.2.2)", "pytest (>=7.2.1)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"]
 
 [[package]]
 name = "ptyprocess"
@@ -173,28 +510,40 @@ description = "Run a subprocess in a pseudo terminal"
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35"},
+    {file = "ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220"},
+]
 
 [[package]]
 name = "pycodestyle"
-version = "2.9.1"
+version = "2.10.0"
 description = "Python style guide checker"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pycodestyle-2.10.0-py2.py3-none-any.whl", hash = "sha256:8a4eaf0d0495c7395bdab3589ac2db602797d76207242c17d470186815706610"},
+    {file = "pycodestyle-2.10.0.tar.gz", hash = "sha256:347187bdb476329d98f695c213d7295a846d1152ff4fe9bacb8a9590b8ee7053"},
+]
 
 [[package]]
 name = "pydocstyle"
-version = "6.1.1"
+version = "6.3.0"
 description = "Python docstring style checker"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
+    {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+]
 
 [package.dependencies]
-snowballstemmer = "*"
+snowballstemmer = ">=2.2.0"
 
 [package.extras]
-toml = ["toml"]
+toml = ["tomli (>=1.2.3)"]
 
 [[package]]
 name = "pyflakes"
@@ -203,6 +552,25 @@ description = "passive checker of Python programs"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pyflakes-2.5.0-py2.py3-none-any.whl", hash = "sha256:4579f67d887f804e67edb544428f264b7b24f435b263c4614f384135cea553d2"},
+    {file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
+]
+
+[[package]]
+name = "pygments"
+version = "2.14.0"
+description = "Pygments is a syntax highlighting package written in Python."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "Pygments-2.14.0-py3-none-any.whl", hash = "sha256:fa7bd7bd2771287c0de303af8bfdfc731f51bd2c6a47ab69d117138893b82717"},
+    {file = "Pygments-2.14.0.tar.gz", hash = "sha256:b3ed06a9e8ac9a9aae5a6f5dbe78a8a58655d17b43b93c078f094ddc476ae297"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
 
 [[package]]
 name = "pylama"
@@ -211,6 +579,10 @@ description = "Code audit tool for python"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pylama-8.4.1-py3-none-any.whl", hash = "sha256:5bbdbf5b620aba7206d688ed9fc917ecd3d73e15ec1a89647037a09fa3a86e60"},
+    {file = "pylama-8.4.1.tar.gz", hash = "sha256:2d4f7aecfb5b7466216d48610c7d6bad1c3990c29cdd392ad08259b161e486f6"},
+]
 
 [package.dependencies]
 mccabe = ">=0.7.0"
@@ -219,22 +591,51 @@ pydocstyle = ">=6.1.1"
 pyflakes = ">=2.5.0"
 
 [package.extras]
-all = ["pylint", "eradicate", "radon", "mypy", "vulture"]
+all = ["eradicate", "mypy", "pylint", "radon", "vulture"]
 eradicate = ["eradicate"]
 mypy = ["mypy"]
 pylint = ["pylint"]
 radon = ["radon"]
-tests = ["pytest (>=7.1.2)", "pytest-mypy", "eradicate (>=2.0.0)", "radon (>=5.1.0)", "mypy", "pylint (>=2.11.1)", "pylama-quotes", "toml", "vulture", "types-setuptools", "types-toml"]
+tests = ["eradicate (>=2.0.0)", "mypy", "pylama-quotes", "pylint (>=2.11.1)", "pytest (>=7.1.2)", "pytest-mypy", "radon (>=5.1.0)", "toml", "types-setuptools", "types-toml", "vulture"]
 toml = ["toml (>=0.10.2)"]
 vulture = ["vulture"]
 
 [[package]]
 name = "pyrsistent"
-version = "0.19.1"
+version = "0.19.3"
 description = "Persistent/Functional/Immutable data structures"
 category = "main"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pyrsistent-0.19.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:20460ac0ea439a3e79caa1dbd560344b64ed75e85d8703943e0b66c2a6150e4a"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4c18264cb84b5e68e7085a43723f9e4c1fd1d935ab240ce02c0324a8e01ccb64"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b774f9288dda8d425adb6544e5903f1fb6c273ab3128a355c6b972b7df39dcf"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-win32.whl", hash = "sha256:5a474fb80f5e0d6c9394d8db0fc19e90fa540b82ee52dba7d246a7791712f74a"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-win_amd64.whl", hash = "sha256:49c32f216c17148695ca0e02a5c521e28a4ee6c5089f97e34fe24163113722da"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f0774bf48631f3a20471dd7c5989657b639fd2d285b861237ea9e82c36a415a9"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ab2204234c0ecd8b9368dbd6a53e83c3d4f3cab10ecaf6d0e772f456c442393"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e42296a09e83028b3476f7073fcb69ffebac0e66dbbfd1bd847d61f74db30f19"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-win32.whl", hash = "sha256:64220c429e42a7150f4bfd280f6f4bb2850f95956bde93c6fda1b70507af6ef3"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-win_amd64.whl", hash = "sha256:016ad1afadf318eb7911baa24b049909f7f3bb2c5b1ed7b6a8f21db21ea3faa8"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c4db1bd596fefd66b296a3d5d943c94f4fac5bcd13e99bffe2ba6a759d959a28"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aeda827381f5e5d65cced3024126529ddc4289d944f75e090572c77ceb19adbf"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:42ac0b2f44607eb92ae88609eda931a4f0dfa03038c44c772e07f43e738bcac9"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-win32.whl", hash = "sha256:e8f2b814a3dc6225964fa03d8582c6e0b6650d68a232df41e3cc1b66a5d2f8d1"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-win_amd64.whl", hash = "sha256:c9bb60a40a0ab9aba40a59f68214eed5a29c6274c83b2cc206a359c4a89fa41b"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a2471f3f8693101975b1ff85ffd19bb7ca7dd7c38f8a81701f67d6b4f97b87d8"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc5d149f31706762c1f8bda2e8c4f8fead6e80312e3692619a75301d3dbb819a"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3311cb4237a341aa52ab8448c27e3a9931e2ee09561ad150ba94e4cfd3fc888c"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-win32.whl", hash = "sha256:f0e7c4b2f77593871e918be000b96c8107da48444d57005b6a6bc61fb4331b2c"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-win_amd64.whl", hash = "sha256:c147257a92374fde8498491f53ffa8f4822cd70c0d85037e09028e478cababb7"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b735e538f74ec31378f5a1e3886a26d2ca6351106b4dfde376a26fc32a044edc"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99abb85579e2165bd8522f0c0138864da97847875ecbd45f3e7e2af569bfc6f2"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3a8cb235fa6d3fd7aae6a4f1429bbb1fec1577d978098da1252f0489937786f3"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-win32.whl", hash = "sha256:c74bed51f9b41c48366a286395c67f4e894374306b197e62810e0fdaf2364da2"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-win_amd64.whl", hash = "sha256:878433581fc23e906d947a6814336eee031a00e6defba224234169ae3d3d6a98"},
+    {file = "pyrsistent-0.19.3-py3-none-any.whl", hash = "sha256:ccf0d6bd208f8111179f0c26fdf84ed7c3891982f2edaeae7422575f47e66b64"},
+    {file = "pyrsistent-0.19.3.tar.gz", hash = "sha256:1a2994773706bbb4995c31a97bc94f1418314923bd1048c6d964837040376440"},
+]
 
 [[package]]
 name = "pyyaml"
@@ -243,6 +644,70 @@ description = "YAML parser and emitter for Python"
 category = "main"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "PyYAML-6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4db7c7aef085872ef65a8fd7d6d09a14ae91f691dec3e87ee5ee0539d516f53"},
+    {file = "PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9df7ed3b3d2e0ecfe09e14741b857df43adb5a3ddadc919a2d94fbdf78fea53c"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77f396e6ef4c73fdc33a9157446466f1cff553d979bd00ecb64385760c6babdc"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a80a78046a72361de73f8f395f1f1e49f956c6be882eed58505a15f3e430962b"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f84fbc98b019fef2ee9a1cb3ce93e3187a6df0b2538a651bfb890254ba9f90b5"},
+    {file = "PyYAML-6.0-cp310-cp310-win32.whl", hash = "sha256:2cd5df3de48857ed0544b34e2d40e9fac445930039f3cfe4bcc592a1f836d513"},
+    {file = "PyYAML-6.0-cp310-cp310-win_amd64.whl", hash = "sha256:daf496c58a8c52083df09b80c860005194014c3698698d1a57cbcfa182142a3a"},
+    {file = "PyYAML-6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d4b0ba9512519522b118090257be113b9468d804b19d63c71dbcf4a48fa32358"},
+    {file = "PyYAML-6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:81957921f441d50af23654aa6c5e5eaf9b06aba7f0a19c18a538dc7ef291c5a1"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:afa17f5bc4d1b10afd4466fd3a44dc0e245382deca5b3c353d8b757f9e3ecb8d"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dbad0e9d368bb989f4515da330b88a057617d16b6a8245084f1b05400f24609f"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:432557aa2c09802be39460360ddffd48156e30721f5e8d917f01d31694216782"},
+    {file = "PyYAML-6.0-cp311-cp311-win32.whl", hash = "sha256:bfaef573a63ba8923503d27530362590ff4f576c626d86a9fed95822a8255fd7"},
+    {file = "PyYAML-6.0-cp311-cp311-win_amd64.whl", hash = "sha256:01b45c0191e6d66c470b6cf1b9531a771a83c1c4208272ead47a3ae4f2f603bf"},
+    {file = "PyYAML-6.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:897b80890765f037df3403d22bab41627ca8811ae55e9a722fd0392850ec4d86"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50602afada6d6cbfad699b0c7bb50d5ccffa7e46a3d738092afddc1f9758427f"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:48c346915c114f5fdb3ead70312bd042a953a8ce5c7106d5bfb1a5254e47da92"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:98c4d36e99714e55cfbaaee6dd5badbc9a1ec339ebfc3b1f52e293aee6bb71a4"},
+    {file = "PyYAML-6.0-cp36-cp36m-win32.whl", hash = "sha256:0283c35a6a9fbf047493e3a0ce8d79ef5030852c51e9d911a27badfde0605293"},
+    {file = "PyYAML-6.0-cp36-cp36m-win_amd64.whl", hash = "sha256:07751360502caac1c067a8132d150cf3d61339af5691fe9e87803040dbc5db57"},
+    {file = "PyYAML-6.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:819b3830a1543db06c4d4b865e70ded25be52a2e0631ccd2f6a47a2822f2fd7c"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:473f9edb243cb1935ab5a084eb238d842fb8f404ed2193a915d1784b5a6b5fc0"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0ce82d761c532fe4ec3f87fc45688bdd3a4c1dc5e0b4a19814b9009a29baefd4"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:231710d57adfd809ef5d34183b8ed1eeae3f76459c18fb4a0b373ad56bedcdd9"},
+    {file = "PyYAML-6.0-cp37-cp37m-win32.whl", hash = "sha256:c5687b8d43cf58545ade1fe3e055f70eac7a5a1a0bf42824308d868289a95737"},
+    {file = "PyYAML-6.0-cp37-cp37m-win_amd64.whl", hash = "sha256:d15a181d1ecd0d4270dc32edb46f7cb7733c7c508857278d3d378d14d606db2d"},
+    {file = "PyYAML-6.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0b4624f379dab24d3725ffde76559cff63d9ec94e1736b556dacdfebe5ab6d4b"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:213c60cd50106436cc818accf5baa1aba61c0189ff610f64f4a3e8c6726218ba"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9fa600030013c4de8165339db93d182b9431076eb98eb40ee068700c9c813e34"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:277a0ef2981ca40581a47093e9e2d13b3f1fbbeffae064c1d21bfceba2030287"},
+    {file = "PyYAML-6.0-cp38-cp38-win32.whl", hash = "sha256:d4eccecf9adf6fbcc6861a38015c2a64f38b9d94838ac1810a9023a0609e1b78"},
+    {file = "PyYAML-6.0-cp38-cp38-win_amd64.whl", hash = "sha256:1e4747bc279b4f613a09eb64bba2ba602d8a6664c6ce6396a4d0cd413a50ce07"},
+    {file = "PyYAML-6.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:055d937d65826939cb044fc8c9b08889e8c743fdc6a32b33e2390f66013e449b"},
+    {file = "PyYAML-6.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e61ceaab6f49fb8bdfaa0f92c4b57bcfbea54c09277b1b4f7ac376bfb7a7c174"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d67d839ede4ed1b28a4e8909735fc992a923cdb84e618544973d7dfc71540803"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cba8c411ef271aa037d7357a2bc8f9ee8b58b9965831d9e51baf703280dc73d3"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:40527857252b61eacd1d9af500c3337ba8deb8fc298940291486c465c8b46ec0"},
+    {file = "PyYAML-6.0-cp39-cp39-win32.whl", hash = "sha256:b5b9eccad747aabaaffbc6064800670f0c297e52c12754eb1d976c57e4f74dcb"},
+    {file = "PyYAML-6.0-cp39-cp39-win_amd64.whl", hash = "sha256:b3d267842bf12586ba6c734f89d1f5b871df0273157918b0ccefa29deb05c21c"},
+    {file = "PyYAML-6.0.tar.gz", hash = "sha256:68fb519c14306fec9720a2a5b45bc9f0c8d1b9c72adf45c37baedfcd949c35a2"},
+]
+
+[[package]]
+name = "requests"
+version = "2.28.2"
+description = "Python HTTP for Humans."
+category = "dev"
+optional = false
+python-versions = ">=3.7, <4"
+files = [
+    {file = "requests-2.28.2-py3-none-any.whl", hash = "sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa"},
+    {file = "requests-2.28.2.tar.gz", hash = "sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<1.27"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
 [[package]]
 name = "snowballstemmer"
@@ -251,6 +716,175 @@ description = "This package provides 29 stemmers for 28 languages generated from
 category = "dev"
 optional = false
 python-versions = "*"
+files = [
+    {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"},
+    {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
+]
+
+[[package]]
+name = "sphinx"
+version = "6.1.3"
+description = "Python documentation generator"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "Sphinx-6.1.3.tar.gz", hash = "sha256:0dac3b698538ffef41716cf97ba26c1c7788dba73ce6f150c1ff5b4720786dd2"},
+    {file = "sphinx-6.1.3-py3-none-any.whl", hash = "sha256:807d1cb3d6be87eb78a381c3e70ebd8d346b9a25f3753e9947e866b2786865fc"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.0"
+description = "Read the Docs theme for Sphinx"
+category = "dev"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+    {file = "sphinx_rtd_theme-1.2.0-py2.py3-none-any.whl", hash = "sha256:f823f7e71890abe0ac6aaa6013361ea2696fc8d3e1fa798f463e82bdb77eeff2"},
+    {file = "sphinx_rtd_theme-1.2.0.tar.gz", hash = "sha256:a0d8bd1a2ed52e0b338cbe19c4b2eef3c5e7a048769753dac6a9f059c7b641b8"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = {version = ">=2.0.0,<3.0.0 || >3.0.0", markers = "python_version > \"3\""}
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.4"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-applehelp-1.0.4.tar.gz", hash = "sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e"},
+    {file = "sphinxcontrib_applehelp-1.0.4-py3-none-any.whl", hash = "sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.2"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"},
+    {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.1"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-htmlhelp-2.0.1.tar.gz", hash = "sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff"},
+    {file = "sphinxcontrib_htmlhelp-2.0.1-py3-none-any.whl", hash = "sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+category = "dev"
+optional = false
+python-versions = ">=2.7"
+files = [
+    {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+    {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+    {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.3"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"},
+    {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.5"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"},
+    {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
 
 [[package]]
 name = "toml"
@@ -259,6 +893,10 @@ description = "Python Library for Tom's Obvious, Minimal Language"
 category = "dev"
 optional = false
 python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
+files = [
+    {file = "toml-0.10.2-py2.py3-none-any.whl", hash = "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b"},
+    {file = "toml-0.10.2.tar.gz", hash = "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"},
+]
 
 [[package]]
 name = "tomli"
@@ -267,22 +905,51 @@ description = "A lil' TOML parser"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"},
+    {file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"},
+]
 
 [[package]]
 name = "types-pyyaml"
-version = "6.0.12.1"
+version = "6.0.12.8"
 description = "Typing stubs for PyYAML"
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "types-PyYAML-6.0.12.8.tar.gz", hash = "sha256:19304869a89d49af00be681e7b267414df213f4eb89634c4495fa62e8f942b9f"},
+    {file = "types_PyYAML-6.0.12.8-py3-none-any.whl", hash = "sha256:5314a4b2580999b2ea06b2e5f9a7763d860d6e09cdf21c0e9561daa9cbd60178"},
+]
 
 [[package]]
 name = "typing-extensions"
-version = "4.4.0"
+version = "4.5.0"
 description = "Backported and Experimental Type Hints for Python 3.7+"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "typing_extensions-4.5.0-py3-none-any.whl", hash = "sha256:fb33085c39dd998ac16d1431ebc293a8b3eedd00fd4a32de0ff79002c19511b4"},
+    {file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"},
+]
+
+[[package]]
+name = "urllib3"
+version = "1.26.15"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
+files = [
+    {file = "urllib3-1.26.15-py2.py3-none-any.whl", hash = "sha256:aa751d169e23c7479ce47a0cb0da579e3ede798f994f5816a74e4f4500dcea42"},
+    {file = "urllib3-1.26.15.tar.gz", hash = "sha256:8a388717b9476f934a21484e8c8e61875ab60644d29b9b39e11e4b9dc1c6b305"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)", "brotlipy (>=0.6.0)"]
+secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)", "urllib3-secure-extra"]
+socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
 
 [[package]]
 name = "warlock"
@@ -291,47 +958,16 @@ description = "Python object model built on JSON schema and JSON patch."
 category = "main"
 optional = false
 python-versions = ">=3.7,<4.0"
+files = [
+    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
+    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
+]
 
 [package.dependencies]
 jsonpatch = ">=1,<2"
 jsonschema = ">=4,<5"
 
 [metadata]
-lock-version = "1.1"
+lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "a0f040b07fc6ce4deb0be078b9a88c2a465cb6bccb9e260a67e92c2403e2319f"
-
-[metadata.files]
-attrs = []
-black = []
-click = []
-colorama = []
-isort = []
-jsonpatch = []
-jsonpointer = []
-jsonschema = []
-mccabe = []
-mypy = []
-mypy-extensions = []
-pathspec = []
-pexpect = [
-    {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
-    {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
-]
-platformdirs = [
-    {file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
-    {file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
-]
-ptyprocess = []
-pycodestyle = []
-pydocstyle = []
-pyflakes = []
-pylama = []
-pyrsistent = []
-pyyaml = []
-snowballstemmer = []
-toml = []
-tomli = []
-types-pyyaml = []
-typing-extensions = []
-warlock = []
+content-hash = "b3f428e987713d7875434c4b43cadadcb7d77dd3d62fd6855fb8e77ec946f082"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index a136c91e5e..c0fe323272 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -22,6 +22,13 @@ pylama = "^8.4.1"
 pyflakes = "2.5.0"
 toml = "^0.10.2"
 
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+Sphinx = "^6.1.3"
+sphinx-rtd-theme = "^1.2.0"
+
 [tool.poetry.scripts]
 dts = "main:main"
 
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v1 3/4] dts: add doc generation
  2023-03-23 10:40 [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
  2023-03-23 10:40 ` [RFC PATCH v1 1/4] dts: code adjustments for sphinx Juraj Linkeš
  2023-03-23 10:40 ` [RFC PATCH v1 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2023-03-23 10:40 ` Juraj Linkeš
  2023-03-23 10:40 ` [RFC PATCH v1 4/4] dts: format docstrigs to google format Juraj Linkeš
                   ` (2 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-03-23 10:40 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock
  Cc: dev, Juraj Linkeš

The tool used to generate developer docs is Sphinx, which is already
used in DPDK. The configuration is kept the same to preserve the style.

Sphinx generates the documentation from Python docstrings. The docstring
format most suitable for DTS seems to be the Google format [0], which
requires the sphinx.ext.napoleon extension.

There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
  code.
* The same Python packages as DTS, for the same reason.

[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
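
For illustration, a Google-format docstring looks like the sketch below
(the function and its parameters are made up for this example, not part
of the DTS framework); sphinx.ext.napoleon converts the Args/Returns
sections into proper Sphinx field lists when the docs are built:

```python
def set_forwarding_mode(mode: str, verify: bool = True) -> bool:
    """Set the forwarding mode of a traffic generator.

    Args:
        mode: The forwarding mode to set, e.g. "io" or "mac".
        verify: If :data:`True`, check that the mode was actually set.

    Returns:
        :data:`True` if the mode is supported, :data:`False` otherwise.
    """
    # napoleon parses the sections above at doc-build time;
    # at runtime this is just an ordinary docstring.
    return mode in ("io", "mac", "rxonly")


print("Args:" in set_forwarding_mode.__doc__)  # prints True
```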

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 doc/api/meson.build      |  1 +
 doc/guides/conf.py       | 22 ++++++++++++++----
 doc/guides/meson.build   |  1 +
 doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
 doc/meson.build          |  5 ----
 dts/doc-index.rst        | 20 ++++++++++++++++
 dts/meson.build          | 50 ++++++++++++++++++++++++++++++++++++++++
 meson.build              |  6 +++++
 meson_options.txt        |  2 ++
 9 files changed, 126 insertions(+), 10 deletions(-)
 create mode 100644 dts/doc-index.rst
 create mode 100644 dts/meson.build

diff --git a/doc/api/meson.build b/doc/api/meson.build
index 2876a78a7e..ee70f09ef7 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
 
+api_build_dir = meson.current_build_dir()
 doxygen = find_program('doxygen', required: get_option('enable_docs'))
 
 if not doxygen.found()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index a55ce38800..04c842b67a 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
 from sphinx import __version__ as sphinx_version
 from os import listdir
 from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
 from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
 
 import configparser
 
@@ -24,6 +23,19 @@
           file=stderr)
     pass
 
+extensions = ['sphinx.ext.napoleon']
+
+# Python docstring options
+autodoc_member_order = 'bysource'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_use_ivar = True
+napoleon_use_rtype = False
+add_module_names = False
+toc_object_entries_show_parents = 'hide'
+
 stop_on_error = ('-W' in argv)
 
 project = 'Data Plane Development Kit'
@@ -35,8 +47,8 @@
 html_show_copyright = False
 highlight_language = 'none'
 
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+path.append(environ.setdefault('DTS_ROOT', '.'))
+version = environ.setdefault('DPDK_VERSION', "None")
 
 master_doc = 'index'
 
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..fed361060f 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Intel Corporation
 
+guides_source_dir = meson.current_source_dir()
 sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
 
 if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index ebd6dceb6a..332e2187a6 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -282,3 +282,32 @@ There are three tools used in DTS to help with code checking, style and formatti
 These three tools are all used in ``devtools/dts-check-format.sh``,
 the DTS code check and format script.
 Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+   .. code-block:: console
+
+      poetry install --with docs
+      poetry shell
+
+
+Build commands
+~~~~~~~~~~~~~~
+
+The documentation is built using the standard DPDK build system.
+
+After entering Poetry's shell, build the documentation with:
+
+   .. code-block:: console
+
+      ninja -C build doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+   Make sure to fix any Sphinx warnings when adding or updating docstrings.
diff --git a/doc/meson.build b/doc/meson.build
index 6f74706aa2..5e08bb7b80 100644
--- a/doc/meson.build
+++ b/doc/meson.build
@@ -1,11 +1,6 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
 
-doc_targets = []
-doc_target_names = []
-subdir('api')
-subdir('guides')
-
 if doc_targets.length() == 0
     message = 'No docs targets found'
 else
diff --git a/dts/doc-index.rst b/dts/doc-index.rst
new file mode 100644
index 0000000000..10151c6851
--- /dev/null
+++ b/dts/doc-index.rst
@@ -0,0 +1,20 @@
+.. DPDK Test Suite documentation master file, created by
+   sphinx-quickstart on Tue Mar 14 12:23:52 2023.
+   You can adapt this file completely to your liking, but it should at least
+   contain the root `toctree` directive.
+
+Welcome to DPDK Test Suite's documentation!
+===========================================
+
+.. toctree::
+   :maxdepth: 4
+   :caption: Contents:
+
+   modules
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..6ea7887f4b
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,50 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
+sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
+
+if not sphinx.found() or not sphinx_apidoc.found()
+    subdir_done()
+endif
+
+dts_api_framework_dir = join_paths(meson.current_source_dir(), 'framework')
+dts_api_build_dir = join_paths(api_build_dir, 'dts')
+dts_api_src = custom_target('dts_api_src',
+        output: 'modules.rst',
+        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
+            sphinx_apidoc, '--append-syspath', '--force',
+            '--module-first', '--separate',
+            '--doc-project', 'DTS', '-V', meson.project_version(),
+            '-o', dts_api_build_dir,
+            dts_api_framework_dir],
+        build_by_default: get_option('enable_docs'))
+doc_targets += dts_api_src
+doc_target_names += 'DTS_API_sources'
+
+cp = find_program('cp', required: get_option('enable_docs'))
+cp_index = custom_target('cp_index',
+        input: 'doc-index.rst',
+        output: 'index.rst',
+        depends: dts_api_src,
+        command: [cp, '@INPUT@', join_paths(dts_api_build_dir, 'index.rst')],
+        build_by_default: get_option('enable_docs'))
+doc_targets += cp_index
+doc_target_names += 'DTS_API_cp_index'
+
+extra_sphinx_args = ['-a', '-c', guides_source_dir]
+if get_option('werror')
+    extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
+dts_api_html = custom_target('dts_api_html',
+        output: 'html',
+        depends: cp_index,
+        command: ['DTS_ROOT=@0@'.format(meson.current_source_dir()),
+            sphinx_wrapper, sphinx, meson.project_version(),
+            dts_api_build_dir, dts_api_build_dir, extra_sphinx_args],
+        build_by_default: get_option('enable_docs'),
+        install: get_option('enable_docs'),
+        install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_html'
diff --git a/meson.build b/meson.build
index f91d652bc5..48a4e12402 100644
--- a/meson.build
+++ b/meson.build
@@ -82,6 +82,12 @@ subdir('drivers')
 subdir('usertools')
 subdir('app')
 
+# define doc targets
+doc_targets = []
+doc_target_names = []
+subdir('doc/api')
+subdir('doc/guides')
+subdir('dts')
 # build docs
 subdir('doc')
 
diff --git a/meson_options.txt b/meson_options.txt
index 82c8297065..415b49fc78 100644
--- a/meson_options.txt
+++ b/meson_options.txt
@@ -16,6 +16,8 @@ option('drivers_install_subdir', type: 'string', value: 'dpdk/pmds-<VERSION>', d
        'Subdirectory of libdir where to install PMDs. Defaults to using a versioned subdirectory.')
 option('enable_docs', type: 'boolean', value: false, description:
        'build documentation')
+option('enable_dts_docs', type: 'boolean', value: false, description:
+       'Build DTS developer documentation.')
 option('enable_apps', type: 'string', value: '', description:
        'Comma-separated list of apps to build. If unspecified, build all apps.')
 option('enable_drivers', type: 'string', value: '', description:
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v1 4/4] dts: format docstrigs to google format
  2023-03-23 10:40 [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
                   ` (2 preceding siblings ...)
  2023-03-23 10:40 ` [RFC PATCH v1 3/4] dts: add doc generation Juraj Linkeš
@ 2023-03-23 10:40 ` Juraj Linkeš
  2023-04-28 19:33   ` Jeremy Spewock
  2023-04-03  9:17 ` [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
  5 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-03-23 10:40 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock
  Cc: dev, Juraj Linkeš

WIP: only one module is reformatted to serve as a demonstration.

The Google format is documented here [0].

[0]: https://google.github.io/styleguide/pyguide.html
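For reference, a minimal Google-format docstring looks like this (a standalone sketch, not taken from the patch; the function and its names are illustrative):

```python
def filter_lcores(lcore_ids: list[int], ascending: bool = True) -> list[int]:
    """Sort the given logical core ids.

    Args:
        lcore_ids: The logical core ids to sort.
        ascending: If True, sort from the lowest id first;
            if False, from the highest.

    Returns:
        A new list of sorted logical core ids.
    """
    return sorted(lcore_ids, reverse=not ascending)
```

Sphinx renders the Args/Returns sections into structured API docs via the napoleon extension.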

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/testbed_model/node.py | 152 +++++++++++++++++++---------
 1 file changed, 103 insertions(+), 49 deletions(-)

diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 90467981c3..ad8ef442af 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+There's a base class, Node, that's supposed to be extended by other classes
+with functionality specific to that node type.
+The only part that can be used standalone is the Node.skip_setup static method,
+which is a decorator used to skip method execution
+if skip_setup is passed by the user on the cmdline or in an env variable.
 """
 
 from typing import Any, Callable
@@ -26,10 +31,25 @@
 
 
 class Node(object):
-    """
-    Basic class for node management. This class implements methods that
-    manage a node, such as information gathering (of CPU/PCI/NIC) and
-    environment setup.
+    """The base class for node management.
+
+    It shouldn't be instantiated, but rather extended.
+    It implements common methods to manage any node:
+
+       * connection to the node
+       * information gathering of CPU
+       * hugepages setup
+
+    Arguments:
+        node_config: The config from the input configuration file.
+
+    Attributes:
+        main_session: The primary OS-agnostic remote session used
+            to communicate with the node.
+        config: The configuration used to create the node.
+        name: The name of the node.
+        lcores: The list of logical cores that DTS can use on the node.
+            It's derived from logical cores present on the node and user configuration.
     """
 
     main_session: OSSession
@@ -56,65 +76,89 @@ def __init__(self, node_config: NodeConfiguration):
         self._logger.info(f"Created node: {self.name}")
 
     def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        Perform the execution setup that will be done for each execution
-        this node is part of.
+        """Execution setup steps.
+
+        Configure hugepages and call self._set_up_execution where
+        the rest of the configuration steps (if any) are implemented.
+
+        Args:
+            execution_config: The execution configuration according to which
+                the setup steps will be taken.
         """
         self._setup_hugepages()
         self._set_up_execution(execution_config)
 
     def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution setup steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution setup steps.
         """
 
     def tear_down_execution(self) -> None:
-        """
-        Perform the execution teardown that will be done after each execution
-        this node is part of concludes.
+        """Execution teardown steps.
+
+        There are currently no execution teardown steps
+        common to all DTS node types.
         """
         self._tear_down_execution()
 
     def _tear_down_execution(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution teardown steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution teardown steps.
         """
 
     def set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        Perform the build target setup that will be done for each build target
-        tested on this node.
+        """Build target setup steps.
+
+        There are currently no build target setup steps
+        common to all DTS node types.
+
+        Args:
+            build_target_config: The build target configuration according to which
+                the setup steps will be taken.
         """
         self._set_up_build_target(build_target_config)
 
     def _set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target setup steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional build target setup steps.
         """
 
     def tear_down_build_target(self) -> None:
-        """
-        Perform the build target teardown that will be done after each build target
-        tested on this node.
+        """Build target teardown steps.
+
+        There are currently no build target teardown steps
+        common to all DTS node types.
         """
         self._tear_down_build_target()
 
     def _tear_down_build_target(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target teardown steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional build target teardown steps.
         """
 
     def create_session(self, name: str) -> OSSession:
-        """
-        Create and return a new OSSession tailored to the remote OS.
+        """Create and return a new OS-agnostic remote session.
+
+        The returned session won't be used by the object creating it.
+        It will be cleaned up automatically when the node is closed.
+
+        Args:
+            name: The name of the session.
+
+        Returns:
+            A new OS-agnostic remote session.
         """
         session_name = f"{self.name} {name}"
         connection = create_session(
@@ -130,14 +174,24 @@ def filter_lcores(
         filter_specifier: LogicalCoreCount | LogicalCoreList,
         ascending: bool = True,
     ) -> list[LogicalCore]:
-        """
-        Filter the LogicalCores found on the Node according to
-        a LogicalCoreCount or a LogicalCoreList.
+        """Filter the node's logical cores that DTS can use.
 
-        If ascending is True, use cores with the lowest numerical id first
-        and continue in ascending order. If False, start with the highest
-        id and continue in descending order. This ordering affects which
-        sockets to consider first as well.
+        Logical cores that DTS can use are those present on the node,
+        filtered according to the user configuration.
+        The filter_specifier then filters cores from those logical cores.
+
+        Args:
+            filter_specifier: Two different filters can be used, one that specifies
+                the number of logical cores per core, cores per socket and
+                the number of sockets,
+                the other that specifies a logical core list.
+            ascending: If True, use cores with the lowest numerical id first
+                and continue in ascending order. If False, start with the highest
+                id and continue in descending order. This ordering affects which
+                sockets to consider first as well.
+
+        Returns:
+            A list of logical cores.
         """
         self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
         return lcore_filter(
@@ -147,17 +201,14 @@ def filter_lcores(
         ).filter()
 
     def _get_remote_cpus(self) -> None:
-        """
-        Scan CPUs in the remote OS and store a list of LogicalCores.
-        """
+        """Scan CPUs in the remote OS and store a list of LogicalCores."""
         self._logger.info("Getting CPU information.")
         self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
 
     def _setup_hugepages(self):
-        """
-        Setup hugepages on the Node. Different architectures can supply different
-        amounts of memory for hugepages and numa-based hugepage allocation may need
-        to be considered.
+    """Set up hugepages on the Node.
+
+        Configure the hugepages only if they're specified in user configuration.
         """
         if self.config.hugepages:
             self.main_session.setup_hugepages(
@@ -165,9 +216,7 @@ def _setup_hugepages(self):
             )
 
     def close(self) -> None:
-        """
-        Close all connections and free other resources.
-        """
+        """Close all connections and free other resources."""
         if self.main_session:
             self.main_session.close()
         for session in self._other_sessions:
@@ -176,6 +225,11 @@ def close(self) -> None:
 
     @staticmethod
     def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+        """A decorator that conditionally skips the decorated function.
+
+        When the skip_setup setting is enabled, the decorator returns
+        an empty lambda function instead of the decorated function.
+        """
         if SETTINGS.skip_setup:
             return lambda *args: None
         else:
-- 
2.30.2
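The skip_setup pattern documented in the patch above can be illustrated standalone; in this sketch a module-level flag stands in for SETTINGS.skip_setup, which DTS reads from the cmdline or an env variable:

```python
from typing import Any, Callable

# Stand-in for SETTINGS.skip_setup (an assumption for this sketch).
SKIP_SETUP = True


def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
    """Return an empty lambda instead of func when setup is skipped."""
    if SKIP_SETUP:
        return lambda *args: None
    return func


@skip_setup
def set_up_node(name: str) -> str:
    return f"configured {name}"
```

Because the decision happens at decoration time, toggling the flag afterwards has no effect, matching the patch, where the settings are read once at startup.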



* Re: [RFC PATCH v1 0/4] dts: add dts api docs
  2023-03-23 10:40 [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
                   ` (3 preceding siblings ...)
  2023-03-23 10:40 ` [RFC PATCH v1 4/4] dts: format docstrigs to google format Juraj Linkeš
@ 2023-04-03  9:17 ` Juraj Linkeš
  2023-04-03  9:42   ` Bruce Richardson
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
  5 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-04-03  9:17 UTC (permalink / raw)
  To: bruce.richardson; +Cc: dev


Hi Bruce, Thomas,

The meson integration is kinda all over the place. I wanted to use the
existing conf.py Sphinx config file, but I also wanted to keep the docs
separated (because of extra DTS api docs dependencies), so the various
pieces are in different places (the config file in one place, meson code in
dts directory and generated Sphinx docs are in a new directory in the api
build dir, separate from the rest of the Sphinx html).

The big thing here is that I didn't figure out how to separate the dts api
build from the rest of the docs. I don't know how the -Denable_docs option
is supposed to work. I wanted to use -Denable_dts_docs in the same fashion
to decouple the builds, but it doesn't seem to work. Reading the code I
think the original option doesn't actually do anything - does it work? How
is it supposed to work?

Thanks,
Juraj

On Thu, Mar 23, 2023 at 11:40 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:

> Augment the meson build system with dts api generation. The api docs are
> generated from Python docstrings in DTS using Sphinx. The format of
> choice is the Google format [0].
>
> The guide html sphinx configuration is used to preserve the same style.
>
> The build requires the same Python version and dependencies as DTS,
> because Sphinx imports the Python modules. Dependencies are installed
> using Poetry from the dts directory:
>
> poetry install --with docs
>
> After installing, enter the Poetry shell:
>
> poetry shell
>
> And then run the build:
> ninja -C <meson_build_dir> doc
>
> There's only one properly documented module that serves as a
> demonstration of the style - framework.testbed_model.node.
>
> I didn't figure out how to separate dts build from the rest of the docs,
> which I think is required because of the different dependencies.
> I thought the enable_docs option would do this, so I added
> enable_dts_docs, but it doesn't seem to be working. Requesting comment
> on this.
>
> [0]
> https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
>
> Juraj Linkeš (4):
>   dts: code adjustments for sphinx
>   dts: add doc generation dependencies
>   dts: add doc generation
>   dts: format docstrigs to google format
>
>  doc/api/meson.build                           |   1 +
>  doc/guides/conf.py                            |  22 +-
>  doc/guides/meson.build                        |   1 +
>  doc/guides/tools/dts.rst                      |  29 +
>  doc/meson.build                               |   5 -
>  dts/doc-index.rst                             |  20 +
>  dts/framework/config/__init__.py              |  11 +
>  .../{testbed_model/hw => config}/cpu.py       |  13 +
>  dts/framework/dts.py                          |   8 +-
>  dts/framework/remote_session/__init__.py      |   3 +-
>  dts/framework/remote_session/linux_session.py |   2 +-
>  dts/framework/remote_session/os_session.py    |  12 +-
>  .../remote_session/remote/__init__.py         |  16 -
>  .../{remote => }/remote_session.py            |   0
>  .../{remote => }/ssh_session.py               |   0
>  dts/framework/settings.py                     |  55 +-
>  dts/framework/testbed_model/__init__.py       |  10 +-
>  dts/framework/testbed_model/hw/__init__.py    |  27 -
>  dts/framework/testbed_model/node.py           | 164 ++--
>  dts/framework/testbed_model/sut_node.py       |   9 +-
>  .../testbed_model/{hw => }/virtual_device.py  |   0
>  dts/main.py                                   |   3 +-
>  dts/meson.build                               |  50 ++
>  dts/poetry.lock                               | 770 ++++++++++++++++--
>  dts/pyproject.toml                            |   7 +
>  dts/tests/TestSuite_hello_world.py            |   6 +-
>  meson.build                                   |   6 +
>  meson_options.txt                             |   2 +
>  28 files changed, 1027 insertions(+), 225 deletions(-)
>  create mode 100644 dts/doc-index.rst
>  rename dts/framework/{testbed_model/hw => config}/cpu.py (95%)
>  delete mode 100644 dts/framework/remote_session/remote/__init__.py
>  rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
>  rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
>  delete mode 100644 dts/framework/testbed_model/hw/__init__.py
>  rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
>  create mode 100644 dts/meson.build
>
> --
> 2.30.2
>
>



* Re: [RFC PATCH v1 0/4] dts: add dts api docs
  2023-04-03  9:17 ` [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
@ 2023-04-03  9:42   ` Bruce Richardson
  2023-04-25  8:20     ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-04-03  9:42 UTC (permalink / raw)
  To: Juraj Linkeš; +Cc: dev

On Mon, Apr 03, 2023 at 11:17:06AM +0200, Juraj Linkeš wrote:
>    Hi Bruce, Thomas,
>    The meson integration is kinda all over the place. I wanted to use the
>    existing conf.py Sphinx config file, but I also wanted to keep the docs
>    separated (because of extra DTS api docs dependencies), so the various
>    pieces are in different places (the config file in one place, meson
>    code in dts directory and generated Sphinx docs are in a new directory
>    in the api build dir, separate from the rest of the Sphinx html).
>    The big thing here is that I didn't figure out how to separate the dts
>    api build from the rest of the docs. I don't know how the -Denable_docs
>    option is supposed to work. I wanted to use -Denable_dts_docs in the
>    same fashion to decouple the builds, but it doesn't seem to work.
>    Reading the code I think the original option doesn't actually do
>    anything - does it work? How is it supposed to work?
>    Thanks,
>    Juraj

The enable_docs option works by selectively enabling the doc build tasks
using the "build_by_default" parameter on them. 
See http://git.dpdk.org/dpdk/tree/doc/guides/meson.build#n23 for an
example. The custom_target for sphinx is not a dependency of any other
task, so whether it gets run or not depends entirely on whether the
"build_by_default" and/or "install" options are set.

As usual, there may be other stuff that needs cleaning up on this, but
that's how it works for now, anyway. [And it does actually work, last I
tested it :-)]
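The mechanism described here can be sketched as follows (an illustrative fragment modelled on doc/guides/meson.build; the exact names are assumptions):

```meson
# The sphinx target is not a dependency of any other target, so it only
# runs when build_by_default/install are true, i.e. when the project is
# configured with -Denable_docs=true, or when the 'doc' alias is
# requested explicitly with 'ninja -C build doc'.
html_guides = custom_target('html_guides',
        input: files('index.rst'),
        output: 'html',
        command: [sphinx_wrapper, sphinx, meson.project_version(),
            meson.current_source_dir(), meson.current_build_dir()],
        build_by_default: get_option('enable_docs'),
        install: get_option('enable_docs'),
        install_dir: htmldir)
```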

/Bruce


* Re: [RFC PATCH v1 0/4] dts: add dts api docs
  2023-04-03  9:42   ` Bruce Richardson
@ 2023-04-25  8:20     ` Juraj Linkeš
  2023-04-25  8:44       ` Bruce Richardson
  0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-04-25  8:20 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev

On Mon, Apr 3, 2023 at 11:42 AM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Mon, Apr 03, 2023 at 11:17:06AM +0200, Juraj Linkeš wrote:
> >    Hi Bruce, Thomas,
> >    The meson integration is kinda all over the place. I wanted to use the
> >    existing conf.py Sphinx config file, but I also wanted to keep the docs
> >    separated (because of extra DTS api docs dependencies), so the various
> >    pieces are in different places (the config file in one place, meson
> >    code in dts directory and generated Sphinx docs are in a new directory
> >    in the api build dir, separate from the rest of the Sphinx html).
> >    The big thing here is that I didn't figure out how to separate the dts
> >    api build from the rest of the docs. I don't know how the -Denable_docs
> >    option is supposed to work. I wanted to use -Denable_dts_docs in the
> >    same fashion to decouple the builds, but it doesn't seem to work.
> >    Reading the code I think the original option doesn't actually do
> >    anything - does it work? How is it supposed to work?
> >    Thanks,
> >    Juraj
>
> The enable_docs option works by selectively enabling the doc build tasks
> using the "build_by_default" parameter on them.
> See http://git.dpdk.org/dpdk/tree/doc/guides/meson.build#n23 for an
> example. The custom_target for sphinx is not a dependency of any other
> task, so whether it gets run or not depends entirely on whether the
> "build_by_default" and/or "install" options are set.
>
> As usual, there may be other stuff that needs cleaning up on this, but
> that's how it works for now, anyway. [And it does actually work, last I
> tested it :-)]

I looked into this and as is so frequently the case, we're both right. :-)

When running according to docs, that is with:
1. meson setup doc_build
2. ninja -C doc_build doc

it doesn't matter what enable_docs is set to, it always builds the docs.

But in the full build it does control whether docs are built, i.e.:

1. meson setup doc_build
2. ninja -C doc_build
doesn't build the docs, whereas:

1. meson setup doc_build -Denable_docs=true
2. ninja -C doc_build
builds the docs.

Now the problem in this version is when doing just the doc build
(ninja -C doc_build doc) both DPDK and DTS docs are built and I'd like
to separate those (because DTS doc build has additional dependencies).
I'm thinking the following would be a good solution within the current
paradigm:
1. The -Denable_docs=true and -Denable_dts_docs=true options to
separate doc builds for the full build.
2. Separate dts doc dir for the doc build ("ninja -C doc_build doc"
for DPDK docs and "ninja -C doc_build dts" (or maybe some other dir)
for DTS docs).
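In meson terms, the proposal could look roughly like this (a hypothetical sketch; the option and target names are placeholders):

```meson
# meson_options.txt: a separate option gating only the DTS docs.
#   option('enable_dts_docs', type: 'boolean', value: false,
#          description: 'Build DTS API documentation.')

# dts doc meson.build: gate the target on the new option and give it
# its own alias so 'ninja -C build dts-doc' builds only the DTS docs.
dts_api_html = custom_target('dts_api_html',
        output: 'html',
        command: [sphinx, '-a', dts_api_build_dir, dts_api_html_dir],
        build_by_default: get_option('enable_dts_docs'),
        install: get_option('enable_dts_docs'),
        install_dir: htmldir)
alias_target('dts-doc', dts_api_html)
```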

What do you think?
Juraj

>
> /Bruce


* Re: [RFC PATCH v1 0/4] dts: add dts api docs
  2023-04-25  8:20     ` Juraj Linkeš
@ 2023-04-25  8:44       ` Bruce Richardson
  2023-04-25  8:57         ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-04-25  8:44 UTC (permalink / raw)
  To: Juraj Linkeš; +Cc: dev

On Tue, Apr 25, 2023 at 10:20:36AM +0200, Juraj Linkeš wrote:
> On Mon, Apr 3, 2023 at 11:42 AM Bruce Richardson
> <bruce.richardson@intel.com> wrote:
> >
> > On Mon, Apr 03, 2023 at 11:17:06AM +0200, Juraj Linkeš wrote:
> > >    Hi Bruce, Thomas,
> > >    The meson integration is kinda all over the place. I wanted to use the
> > >    existing conf.py Sphinx config file, but I also wanted to keep the docs
> > >    separated (because of extra DTS api docs dependencies), so the various
> > >    pieces are in different places (the config file in one place, meson
> > >    code in dts directory and generated Sphinx docs are in a new directory
> > >    in the api build dir, separate from the rest of the Sphinx html).
> > >    The big thing here is that I didn't figure out how to separate the dts
> > >    api build from the rest of the docs. I don't know how the -Denable_docs
> > >    option is supposed to work. I wanted to use -Denable_dts_docs in the
> > >    same fashion to decouple the builds, but it doesn't seem to work.
> > >    Reading the code I think the original option doesn't actually do
> > >    anything - does it work? How is it supposed to work?
> > >    Thanks,
> > >    Juraj
> >
> > The enable_docs option works by selectively enabling the doc build tasks
> > using the "build_by_default" parameter on them.
> > See http://git.dpdk.org/dpdk/tree/doc/guides/meson.build#n23 for an
> > example. The custom_target for sphinx is not a dependency of any other
> > task, so whether it gets run or not depends entirely on whether the
> > "build_by_default" and/or "install" options are set.
> >
> > As usual, there may be other stuff that needs cleaning up on this, but
> > that's how it works for now, anyway. [And it does actually work, last I
> > tested it :-)]
> 
> I looked into this and as is so frequently the case, we're both right. :-)
> 
> When running according to docs, that is with:
> 1. meson setup doc_build
> 2. ninja -C doc_build doc
> 
> it doesn't matter what enable_docs is set to, it always builds the docs.
> 

Yes, I'd forgotten that. That was deliberately done so one could always
request a doc build directly, without having to worry about DPDK config or
building the rest of DPDK.

> But in the full build it does control whether docs are built, i.e.:
> 
> 1. meson setup doc_build
> 2. ninja -C doc_build
> doesn't build the docs, whereas:
> 
> 1. meson setup doc_build -Denable_docs=true
> 2. ninja -C doc_build
> builds the docs.
> 
> Now the problem in this version is when doing just the doc build
> (ninja -C doc_build doc) both DPDK and DTS docs are built and I'd like
> to separate those (because DTS doc build has additional dependencies).
> I'm thinking the following would be a good solution within the current
> paradigm:
> 1. The -Denable_docs=true and -Denable_dts_docs=true options to
> separate doc builds for the full build.
> 2. Separate dts doc dir for the doc build ("ninja -C doc_build doc"
> for DPDK docs and "ninja -C doc_build dts" (or maybe some other dir)
> for DTS docs).

How important is it to separate out the dts docs from the regular docs?
What are the additional dependencies, and how hard are they to get? If
possible I'd rather not have an additional build config option added for
this.

If we are separating them out, I think the dts doc target should be
"dts_doc" rather than "dts" for clarity.

/Bruce



* Re: [RFC PATCH v1 0/4] dts: add dts api docs
  2023-04-25  8:44       ` Bruce Richardson
@ 2023-04-25  8:57         ` Juraj Linkeš
  2023-04-25  9:43           ` Bruce Richardson
  0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-04-25  8:57 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev

On Tue, Apr 25, 2023 at 10:44 AM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Tue, Apr 25, 2023 at 10:20:36AM +0200, Juraj Linkeš wrote:
> > On Mon, Apr 3, 2023 at 11:42 AM Bruce Richardson
> > <bruce.richardson@intel.com> wrote:
> > >
> > > On Mon, Apr 03, 2023 at 11:17:06AM +0200, Juraj Linkeš wrote:
> > > >    Hi Bruce, Thomas,
> > > >    The meson integration is kinda all over the place. I wanted to use the
> > > >    existing conf.py Sphinx config file, but I also wanted to keep the docs
> > > >    separated (because of extra DTS api docs dependencies), so the various
> > > >    pieces are in different places (the config file in one place, meson
> > > >    code in dts directory and generated Sphinx docs are in a new directory
> > > >    in the api build dir, separate from the rest of the Sphinx html).
> > > >    The big thing here is that I didn't figure out how to separate the dts
> > > >    api build from the rest of the docs. I don't know how the -Denable_docs
> > > >    option is supposed to work. I wanted to use -Denable_dts_docs in the
> > > >    same fashion to decouple the builds, but it doesn't seem to work.
> > > >    Reading the code I think the original option doesn't actually do
> > > >    anything - does it work? How is it supposed to work?
> > > >    Thanks,
> > > >    Juraj
> > >
> > > The enable_docs option works by selectively enabling the doc build tasks
> > > using the "build_by_default" parameter on them.
> > > See http://git.dpdk.org/dpdk/tree/doc/guides/meson.build#n23 for an
> > > example. The custom_target for sphinx is not a dependency of any other
> > > task, so whether it gets run or not depends entirely on whether the
> > > "build_by_default" and/or "install" options are set.
> > >
> > > As usual, there may be other stuff that needs cleaning up on this, but
> > > that's how it works for now, anyway. [And it does actually work, last I
> > > tested it :-)]
> >
> > I looked into this and as is so frequently the case, we're both right. :-)
> >
> > When running according to docs, that is with:
> > 1. meson setup doc_build
> > 2. ninja -C doc_build doc
> >
> > it doesn't matter what enable_docs is set to, it always builds the docs.
> >
>
> Yes, I'd forgotten that. That was deliberately done so one could always
> request a doc build directly, without having to worry about DPDK config or
> building the rest of DPDK.
>
> > But in the full build it does control whether docs are built, i.e.:
> >
> > 1. meson setup doc_build
> > 2. ninja -C doc_build
> > doesn't build the docs, whereas:
> >
> > 1. meson setup doc_build -Denable_docs=true
> > 2. ninja -C doc_build
> > builds the docs.
> >
> > Now the problem in this version is when doing just the doc build
> > (ninja -C doc_build doc) both DPDK and DTS docs are built and I'd like
> > to separate those (because DTS doc build has additional dependencies).
> > I'm thinking the following would be a good solution within the current
> > paradigm:
> > 1. The -Denable_docs=true and -Denable_dts_docs=true options to
> > separate doc builds for the full build.
> > 2. Separate dts doc dir for the doc build ("ninja -C doc_build doc"
> > for DPDK docs and "ninja -C doc_build dts" (or maybe some other dir)
> > for DTS docs).
>
> How important is it to separate out the dts docs from the regular docs?

It is mostly a matter of dependencies.

> What are the additional dependencies, and how hard are they to get? If
> possible I'd rather not have an additional build config option added for
> this.

The same dependencies as for running DTS, which are the proper Python
version (3.10 and newer) with DTS dependencies obtained with Poetry
(which is a matter of installing Poetry and running it). As is
standard with Python projects, this is all set up in a virtual
environment, which needs to be activated before running the doc build.
It's documented in more detail in the tools docs:
https://doc.dpdk.org/guides/tools/dts.html#setting-up-dts-environment

This may be too much of a hassle to ask of DPDK devs who only build
non-DTS docs, but I don't know whether DPDK devs actually build docs at all.

>
> If we are separating them out, I think the dts doc target should be
> "dts_doc" rather than "dts" for clarity.

Agreed, but "dts_doc" would be a new top-level dir. I think we could
do dts/doc (a dir inside dts).

Juraj

>
> /Bruce
>

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v1 0/4] dts: add dts api docs
  2023-04-25  8:57         ` Juraj Linkeš
@ 2023-04-25  9:43           ` Bruce Richardson
  2023-05-03 11:33             ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-04-25  9:43 UTC (permalink / raw)
  To: Juraj Linkeš; +Cc: dev

On Tue, Apr 25, 2023 at 10:57:12AM +0200, Juraj Linkeš wrote:
> On Tue, Apr 25, 2023 at 10:44 AM Bruce Richardson
> <bruce.richardson@intel.com> wrote:
> >
> > On Tue, Apr 25, 2023 at 10:20:36AM +0200, Juraj Linkeš wrote:
> > > On Mon, Apr 3, 2023 at 11:42 AM Bruce Richardson
> > > <bruce.richardson@intel.com> wrote:
> > > >
> > > > On Mon, Apr 03, 2023 at 11:17:06AM +0200, Juraj Linkeš wrote:
> > > > >    Hi Bruce, Thomas,
> > > > >    The meson integration is kinda all over the place. I wanted to use the
> > > > >    existing conf.py Sphinx config file, but I also wanted to keep the docs
> > > > >    separated (because of extra DTS api docs dependencies), so the various
> > > > >    pieces are in different places (the config file in one place, meson
> > > > >    code in dts directory and generated Sphinx docs are in a new directory
> > > > >    in the api build dir, separate from the rest of the Sphinx html).
> > > > >    The big thing here is that I didn't figure out how to separate the dts
> > > > >    api build from the rest of the docs. I don't know how the -Denable_docs
> > > > >    option is supposed to work. I wanted to use -Denable_dts_docs in the
> > > > >    same fashion to decouple the builds, but it doesn't seem to work.
> > > > >    Reading the code I think the original option doesn't actually do
> > > > >    anything - does it work? How is it supposed to work?
> > > > >    Thanks,
> > > > >    Juraj
> > > >
> > > > The enable_docs option works by selectively enabling the doc build tasks
> > > > using the "build_by_default" parameter on them.
> > > > See http://git.dpdk.org/dpdk/tree/doc/guides/meson.build#n23 for an
> > > > example. The custom_target for sphinx is not a dependency of any other
> > > > task, so whether it gets run or not depends entirely on whether the
> > > > "build_by_default" and/or "install" options are set.
> > > >
> > > > As usual, there may be other stuff that needs cleaning up on this, but
> > > > that's how it works for now, anyway. [And it does actually work, last I
> > > > tested it :-)]
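The mechanism Bruce describes above boils down to a custom_target whose
build_by_default and install flags are driven by the option; a rough
sketch (illustrative names, not the exact contents of
doc/guides/meson.build):

```meson
# Sketch of the enable_docs pattern; names are illustrative.
sphinx = find_program('sphinx-build', required: get_option('enable_docs'))

html_guides = custom_target('html_guides',
        output: 'html',
        command: [sphinx, '-b', 'html',
                  meson.current_source_dir(), '@OUTPUT@'],
        # No other target depends on this one, so in a plain
        # "ninja -C build" these two flags alone decide whether it runs;
        # naming the target explicitly still builds it regardless.
        build_by_default: get_option('enable_docs'),
        install: get_option('enable_docs'),
        install_dir: 'share/doc/dpdk')
```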
> > >
> > > I looked into this and as is so frequently the case, we're both right. :-)
> > >
> > > When running according to docs, that is with:
> > > 1. meson setup doc_build
> > > 2. ninja -C doc_build doc
> > >
> > > it doesn't matter what enable_docs is set to, it always builds the docs.
> > >
> >
> > Yes, I'd forgotten that. That was deliberately done so one could always
> > request a doc build directly, without having to worry about DPDK config or
> > building the rest of DPDK.
> >
> > > But in the full build it does control whether docs are built, i.e.:
> > >
> > > 1. meson setup doc_build
> > > 2. ninja -C doc_build
> > > doesn't build the docs, whereas:
> > >
> > > 1. meson setup doc_build -Denable_docs=true
> > > 2. ninja -C doc_build
> > > builds the docs.
> > >
> > > Now the problem in this version is when doing just the doc build
> > > (ninja -C doc_build doc) both DPDK and DTS docs are built and I'd like
> > > to separate those (because DTS doc build has additional dependencies).
> > > I'm thinking the following would be a good solution within the current
> > > paradigm:
> > > 1. The -Denable_docs=true and -Denable_dts_docs=true options to
> > > separate doc builds for the full build.
> > > 2. Separate dts doc dir for the doc build ("ninja -C doc_build doc"
> > > for DPDK docs and "ninja -C doc_build dts" (or maybe some other dir)
> > > for DTS docs).
> >
> > How important is it to separate out the dts docs from the regular docs?
> 
> It is mostly a matter of dependencies.
> 
> > What are the additional dependencies, and how hard are they to get? If
> > possible I'd rather not have an additional build config option added for
> > this.
> 
> The same dependencies as for running DTS, which are the proper Python
> version (3.10 and newer) with DTS dependencies obtained with Poetry
> (which is a matter of installing Poetry and running it). As is
> standard with Python projects, this is all set up in a virtual
> environment, which needs to be activated before running the doc build.
> It's documented in more detail in the tools docs:
> https://doc.dpdk.org/guides/tools/dts.html#setting-up-dts-environment
> 
> This may be too much of a hassle for DPDK devs who only want to build
> the non-DTS docs, but I don't know whether DPDK devs actually build the
> docs at all.
> 

Can't really say for sure. I suspect most don't build them on a daily
basis, but would often need to build them before submitting patches with a
doc change included.

What format are the DTS docs in? I agree that as described above the
requirements are pretty different than those for the regular DPDK docs.

> >
> > If we are separating them out, I think the dts doc target should be
> > "dts_doc" rather than "dts" for clarity.
> 
> Agreed, but "dts_doc" would be a new top-level dir. I think we could
> do dts/doc (a dir inside dts).
> 
That path seems reasonable to me.

/Bruce

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v1 4/4] dts: format docstrings to google format
  2023-03-23 10:40 ` [RFC PATCH v1 4/4] dts: format docstrings to google format Juraj Linkeš
@ 2023-04-28 19:33   ` Jeremy Spewock
  0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2023-04-28 19:33 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, dev

[-- Attachment #1: Type: text/plain, Size: 10848 bytes --]

Acked-by: Jeremy Spewock <jspewock@iol.unh.edu>

On Thu, Mar 23, 2023 at 6:40 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:

> WIP: only one module is reformatted to serve as a demonstration.
>
> The google format is documented here [0].
>
> [0]: https://google.github.io/styleguide/pyguide.html
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
>  dts/framework/testbed_model/node.py | 152 +++++++++++++++++++---------
>  1 file changed, 103 insertions(+), 49 deletions(-)
>
> diff --git a/dts/framework/testbed_model/node.py
> b/dts/framework/testbed_model/node.py
> index 90467981c3..ad8ef442af 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -3,8 +3,13 @@
>  # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>  # Copyright(c) 2022-2023 University of New Hampshire
>
> -"""
> -A node is a generic host that DTS connects to and manages.
> +"""Common functionality for node management.
> +
> +There's a base class, Node, that's supposed to be extended by other
> classes
> +with functionality specific to that node type.
> +The only part that can be used standalone is the Node.skip_setup static
> method,
> +which is a decorator used to skip method execution
> +if skip_setup is passed by the user on the cmdline or in an env variable.
>  """
>
>  from typing import Any, Callable
> @@ -26,10 +31,25 @@
>
>
>  class Node(object):
> -    """
> -    Basic class for node management. This class implements methods that
> -    manage a node, such as information gathering (of CPU/PCI/NIC) and
> -    environment setup.
> +    """The base class for node management.
> +
> +    It shouldn't be instantiated, but rather extended.
> +    It implements common methods to manage any node:
> +
> +       * connection to the node
> +       * information gathering of CPU
> +       * hugepages setup
> +
> +    Arguments:
> +        node_config: The config from the input configuration file.
> +
> +    Attributes:
> +        main_session: The primary OS-agnostic remote session used
> +            to communicate with the node.
> +        config: The configuration used to create the node.
> +        name: The name of the node.
> +        lcores: The list of logical cores that DTS can use on the node.
> +            It's derived from logical cores present on the node and user
> configuration.
>      """
>
>      main_session: OSSession
> @@ -56,65 +76,89 @@ def __init__(self, node_config: NodeConfiguration):
>          self._logger.info(f"Created node: {self.name}")
>
>      def set_up_execution(self, execution_config: ExecutionConfiguration)
> -> None:
> -        """
> -        Perform the execution setup that will be done for each execution
> -        this node is part of.
> +        """Execution setup steps.
> +
> +        Configure hugepages and call self._set_up_execution where
> +        the rest of the configuration steps (if any) are implemented.
> +
> +        Args:
> +            execution_config: The execution configuration according to
> which
> +                the setup steps will be taken.
>          """
>          self._setup_hugepages()
>          self._set_up_execution(execution_config)
>
>      def _set_up_execution(self, execution_config: ExecutionConfiguration)
> -> None:
> -        """
> -        This method exists to be optionally overwritten by derived
> classes and
> -        is not decorated so that the derived class doesn't have to use
> the decorator.
> +        """Optional additional execution setup steps for derived classes.
> +
> +        Derived classes should overwrite this
> +        if they want to add additional execution setup steps.
>          """
>
>      def tear_down_execution(self) -> None:
> -        """
> -        Perform the execution teardown that will be done after each
> execution
> -        this node is part of concludes.
> +        """Execution teardown steps.
> +
> +        There are currently no common execution teardown steps
> +        common to all DTS node types.
>          """
>          self._tear_down_execution()
>
>      def _tear_down_execution(self) -> None:
> -        """
> -        This method exists to be optionally overwritten by derived
> classes and
> -        is not decorated so that the derived class doesn't have to use
> the decorator.
> +        """Optional additional execution teardown steps for derived
> classes.
> +
> +        Derived classes should overwrite this
> +        if they want to add additional execution teardown steps.
>          """
>
>      def set_up_build_target(
>          self, build_target_config: BuildTargetConfiguration
>      ) -> None:
> -        """
> -        Perform the build target setup that will be done for each build
> target
> -        tested on this node.
> +        """Build target setup steps.
> +
> +        There are currently no common build target setup steps
> +        common to all DTS node types.
> +
> +        Args:
> +            build_target_config: The build target configuration according
> to which
> +                the setup steps will be taken.
>          """
>          self._set_up_build_target(build_target_config)
>
>      def _set_up_build_target(
>          self, build_target_config: BuildTargetConfiguration
>      ) -> None:
> -        """
> -        This method exists to be optionally overwritten by derived
> classes and
> -        is not decorated so that the derived class doesn't have to use
> the decorator.
> +        """Optional additional build target setup steps for derived
> classes.
> +
> +        Derived classes should optionally overwrite this
> +        if they want to add additional build target setup steps.
>          """
>
>      def tear_down_build_target(self) -> None:
> -        """
> -        Perform the build target teardown that will be done after each
> build target
> -        tested on this node.
> +        """Build target teardown steps.
> +
> +        There are currently no common build target teardown steps
> +        common to all DTS node types.
>          """
>          self._tear_down_build_target()
>
>      def _tear_down_build_target(self) -> None:
> -        """
> -        This method exists to be optionally overwritten by derived
> classes and
> -        is not decorated so that the derived class doesn't have to use
> the decorator.
> +        """Optional additional build target teardown steps for derived
> classes.
> +
> +        Derived classes should overwrite this
> +        if they want to add additional build target teardown steps.
>          """
>
>      def create_session(self, name: str) -> OSSession:
> -        """
> -        Create and return a new OSSession tailored to the remote OS.
> +        """Create and return a new OS-agnostic remote session.
> +
> +        The returned session won't be used by the object creating it.
> +        Will be cleaned up automatically.
> +
> +        Args:
> +            name: The name of the session.
> +
> +        Returns:
> +            A new OS-agnostic remote session.
>          """
>          session_name = f"{self.name} {name}"
>          connection = create_session(
> @@ -130,14 +174,24 @@ def filter_lcores(
>          filter_specifier: LogicalCoreCount | LogicalCoreList,
>          ascending: bool = True,
>      ) -> list[LogicalCore]:
> -        """
> -        Filter the LogicalCores found on the Node according to
> -        a LogicalCoreCount or a LogicalCoreList.
> +        """Filter the node's logical cores that DTS can use.
>
> -        If ascending is True, use cores with the lowest numerical id first
> -        and continue in ascending order. If False, start with the highest
> -        id and continue in descending order. This ordering affects which
> -        sockets to consider first as well.
> +        Logical cores that DTS can use are ones that are present on the
> node,
> +        but filtered according to user config.
> +        The filter_specifier will filter cores from those logical cores.
> +
> +        Args:
> +            filter_specifier: Two different filters can be used, one that
> specifies
> +                the number of logical cores per core, cores per socket and
> +                the number of sockets,
> +                the other that specifies a logical core list.
> +            ascending: If True, use cores with the lowest numerical id
> first
> +                and continue in ascending order. If False, start with the
> highest
> +                id and continue in descending order. This ordering
> affects which
> +                sockets to consider first as well.
> +
> +        Returns:
> +            A list of logical cores.
>          """
>          self._logger.debug(f"Filtering {filter_specifier} from
> {self.lcores}.")
>          return lcore_filter(
> @@ -147,17 +201,14 @@ def filter_lcores(
>          ).filter()
>
>      def _get_remote_cpus(self) -> None:
> -        """
> -        Scan CPUs in the remote OS and store a list of LogicalCores.
> -        """
> +        """Scan CPUs in the remote OS and store a list of LogicalCores."""
>          self._logger.info("Getting CPU information.")
>          self.lcores =
> self.main_session.get_remote_cpus(self.config.use_first_core)
>
>      def _setup_hugepages(self):
> -        """
> -        Setup hugepages on the Node. Different architectures can supply
> different
> -        amounts of memory for hugepages and numa-based hugepage
> allocation may need
> -        to be considered.
> +        """Setup hugepages on the Node.
> +
> +        Configure the hugepages only if they're specified in user
> configuration.
>          """
>          if self.config.hugepages:
>              self.main_session.setup_hugepages(
> @@ -165,9 +216,7 @@ def _setup_hugepages(self):
>              )
>
>      def close(self) -> None:
> -        """
> -        Close all connections and free other resources.
> -        """
> +        """Close all connections and free other resources."""
>          if self.main_session:
>              self.main_session.close()
>          for session in self._other_sessions:
> @@ -176,6 +225,11 @@ def close(self) -> None:
>
>      @staticmethod
>      def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
> +        """A decorator that skips the decorated function.
> +
> +        When used, the decorator executes an empty lambda function
> +        instead of the decorated function.
> +        """
>          if SETTINGS.skip_setup:
>              return lambda *args: None
>          else:
> --
> 2.30.2
>
>

[-- Attachment #2: Type: text/html, Size: 13569 bytes --]

^ permalink raw reply	[flat|nested] 393+ messages in thread
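To make the convention adopted in the patch above concrete, here is a
minimal Google-format docstring on a made-up function (not part of the
DTS framework); Sphinx's napoleon extension parses the Args:/Returns:
sections into the same output as classic reST field lists:

```python
def filter_lcores(lcores: list[int], ascending: bool = True) -> list[int]:
    """Sort a list of logical core ids.

    Args:
        lcores: The logical core ids to sort.
        ascending: If True, return the ids in ascending order,
            otherwise in descending order.

    Returns:
        The sorted list of logical core ids.
    """
    return sorted(lcores, reverse=not ascending)
```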

* Re: [RFC PATCH v1 0/4] dts: add dts api docs
  2023-04-25  9:43           ` Bruce Richardson
@ 2023-05-03 11:33             ` Juraj Linkeš
  0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-03 11:33 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev

On Tue, Apr 25, 2023 at 11:43 AM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Tue, Apr 25, 2023 at 10:57:12AM +0200, Juraj Linkeš wrote:
> > On Tue, Apr 25, 2023 at 10:44 AM Bruce Richardson
> > <bruce.richardson@intel.com> wrote:
> > >
> > > On Tue, Apr 25, 2023 at 10:20:36AM +0200, Juraj Linkeš wrote:
> > > > On Mon, Apr 3, 2023 at 11:42 AM Bruce Richardson
> > > > <bruce.richardson@intel.com> wrote:
> > > > >
> > > > > On Mon, Apr 03, 2023 at 11:17:06AM +0200, Juraj Linkeš wrote:
> > > > > >    Hi Bruce, Thomas,
> > > > > >    The meson integration is kinda all over the place. I wanted to use the
> > > > > >    existing conf.py Sphinx config file, but I also wanted to keep the docs
> > > > > >    separated (because of extra DTS api docs dependencies), so the various
> > > > > >    pieces are in different places (the config file in one place, meson
> > > > > >    code in dts directory and generated Sphinx docs are in a new directory
> > > > > >    in the api build dir, separate from the rest of the Sphinx html).
> > > > > >    The big thing here is that I didn't figure out how to separate the dts
> > > > > >    api build from the rest of the docs. I don't know how the -Denable_docs
> > > > > >    option is supposed to work. I wanted to use -Denable_dts_docs in the
> > > > > >    same fashion to decouple the builds, but it doesn't seem to work.
> > > > > >    Reading the code I think the original option doesn't actually do
> > > > > >    anything - does it work? How is it supposed to work?
> > > > > >    Thanks,
> > > > > >    Juraj
> > > > >
> > > > > The enable_docs option works by selectively enabling the doc build tasks
> > > > > using the "build_by_default" parameter on them.
> > > > > See http://git.dpdk.org/dpdk/tree/doc/guides/meson.build#n23 for an
> > > > > example. The custom_target for sphinx is not a dependency of any other
> > > > > task, so whether it gets run or not depends entirely on whether the
> > > > > "build_by_default" and/or "install" options are set.
> > > > >
> > > > > As usual, there may be other stuff that needs cleaning up on this, but
> > > > > that's how it works for now, anyway. [And it does actually work, last I
> > > > > tested it :-)]
> > > >
> > > > I looked into this and as is so frequently the case, we're both right. :-)
> > > >
> > > > When running according to docs, that is with:
> > > > 1. meson setup doc_build
> > > > 2. ninja -C doc_build doc
> > > >
> > > > it doesn't matter what enable_docs is set to, it always builds the docs.
> > > >
> > >
> > > Yes, I'd forgotten that. That was deliberately done so one could always
> > > request a doc build directly, without having to worry about DPDK config or
> > > building the rest of DPDK.
> > >
> > > > But in the full build it does control whether docs are built, i.e.:
> > > >
> > > > 1. meson setup doc_build
> > > > 2. ninja -C doc_build
> > > > doesn't build the docs, whereas:
> > > >
> > > > 1. meson setup doc_build -Denable_docs=true
> > > > 2. ninja -C doc_build
> > > > builds the docs.
> > > >
> > > > Now the problem in this version is when doing just the doc build
> > > > (ninja -C doc_build doc) both DPDK and DTS docs are built and I'd like
> > > > to separate those (because DTS doc build has additional dependencies).
> > > > I'm thinking the following would be a good solution within the current
> > > > paradigm:
> > > > 1. The -Denable_docs=true and -Denable_dts_docs=true options to
> > > > separate doc builds for the full build.
> > > > 2. Separate dts doc dir for the doc build ("ninja -C doc_build doc"
> > > > for DPDK docs and "ninja -C doc_build dts" (or maybe some other dir)
> > > > for DTS docs).
> > >
> > > How important is it to separate out the dts docs from the regular docs?
> >
> > It is mostly a matter of dependencies.
> >
> > > What are the additional dependencies, and how hard are they to get? If
> > > possible I'd rather not have an additional build config option added for
> > > this.
> >
> > The same dependencies as for running DTS, which are the proper Python
> > version (3.10 and newer) with DTS dependencies obtained with Poetry
> > (which is a matter of installing Poetry and running it). As is
> > standard with Python projects, this is all set up in a virtual
> > environment, which needs to be activated before running the doc build.
> > It's documented in more detail in the tools docs:
> > https://doc.dpdk.org/guides/tools/dts.html#setting-up-dts-environment
> >
> > This may be too much of a hassle for DPDK devs who only want to build
> > the non-DTS docs, but I don't know whether DPDK devs actually build the
> > docs at all.
> >
>
> Can't really say for sure. I suspect most don't build them on a daily
> basis, but would often need to build them before submitting patches with a
> doc change included.
>
> What format are the DTS docs in? I agree that as described above the
> requirements are pretty different than those for the regular DPDK docs.
>

The resulting HTML docs use the same Sphinx conf file (with extension
configuration and two more config options - see patch 3/4) that we use
for the regular docs.

> > >
> > > If we are separating them out, I think the dts doc target should be
> > > "dts_doc" rather than "dts" for clarity.
> >
> > Agreed, but "dts_doc" would be a new top-level dir. I think we could
> > do dts/doc (a dir inside dts).
> >
> That path seems reasonable to me.
>
> /Bruce

^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v2 0/4] dts: add dts api docs
  2023-03-23 10:40 [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
                   ` (4 preceding siblings ...)
  2023-04-03  9:17 ` [RFC PATCH v1 0/4] dts: add dts api docs Juraj Linkeš
@ 2023-05-04 12:37 ` Juraj Linkeš
  2023-05-04 12:37   ` [RFC PATCH v2 1/4] dts: code adjustments for sphinx Juraj Linkeš
                     ` (5 more replies)
  5 siblings, 6 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-04 12:37 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

Augment the meson build system with dts api generation. The api docs are
generated from Python docstrings in DTS using Sphinx. The format of
choice is the Google format [0].

The guide html sphinx configuration is used to preserve the same style.

The build requires the same Python version and dependencies as DTS,
because Sphinx imports the Python modules. Dependencies are installed
using Poetry from the dts directory:

poetry install --with docs

After installing, enter the Poetry shell:

poetry shell

And then run the build:
ninja -C <meson_build_dir> dts/doc

There's only one properly documented module that serves as a
demonstration of the style - framework.testbed_model.node.

[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes

Juraj Linkeš (4):
  dts: code adjustments for sphinx
  dts: add doc generation dependencies
  dts: add doc generation
  dts: format docstrings to google format

 doc/api/meson.build                           |   1 +
 doc/guides/conf.py                            |  22 +-
 doc/guides/meson.build                        |   1 +
 doc/guides/tools/dts.rst                      |  29 +
 dts/doc/doc-index.rst                         |  20 +
 dts/doc/meson.build                           |  50 ++
 dts/framework/config/__init__.py              |  11 +
 .../{testbed_model/hw => config}/cpu.py       |  13 +
 dts/framework/dts.py                          |   8 +-
 dts/framework/remote_session/__init__.py      |   3 +-
 dts/framework/remote_session/linux_session.py |   2 +-
 dts/framework/remote_session/os_session.py    |  12 +-
 .../remote_session/remote/__init__.py         |  16 -
 .../{remote => }/remote_session.py            |   0
 .../{remote => }/ssh_session.py               |   0
 dts/framework/settings.py                     |  55 +-
 dts/framework/testbed_model/__init__.py       |  10 +-
 dts/framework/testbed_model/hw/__init__.py    |  27 -
 dts/framework/testbed_model/node.py           | 164 ++--
 dts/framework/testbed_model/sut_node.py       |   9 +-
 .../testbed_model/{hw => }/virtual_device.py  |   0
 dts/main.py                                   |   3 +-
 dts/meson.build                               |  16 +
 dts/poetry.lock                               | 770 ++++++++++++++++--
 dts/pyproject.toml                            |   7 +
 dts/tests/TestSuite_hello_world.py            |   6 +-
 meson.build                                   |   1 +
 meson_options.txt                             |   2 +
 28 files changed, 1038 insertions(+), 220 deletions(-)
 create mode 100644 dts/doc/doc-index.rst
 create mode 100644 dts/doc/meson.build
 rename dts/framework/{testbed_model/hw => config}/cpu.py (95%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
 create mode 100644 dts/meson.build

-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v2 1/4] dts: code adjustments for sphinx
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
@ 2023-05-04 12:37   ` Juraj Linkeš
  2023-05-04 12:37   ` [RFC PATCH v2 2/4] dts: add doc generation dependencies Juraj Linkeš
                     ` (4 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-04 12:37 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

sphinx-build only imports the Python modules when building the
documentation; it doesn't run DTS. This requires changes that make the
code importable without running it. This means:
* properly guarding argument parsing in the if __name__ == '__main__'
  block,
* the same treatment for the logger used by the DTS runner, so that it
  doesn't create unnecessary log files,
* moving the defaults for the global variables out of argument parsing,
  since DTS uses the parsed arguments to construct an object holding
  global variables,
* more granular imports, because importing the remote_session module
  from framework resulted in one module trying to import another,
  creating circular imports.
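The first point, keeping argument parsing out of import time so that
sphinx-build can import the modules safely, can be sketched as follows
(a simplified, hypothetical main module, not the actual dts/main.py):

```python
import argparse
import sys


def _parse_args(argv: list[str]) -> argparse.Namespace:
    parser = argparse.ArgumentParser(description="run the test suites")
    parser.add_argument("--skip-setup", action="store_true")
    return parser.parse_args(argv)


def main(argv=None) -> argparse.Namespace:
    # Parsing happens only when main() is called, never at import time,
    # so sphinx-build can import this module without side effects.
    return _parse_args(sys.argv[1:] if argv is None else argv)


if __name__ == "__main__":
    main()
```

Importing this module (as sphinx-build does) triggers no parsing; only
running it as a script does.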

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/config/__init__.py              | 11 ++++
 .../{testbed_model/hw => config}/cpu.py       | 13 +++++
 dts/framework/dts.py                          |  8 ++-
 dts/framework/remote_session/__init__.py      |  3 +-
 dts/framework/remote_session/linux_session.py |  2 +-
 dts/framework/remote_session/os_session.py    | 12 +++-
 .../remote_session/remote/__init__.py         | 16 ------
 .../{remote => }/remote_session.py            |  0
 .../{remote => }/ssh_session.py               |  0
 dts/framework/settings.py                     | 55 ++++++++++---------
 dts/framework/testbed_model/__init__.py       | 10 +---
 dts/framework/testbed_model/hw/__init__.py    | 27 ---------
 dts/framework/testbed_model/node.py           | 12 ++--
 dts/framework/testbed_model/sut_node.py       |  9 ++-
 .../testbed_model/{hw => }/virtual_device.py  |  0
 dts/main.py                                   |  3 +-
 dts/tests/TestSuite_hello_world.py            |  6 +-
 17 files changed, 88 insertions(+), 99 deletions(-)
 rename dts/framework/{testbed_model/hw => config}/cpu.py (95%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index ebb0823ff5..293c4cb15b 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -7,6 +7,8 @@
 Yaml config parsing methods
 """
 
+# pylama:ignore=W0611
+
 import json
 import os.path
 import pathlib
@@ -19,6 +21,15 @@
 
 from framework.settings import SETTINGS
 
+from .cpu import (
+    LogicalCore,
+    LogicalCoreCount,
+    LogicalCoreCountFilter,
+    LogicalCoreList,
+    LogicalCoreListFilter,
+    lcore_filter,
+)
+
 
 class StrEnum(Enum):
     @staticmethod
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/config/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/config/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/config/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
             )
 
         return filtered_lcores
+
+
+def lcore_filter(
+    core_list: list[LogicalCore],
+    filter_specifier: LogicalCoreCount | LogicalCoreList,
+    ascending: bool,
+) -> LogicalCoreFilter:
+    if isinstance(filter_specifier, LogicalCoreList):
+        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+    elif isinstance(filter_specifier, LogicalCoreCount):
+        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+    else:
+        raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 0502284580..22a09b7e34 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,7 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
+import logging
 import sys
 
 from .config import CONFIGURATION, BuildTargetConfiguration, ExecutionConfiguration
@@ -12,7 +13,8 @@
 from .testbed_model import SutNode
 from .utils import check_dts_python_version
 
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG | logging.Logger = logging.getLogger("DTSRunner")
 result: DTSResult = DTSResult(dts_logger)
 
 
@@ -24,6 +26,10 @@ def run_all() -> None:
     global dts_logger
     global result
 
+    # create a regular DTS logger and create a new result with it
+    dts_logger = getLogger("DTSRunner")
+    result = DTSResult(dts_logger)
+
     # check the python version of the server that run dts
     check_dts_python_version()
 
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index ee221503df..17ca1459f7 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -17,7 +17,8 @@
 
 from .linux_session import LinuxSession
 from .os_session import OSSession
-from .remote import CommandResult, RemoteSession, SSHSession
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
 
 
 def create_session(
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/remote_session/linux_session.py
index a1e3bc3a92..c8ce5fe6da 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/remote_session/linux_session.py
@@ -2,8 +2,8 @@
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2023 University of New Hampshire
 
+from framework.config import LogicalCore
 from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
 from framework.utils import expand_range
 
 from .posix_session import PosixSession
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/remote_session/os_session.py
index 4c48ae2567..246f0358ea 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/remote_session/os_session.py
@@ -6,13 +6,13 @@
 from collections.abc import Iterable
 from pathlib import PurePath
 
-from framework.config import Architecture, NodeConfiguration
+from framework.config import Architecture, LogicalCore, NodeConfiguration
 from framework.logger import DTSLOG
 from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
 from framework.utils import EnvVarsDict, MesonArgs
 
-from .remote import CommandResult, RemoteSession, create_remote_session
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
 
 
 class OSSession(ABC):
@@ -173,3 +173,9 @@ def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
         if needed and mount the hugepages if needed.
         If force_first_numa is True, configure hugepages just on the first socket.
         """
+
+
+def create_remote_session(
+    node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> RemoteSession:
+    return SSHSession(node_config, name, logger)
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 8a1512210a..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-
-
-def create_remote_session(
-    node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
-    return SSHSession(node_config, name, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 71955f4581..144f9dea62 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
 import argparse
 import os
 from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Any, TypeVar
 
@@ -59,15 +59,18 @@ def __call__(
 
 @dataclass(slots=True, frozen=True)
 class _Settings:
-    config_file_path: str
-    output_dir: str
-    timeout: float
-    verbose: bool
-    skip_setup: bool
-    dpdk_tarball_path: Path
-    compile_timeout: float
-    test_cases: list
-    re_run: int
+    config_file_path: Path = Path(Path(__file__).parent.parent, "conf.yaml")
+    output_dir: str = "output"
+    timeout: float = 15
+    verbose: bool = False
+    skip_setup: bool = False
+    dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+    compile_timeout: float = 1200
+    test_cases: list[str] = field(default_factory=list)
+    re_run: int = 0
+
+
+SETTINGS: _Settings = _Settings()
 
 
 def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +84,8 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--config-file",
         action=_env_arg("DTS_CFG_FILE"),
-        default="conf.yaml",
+        default=SETTINGS.config_file_path,
+        type=Path,
         help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
         "and targets.",
     )
@@ -90,7 +94,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--output-dir",
         "--output",
         action=_env_arg("DTS_OUTPUT_DIR"),
-        default="output",
+        default=SETTINGS.output_dir,
         help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
     )
 
@@ -98,7 +102,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-t",
         "--timeout",
         action=_env_arg("DTS_TIMEOUT"),
-        default=15,
+        default=SETTINGS.timeout,
         type=float,
         help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
         "compiling DPDK.",
@@ -108,7 +112,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-v",
         "--verbose",
         action=_env_arg("DTS_VERBOSE"),
-        default="N",
+        default=SETTINGS.verbose,
         help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
         "to the console.",
     )
@@ -117,7 +121,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-s",
         "--skip-setup",
         action=_env_arg("DTS_SKIP_SETUP"),
-        default="N",
+        default=SETTINGS.skip_setup,
         help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
     )
 
@@ -125,7 +129,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--tarball",
         "--snapshot",
         action=_env_arg("DTS_DPDK_TARBALL"),
-        default="dpdk.tar.xz",
+        default=SETTINGS.dpdk_tarball_path,
         type=Path,
         help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball "
         "which will be used in testing.",
@@ -134,7 +138,7 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--compile-timeout",
         action=_env_arg("DTS_COMPILE_TIMEOUT"),
-        default=1200,
+        default=SETTINGS.compile_timeout,
         type=float,
         help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
     )
@@ -142,8 +146,9 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--test-cases",
         action=_env_arg("DTS_TESTCASES"),
-        default="",
-        help="[DTS_TESTCASES] Comma-separated list of test cases to execute. "
+        nargs="*",
+        default=SETTINGS.test_cases,
+        help="[DTS_TESTCASES] A list of test cases to execute. "
         "Unknown test cases will be silently ignored.",
     )
 
@@ -151,7 +156,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--re-run",
         "--re_run",
         action=_env_arg("DTS_RERUN"),
-        default=0,
+        default=SETTINGS.re_run,
         type=int,
         help="[DTS_RERUN] Re-run each test case the specified amount of times "
         "if a test failure occurs",
@@ -165,10 +170,11 @@ def _check_tarball_path(parsed_args: argparse.Namespace) -> None:
         raise ConfigurationError(f"DPDK tarball '{parsed_args.tarball}' doesn't exist.")
 
 
-def _get_settings() -> _Settings:
+def set_settings() -> None:
+    global SETTINGS
     parsed_args = _get_parser().parse_args()
     _check_tarball_path(parsed_args)
-    return _Settings(
+    SETTINGS = _Settings(
         config_file_path=parsed_args.config_file,
         output_dir=parsed_args.output_dir,
         timeout=parsed_args.timeout,
@@ -176,9 +182,6 @@ def _get_settings() -> _Settings:
         skip_setup=(parsed_args.skip_setup == "Y"),
         dpdk_tarball_path=parsed_args.tarball,
         compile_timeout=parsed_args.compile_timeout,
-        test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+        test_cases=parsed_args.test_cases,
         re_run=parsed_args.re_run,
     )
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index f54a947051..148f81993d 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,14 +9,6 @@
 
 # pylama:ignore=W0611
 
-from .hw import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-    VirtualDevice,
-    lcore_filter,
-)
 from .node import Node
 from .sut_node import SutNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
-    core_list: list[LogicalCore],
-    filter_specifier: LogicalCoreCount | LogicalCoreList,
-    ascending: bool,
-) -> LogicalCoreFilter:
-    if isinstance(filter_specifier, LogicalCoreList):
-        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
-    elif isinstance(filter_specifier, LogicalCoreCount):
-        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
-    else:
-        raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index d48fafe65d..90467981c3 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,19 +12,17 @@
 from framework.config import (
     BuildTargetConfiguration,
     ExecutionConfiguration,
-    NodeConfiguration,
-)
-from framework.logger import DTSLOG, getLogger
-from framework.remote_session import OSSession, create_session
-from framework.settings import SETTINGS
-
-from .hw import (
     LogicalCore,
     LogicalCoreCount,
     LogicalCoreList,
     LogicalCoreListFilter,
+    NodeConfiguration,
     lcore_filter,
 )
+from framework.logger import DTSLOG, getLogger
+from framework.remote_session import create_session
+from framework.remote_session.os_session import OSSession
+from framework.settings import SETTINGS
 
 
 class Node(object):
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 2b2b50d982..6db4a505bb 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -7,13 +7,18 @@
 import time
 from pathlib import PurePath
 
-from framework.config import BuildTargetConfiguration, NodeConfiguration
+from framework.config import (
+    BuildTargetConfiguration,
+    LogicalCoreCount,
+    LogicalCoreList,
+    NodeConfiguration,
+)
 from framework.remote_session import CommandResult, OSSession
 from framework.settings import SETTINGS
 from framework.utils import EnvVarsDict, MesonArgs
 
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
 from .node import Node
+from .virtual_device import VirtualDevice
 
 
 class SutNode(Node):
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..060ff1b19a 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,11 @@
 
 import logging
 
-from framework import dts
+from framework import dts, settings
 
 
 def main() -> None:
+    settings.set_settings()
     dts.run_all()
 
 
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 7e3d95c0cf..96c31a6c8c 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -6,12 +6,8 @@
 No other EAL parameters apart from cores are used.
 """
 
+from framework.config import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
 from framework.test_suite import TestSuite
-from framework.testbed_model import (
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-)
 
 
 class TestHelloWorld(TestSuite):
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v2 2/4] dts: add doc generation dependencies
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
  2023-05-04 12:37   ` [RFC PATCH v2 1/4] dts: code adjustments for sphinx Juraj Linkeš
@ 2023-05-04 12:37   ` Juraj Linkeš
  2023-05-04 12:37   ` [RFC PATCH v2 3/4] dts: add doc generation Juraj Linkeš
                     ` (3 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-04 12:37 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

Sphinx imports every Python module when generating documentation from
docstrings, meaning all dts dependencies, including the Python version,
must be satisfied.
By adding Sphinx to the dts dependencies, we make sure that the proper
Python version and dependencies are used when Sphinx is executed.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/poetry.lock    | 770 +++++++++++++++++++++++++++++++++++++++++----
 dts/pyproject.toml |   7 +
 2 files changed, 710 insertions(+), 67 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 0b2a007d4d..500f89dac1 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,24 +1,69 @@
+# This file is automatically @generated by Poetry and should not be changed by hand.
+
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+    {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
 [[package]]
 name = "attrs"
-version = "22.1.0"
+version = "22.2.0"
 description = "Classes Without Boilerplate"
 category = "main"
 optional = false
-python-versions = ">=3.5"
+python-versions = ">=3.6"
+files = [
+    {file = "attrs-22.2.0-py3-none-any.whl", hash = "sha256:29e95c7f6778868dbd49170f98f8818f78f3dc5e0e37c0b1f474e3561b240836"},
+    {file = "attrs-22.2.0.tar.gz", hash = "sha256:c9227bfc2f01993c03f68db37d1d15c9690188323c067c641f1a35ca58185f99"},
+]
 
 [package.extras]
-dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit", "cloudpickle"]
-docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
-tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "cloudpickle"]
-tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "cloudpickle"]
+cov = ["attrs[tests]", "coverage-enable-subprocess", "coverage[toml] (>=5.3)"]
+dev = ["attrs[docs,tests]"]
+docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope.interface"]
+tests = ["attrs[tests-no-zope]", "zope.interface"]
+tests-no-zope = ["cloudpickle", "cloudpickle", "hypothesis", "hypothesis", "mypy (>=0.971,<0.990)", "mypy (>=0.971,<0.990)", "pympler", "pympler", "pytest (>=4.3.0)", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-mypy-plugins", "pytest-xdist[psutil]", "pytest-xdist[psutil]"]
+
+[[package]]
+name = "babel"
+version = "2.12.1"
+description = "Internationalization utilities"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"},
+    {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"},
+]
 
 [[package]]
 name = "black"
-version = "22.10.0"
+version = "22.12.0"
 description = "The uncompromising code formatter."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "black-22.12.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9eedd20838bd5d75b80c9f5487dbcb06836a43833a37846cf1d8c1cc01cef59d"},
+    {file = "black-22.12.0-cp310-cp310-win_amd64.whl", hash = "sha256:159a46a4947f73387b4d83e87ea006dbb2337eab6c879620a3ba52699b1f4351"},
+    {file = "black-22.12.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d30b212bffeb1e252b31dd269dfae69dd17e06d92b87ad26e23890f3efea366f"},
+    {file = "black-22.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:7412e75863aa5c5411886804678b7d083c7c28421210180d67dfd8cf1221e1f4"},
+    {file = "black-22.12.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c116eed0efb9ff870ded8b62fe9f28dd61ef6e9ddd28d83d7d264a38417dcee2"},
+    {file = "black-22.12.0-cp37-cp37m-win_amd64.whl", hash = "sha256:1f58cbe16dfe8c12b7434e50ff889fa479072096d79f0a7f25e4ab8e94cd8350"},
+    {file = "black-22.12.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:77d86c9f3db9b1bf6761244bc0b3572a546f5fe37917a044e02f3166d5aafa7d"},
+    {file = "black-22.12.0-cp38-cp38-win_amd64.whl", hash = "sha256:82d9fe8fee3401e02e79767016b4907820a7dc28d70d137eb397b92ef3cc5bfc"},
+    {file = "black-22.12.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:101c69b23df9b44247bd88e1d7e90154336ac4992502d4197bdac35dd7ee3320"},
+    {file = "black-22.12.0-cp39-cp39-win_amd64.whl", hash = "sha256:559c7a1ba9a006226f09e4916060982fd27334ae1998e7a38b3f33a37f7a2148"},
+    {file = "black-22.12.0-py3-none-any.whl", hash = "sha256:436cc9167dd28040ad90d3b404aec22cedf24a6e4d7de221bec2730ec0c97bcf"},
+    {file = "black-22.12.0.tar.gz", hash = "sha256:229351e5a18ca30f447bf724d007f890f97e13af070bb6ad4c0a441cd7596a2f"},
+]
 
 [package.dependencies]
 click = ">=8.0.0"
@@ -33,6 +78,103 @@ d = ["aiohttp (>=3.7.4)"]
 jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
 uvloop = ["uvloop (>=0.15.2)"]
 
+[[package]]
+name = "certifi"
+version = "2022.12.7"
+description = "Python package for providing Mozilla's CA Bundle."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "certifi-2022.12.7-py3-none-any.whl", hash = "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"},
+    {file = "certifi-2022.12.7.tar.gz", hash = "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3"},
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.1.0"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+category = "dev"
+optional = false
+python-versions = ">=3.7.0"
+files = [
+    {file = "charset-normalizer-3.1.0.tar.gz", hash = "sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-win32.whl", hash = "sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-win32.whl", hash = "sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-win32.whl", hash = "sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-win32.whl", hash = "sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-win32.whl", hash = "sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b"},
+    {file = "charset_normalizer-3.1.0-py3-none-any.whl", hash = "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d"},
+]
+
 [[package]]
 name = "click"
 version = "8.1.3"
@@ -40,6 +182,10 @@ description = "Composable command line interface toolkit"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
+    {file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"},
+]
 
 [package.dependencies]
 colorama = {version = "*", markers = "platform_system == \"Windows\""}
@@ -51,20 +197,82 @@ description = "Cross-platform colored terminal text."
 category = "dev"
 optional = false
 python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
+files = [
+    {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
+    {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
+]
+
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+    {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+    {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+    {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+    {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
 
 [[package]]
 name = "isort"
-version = "5.10.1"
+version = "5.12.0"
 description = "A Python utility / library to sort Python imports."
 category = "dev"
 optional = false
-python-versions = ">=3.6.1,<4.0"
+python-versions = ">=3.8.0"
+files = [
+    {file = "isort-5.12.0-py3-none-any.whl", hash = "sha256:f84c2818376e66cf843d497486ea8fed8700b340f308f076c6fb1229dff318b6"},
+    {file = "isort-5.12.0.tar.gz", hash = "sha256:8bef7dde241278824a6d83f44a544709b065191b95b6e50894bdc722fcba0504"},
+]
 
 [package.extras]
-pipfile_deprecated_finder = ["pipreqs", "requirementslib"]
-requirements_deprecated_finder = ["pipreqs", "pip-api"]
-colors = ["colorama (>=0.4.3,<0.5.0)"]
+colors = ["colorama (>=0.4.3)"]
+pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"]
 plugins = ["setuptools"]
+requirements-deprecated-finder = ["pip-api", "pipreqs"]
+
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+    {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
 
 [[package]]
 name = "jsonpatch"
@@ -73,6 +281,10 @@ description = "Apply JSON-Patches (RFC 6902)"
 category = "main"
 optional = false
 python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "jsonpatch-1.32-py2.py3-none-any.whl", hash = "sha256:26ac385719ac9f54df8a2f0827bb8253aa3ea8ab7b3368457bcdb8c14595a397"},
+    {file = "jsonpatch-1.32.tar.gz", hash = "sha256:b6ddfe6c3db30d81a96aaeceb6baf916094ffa23d7dd5fa2c13e13f8b6e600c2"},
+]
 
 [package.dependencies]
 jsonpointer = ">=1.9"
@@ -84,14 +296,22 @@ description = "Identify specific nodes in a JSON document (RFC 6901)"
 category = "main"
 optional = false
 python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+    {file = "jsonpointer-2.3-py2.py3-none-any.whl", hash = "sha256:51801e558539b4e9cd268638c078c6c5746c9ac96bc38152d443400e4f3793e9"},
+    {file = "jsonpointer-2.3.tar.gz", hash = "sha256:97cba51526c829282218feb99dab1b1e6bdf8efd1c43dc9d57be093c0d69c99a"},
+]
 
 [[package]]
 name = "jsonschema"
-version = "4.17.0"
+version = "4.17.3"
 description = "An implementation of JSON Schema validation for Python"
 category = "main"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "jsonschema-4.17.3-py3-none-any.whl", hash = "sha256:a870ad254da1a8ca84b6a2905cac29d265f805acc57af304784962a2aa6508f6"},
+    {file = "jsonschema-4.17.3.tar.gz", hash = "sha256:0f864437ab8b6076ba6707453ef8f98a6a0d512a80e93f8abdb676f737ecb60d"},
+]
 
 [package.dependencies]
 attrs = ">=17.4.0"
@@ -101,6 +321,66 @@ pyrsistent = ">=0.14.0,<0.17.0 || >0.17.0,<0.17.1 || >0.17.1,<0.17.2 || >0.17.2"
 format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
 format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
 
+[[package]]
+name = "markupsafe"
+version = "2.1.2"
+description = "Safely add untrusted strings to HTML/XML markup."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:665a36ae6f8f20a4676b53224e33d456a6f5a72657d9c83c2aa00765072f31f7"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:340bea174e9761308703ae988e982005aedf427de816d1afe98147668cc03036"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22152d00bf4a9c7c83960521fc558f55a1adbc0631fbb00a9471e097b19d72e1"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28057e985dace2f478e042eaa15606c7efccb700797660629da387eb289b9323"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca244fa73f50a800cf8c3ebf7fd93149ec37f5cb9596aa8873ae2c1d23498601"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d9d971ec1e79906046aa3ca266de79eac42f1dbf3612a05dc9368125952bd1a1"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7e007132af78ea9df29495dbf7b5824cb71648d7133cf7848a2a5dd00d36f9ff"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7313ce6a199651c4ed9d7e4cfb4aa56fe923b1adf9af3b420ee14e6d9a73df65"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-win32.whl", hash = "sha256:c4a549890a45f57f1ebf99c067a4ad0cb423a05544accaf2b065246827ed9603"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:835fb5e38fd89328e9c81067fd642b3593c33e1e17e2fdbf77f5676abb14a156"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2ec4f2d48ae59bbb9d1f9d7efb9236ab81429a764dedca114f5fdabbc3788013"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:608e7073dfa9e38a85d38474c082d4281f4ce276ac0010224eaba11e929dd53a"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65608c35bfb8a76763f37036547f7adfd09270fbdbf96608be2bead319728fcd"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2bfb563d0211ce16b63c7cb9395d2c682a23187f54c3d79bfec33e6705473c6"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:da25303d91526aac3672ee6d49a2f3db2d9502a4a60b55519feb1a4c7714e07d"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:9cad97ab29dfc3f0249b483412c85c8ef4766d96cdf9dcf5a1e3caa3f3661cf1"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:085fd3201e7b12809f9e6e9bc1e5c96a368c8523fad5afb02afe3c051ae4afcc"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1bea30e9bf331f3fef67e0a3877b2288593c98a21ccb2cf29b74c581a4eb3af0"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-win32.whl", hash = "sha256:7df70907e00c970c60b9ef2938d894a9381f38e6b9db73c5be35e59d92e06625"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:e55e40ff0cc8cc5c07996915ad367fa47da6b3fc091fdadca7f5403239c5fec3"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a6e40afa7f45939ca356f348c8e23048e02cb109ced1eb8420961b2f40fb373a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cf877ab4ed6e302ec1d04952ca358b381a882fbd9d1b07cccbfd61783561f98a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63ba06c9941e46fa389d389644e2d8225e0e3e5ebcc4ff1ea8506dce646f8c8a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f1cd098434e83e656abf198f103a8207a8187c0fc110306691a2e94a78d0abb2"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:55f44b440d491028addb3b88f72207d71eeebfb7b5dbf0643f7c023ae1fba619"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:a6f2fcca746e8d5910e18782f976489939d54a91f9411c32051b4aab2bd7c513"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:0b462104ba25f1ac006fdab8b6a01ebbfbce9ed37fd37fd4acd70c67c973e460"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-win32.whl", hash = "sha256:7668b52e102d0ed87cb082380a7e2e1e78737ddecdde129acadb0eccc5423859"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6d6607f98fcf17e534162f0709aaad3ab7a96032723d8ac8750ffe17ae5a0666"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a806db027852538d2ad7555b203300173dd1b77ba116de92da9afbc3a3be3eed"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a4abaec6ca3ad8660690236d11bfe28dfd707778e2442b45addd2f086d6ef094"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f03a532d7dee1bed20bc4884194a16160a2de9ffc6354b3878ec9682bb623c54"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4cf06cdc1dda95223e9d2d3c58d3b178aa5dacb35ee7e3bbac10e4e1faacb419"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:22731d79ed2eb25059ae3df1dfc9cb1546691cc41f4e3130fe6bfbc3ecbbecfa"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f8ffb705ffcf5ddd0e80b65ddf7bed7ee4f5a441ea7d3419e861a12eaf41af58"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8db032bf0ce9022a8e41a22598eefc802314e81b879ae093f36ce9ddf39ab1ba"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2298c859cfc5463f1b64bd55cb3e602528db6fa0f3cfd568d3605c50678f8f03"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-win32.whl", hash = "sha256:50c42830a633fa0cf9e7d27664637532791bfc31c731a87b202d2d8ac40c3ea2"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:bb06feb762bade6bf3c8b844462274db0c76acc95c52abe8dbed28ae3d44a147"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:99625a92da8229df6d44335e6fcc558a5037dd0a760e11d84be2260e6f37002f"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8bca7e26c1dd751236cfb0c6c72d4ad61d986e9a41bbf76cb445f69488b2a2bd"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40627dcf047dadb22cd25ea7ecfe9cbf3bbbad0482ee5920b582f3809c97654f"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40dfd3fefbef579ee058f139733ac336312663c6706d1163b82b3003fb1925c4"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:090376d812fb6ac5f171e5938e82e7f2d7adc2b629101cec0db8b267815c85e2"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2e7821bffe00aa6bd07a23913b7f4e01328c3d5cc0b40b36c0bd81d362faeb65"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:c0a33bc9f02c2b17c3ea382f91b4db0e6cde90b63b296422a939886a7a80de1c"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:b8526c6d437855442cdd3d87eede9c425c4445ea011ca38d937db299382e6fa3"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-win32.whl", hash = "sha256:137678c63c977754abe9086a3ec011e8fd985ab90631145dfb9294ad09c102a7"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:0576fe974b40a400449768941d5d0858cc624e3249dfd1e0c33674e5c7ca7aed"},
+    {file = "MarkupSafe-2.1.2.tar.gz", hash = "sha256:abcabc8c2b26036d62d4c746381a6f7cf60aafcc653198ad678306986b09450d"},
+]
+
 [[package]]
 name = "mccabe"
 version = "0.7.0"
@@ -108,6 +388,10 @@ description = "McCabe checker, plugin for flake8"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"},
+    {file = "mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325"},
+]
 
 [[package]]
 name = "mypy"
@@ -116,6 +400,31 @@ description = "Optional static typing for Python"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "mypy-0.961-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:697540876638ce349b01b6786bc6094ccdaba88af446a9abb967293ce6eaa2b0"},
+    {file = "mypy-0.961-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b117650592e1782819829605a193360a08aa99f1fc23d1d71e1a75a142dc7e15"},
+    {file = "mypy-0.961-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bdd5ca340beffb8c44cb9dc26697628d1b88c6bddf5c2f6eb308c46f269bb6f3"},
+    {file = "mypy-0.961-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3e09f1f983a71d0672bbc97ae33ee3709d10c779beb613febc36805a6e28bb4e"},
+    {file = "mypy-0.961-cp310-cp310-win_amd64.whl", hash = "sha256:e999229b9f3198c0c880d5e269f9f8129c8862451ce53a011326cad38b9ccd24"},
+    {file = "mypy-0.961-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:b24be97351084b11582fef18d79004b3e4db572219deee0212078f7cf6352723"},
+    {file = "mypy-0.961-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f4a21d01fc0ba4e31d82f0fff195682e29f9401a8bdb7173891070eb260aeb3b"},
+    {file = "mypy-0.961-cp36-cp36m-win_amd64.whl", hash = "sha256:439c726a3b3da7ca84a0199a8ab444cd8896d95012c4a6c4a0d808e3147abf5d"},
+    {file = "mypy-0.961-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5a0b53747f713f490affdceef835d8f0cb7285187a6a44c33821b6d1f46ed813"},
+    {file = "mypy-0.961-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:0e9f70df36405c25cc530a86eeda1e0867863d9471fe76d1273c783df3d35c2e"},
+    {file = "mypy-0.961-cp37-cp37m-win_amd64.whl", hash = "sha256:b88f784e9e35dcaa075519096dc947a388319cb86811b6af621e3523980f1c8a"},
+    {file = "mypy-0.961-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:d5aaf1edaa7692490f72bdb9fbd941fbf2e201713523bdb3f4038be0af8846c6"},
+    {file = "mypy-0.961-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9f5f5a74085d9a81a1f9c78081d60a0040c3efb3f28e5c9912b900adf59a16e6"},
+    {file = "mypy-0.961-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f4b794db44168a4fc886e3450201365c9526a522c46ba089b55e1f11c163750d"},
+    {file = "mypy-0.961-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:64759a273d590040a592e0f4186539858c948302c653c2eac840c7a3cd29e51b"},
+    {file = "mypy-0.961-cp38-cp38-win_amd64.whl", hash = "sha256:63e85a03770ebf403291ec50097954cc5caf2a9205c888ce3a61bd3f82e17569"},
+    {file = "mypy-0.961-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5f1332964963d4832a94bebc10f13d3279be3ce8f6c64da563d6ee6e2eeda932"},
+    {file = "mypy-0.961-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:006be38474216b833eca29ff6b73e143386f352e10e9c2fbe76aa8549e5554f5"},
+    {file = "mypy-0.961-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9940e6916ed9371809b35b2154baf1f684acba935cd09928952310fbddaba648"},
+    {file = "mypy-0.961-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:a5ea0875a049de1b63b972456542f04643daf320d27dc592d7c3d9cd5d9bf950"},
+    {file = "mypy-0.961-cp39-cp39-win_amd64.whl", hash = "sha256:1ece702f29270ec6af25db8cf6185c04c02311c6bb21a69f423d40e527b75c56"},
+    {file = "mypy-0.961-py3-none-any.whl", hash = "sha256:03c6cc893e7563e7b2949b969e63f02c000b32502a1b4d1314cabe391aa87d66"},
+    {file = "mypy-0.961.tar.gz", hash = "sha256:f730d56cb924d371c26b8eaddeea3cc07d78ff51c521c6d04899ac6904b75492"},
+]
 
 [package.dependencies]
 mypy-extensions = ">=0.4.3"
@@ -129,19 +438,39 @@ reports = ["lxml"]
 
 [[package]]
 name = "mypy-extensions"
-version = "0.4.3"
-description = "Experimental type system extensions for programs checked with the mypy typechecker."
+version = "1.0.0"
+description = "Type system extensions for programs checked with the mypy type checker."
 category = "dev"
 optional = false
-python-versions = "*"
+python-versions = ">=3.5"
+files = [
+    {file = "mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d"},
+    {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
+]
+
+[[package]]
+name = "packaging"
+version = "23.0"
+description = "Core utilities for Python packages"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "packaging-23.0-py3-none-any.whl", hash = "sha256:714ac14496c3e68c99c29b00845f7a2b85f3bb6f1078fd9f72fd20f0570002b2"},
+    {file = "packaging-23.0.tar.gz", hash = "sha256:b6ad297f8907de0fa2fe1ccbd26fdaf387f5f47c7275fedf8cce89f99446cf97"},
+]
 
 [[package]]
 name = "pathspec"
-version = "0.10.1"
+version = "0.11.1"
 description = "Utility library for gitignore style pattern matching of file paths."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pathspec-0.11.1-py3-none-any.whl", hash = "sha256:d8af70af76652554bd134c22b3e8a1cc46ed7d91edcdd721ef1a0c51a84a5293"},
+    {file = "pathspec-0.11.1.tar.gz", hash = "sha256:2798de800fa92780e33acca925945e9a19a133b715067cf165b8866c15a31687"},
+]
 
 [[package]]
 name = "pexpect"
@@ -150,21 +479,29 @@ description = "Pexpect allows easy control of interactive console applications."
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
+    {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
+]
 
 [package.dependencies]
 ptyprocess = ">=0.5"
 
 [[package]]
 name = "platformdirs"
-version = "2.5.2"
-description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
+version = "3.1.1"
+description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "platformdirs-3.1.1-py3-none-any.whl", hash = "sha256:e5986afb596e4bb5bde29a79ac9061aa955b94fca2399b7aaac4090860920dd8"},
+    {file = "platformdirs-3.1.1.tar.gz", hash = "sha256:024996549ee88ec1a9aa99ff7f8fc819bb59e2c3477b410d90a16d32d6e707aa"},
+]
 
 [package.extras]
-docs = ["furo (>=2021.7.5b38)", "proselint (>=0.10.2)", "sphinx-autodoc-typehints (>=1.12)", "sphinx (>=4)"]
-test = ["appdirs (==1.4.4)", "pytest-cov (>=2.7)", "pytest-mock (>=3.6)", "pytest (>=6)"]
+docs = ["furo (>=2022.12.7)", "proselint (>=0.13)", "sphinx (>=6.1.3)", "sphinx-autodoc-typehints (>=1.22,!=1.23.4)"]
+test = ["appdirs (==1.4.4)", "covdefaults (>=2.2.2)", "pytest (>=7.2.1)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"]
 
 [[package]]
 name = "ptyprocess"
@@ -173,28 +510,40 @@ description = "Run a subprocess in a pseudo terminal"
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35"},
+    {file = "ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220"},
+]
 
 [[package]]
 name = "pycodestyle"
-version = "2.9.1"
+version = "2.10.0"
 description = "Python style guide checker"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pycodestyle-2.10.0-py2.py3-none-any.whl", hash = "sha256:8a4eaf0d0495c7395bdab3589ac2db602797d76207242c17d470186815706610"},
+    {file = "pycodestyle-2.10.0.tar.gz", hash = "sha256:347187bdb476329d98f695c213d7295a846d1152ff4fe9bacb8a9590b8ee7053"},
+]
 
 [[package]]
 name = "pydocstyle"
-version = "6.1.1"
+version = "6.3.0"
 description = "Python docstring style checker"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
+    {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+]
 
 [package.dependencies]
-snowballstemmer = "*"
+snowballstemmer = ">=2.2.0"
 
 [package.extras]
-toml = ["toml"]
+toml = ["tomli (>=1.2.3)"]
 
 [[package]]
 name = "pyflakes"
@@ -203,6 +552,25 @@ description = "passive checker of Python programs"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pyflakes-2.5.0-py2.py3-none-any.whl", hash = "sha256:4579f67d887f804e67edb544428f264b7b24f435b263c4614f384135cea553d2"},
+    {file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
+]
+
+[[package]]
+name = "pygments"
+version = "2.14.0"
+description = "Pygments is a syntax highlighting package written in Python."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "Pygments-2.14.0-py3-none-any.whl", hash = "sha256:fa7bd7bd2771287c0de303af8bfdfc731f51bd2c6a47ab69d117138893b82717"},
+    {file = "Pygments-2.14.0.tar.gz", hash = "sha256:b3ed06a9e8ac9a9aae5a6f5dbe78a8a58655d17b43b93c078f094ddc476ae297"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
 
 [[package]]
 name = "pylama"
@@ -211,6 +579,10 @@ description = "Code audit tool for python"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pylama-8.4.1-py3-none-any.whl", hash = "sha256:5bbdbf5b620aba7206d688ed9fc917ecd3d73e15ec1a89647037a09fa3a86e60"},
+    {file = "pylama-8.4.1.tar.gz", hash = "sha256:2d4f7aecfb5b7466216d48610c7d6bad1c3990c29cdd392ad08259b161e486f6"},
+]
 
 [package.dependencies]
 mccabe = ">=0.7.0"
@@ -219,22 +591,51 @@ pydocstyle = ">=6.1.1"
 pyflakes = ">=2.5.0"
 
 [package.extras]
-all = ["pylint", "eradicate", "radon", "mypy", "vulture"]
+all = ["eradicate", "mypy", "pylint", "radon", "vulture"]
 eradicate = ["eradicate"]
 mypy = ["mypy"]
 pylint = ["pylint"]
 radon = ["radon"]
-tests = ["pytest (>=7.1.2)", "pytest-mypy", "eradicate (>=2.0.0)", "radon (>=5.1.0)", "mypy", "pylint (>=2.11.1)", "pylama-quotes", "toml", "vulture", "types-setuptools", "types-toml"]
+tests = ["eradicate (>=2.0.0)", "mypy", "pylama-quotes", "pylint (>=2.11.1)", "pytest (>=7.1.2)", "pytest-mypy", "radon (>=5.1.0)", "toml", "types-setuptools", "types-toml", "vulture"]
 toml = ["toml (>=0.10.2)"]
 vulture = ["vulture"]
 
 [[package]]
 name = "pyrsistent"
-version = "0.19.1"
+version = "0.19.3"
 description = "Persistent/Functional/Immutable data structures"
 category = "main"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pyrsistent-0.19.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:20460ac0ea439a3e79caa1dbd560344b64ed75e85d8703943e0b66c2a6150e4a"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4c18264cb84b5e68e7085a43723f9e4c1fd1d935ab240ce02c0324a8e01ccb64"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b774f9288dda8d425adb6544e5903f1fb6c273ab3128a355c6b972b7df39dcf"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-win32.whl", hash = "sha256:5a474fb80f5e0d6c9394d8db0fc19e90fa540b82ee52dba7d246a7791712f74a"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-win_amd64.whl", hash = "sha256:49c32f216c17148695ca0e02a5c521e28a4ee6c5089f97e34fe24163113722da"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f0774bf48631f3a20471dd7c5989657b639fd2d285b861237ea9e82c36a415a9"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ab2204234c0ecd8b9368dbd6a53e83c3d4f3cab10ecaf6d0e772f456c442393"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e42296a09e83028b3476f7073fcb69ffebac0e66dbbfd1bd847d61f74db30f19"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-win32.whl", hash = "sha256:64220c429e42a7150f4bfd280f6f4bb2850f95956bde93c6fda1b70507af6ef3"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-win_amd64.whl", hash = "sha256:016ad1afadf318eb7911baa24b049909f7f3bb2c5b1ed7b6a8f21db21ea3faa8"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c4db1bd596fefd66b296a3d5d943c94f4fac5bcd13e99bffe2ba6a759d959a28"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aeda827381f5e5d65cced3024126529ddc4289d944f75e090572c77ceb19adbf"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:42ac0b2f44607eb92ae88609eda931a4f0dfa03038c44c772e07f43e738bcac9"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-win32.whl", hash = "sha256:e8f2b814a3dc6225964fa03d8582c6e0b6650d68a232df41e3cc1b66a5d2f8d1"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-win_amd64.whl", hash = "sha256:c9bb60a40a0ab9aba40a59f68214eed5a29c6274c83b2cc206a359c4a89fa41b"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a2471f3f8693101975b1ff85ffd19bb7ca7dd7c38f8a81701f67d6b4f97b87d8"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc5d149f31706762c1f8bda2e8c4f8fead6e80312e3692619a75301d3dbb819a"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3311cb4237a341aa52ab8448c27e3a9931e2ee09561ad150ba94e4cfd3fc888c"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-win32.whl", hash = "sha256:f0e7c4b2f77593871e918be000b96c8107da48444d57005b6a6bc61fb4331b2c"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-win_amd64.whl", hash = "sha256:c147257a92374fde8498491f53ffa8f4822cd70c0d85037e09028e478cababb7"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b735e538f74ec31378f5a1e3886a26d2ca6351106b4dfde376a26fc32a044edc"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99abb85579e2165bd8522f0c0138864da97847875ecbd45f3e7e2af569bfc6f2"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3a8cb235fa6d3fd7aae6a4f1429bbb1fec1577d978098da1252f0489937786f3"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-win32.whl", hash = "sha256:c74bed51f9b41c48366a286395c67f4e894374306b197e62810e0fdaf2364da2"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-win_amd64.whl", hash = "sha256:878433581fc23e906d947a6814336eee031a00e6defba224234169ae3d3d6a98"},
+    {file = "pyrsistent-0.19.3-py3-none-any.whl", hash = "sha256:ccf0d6bd208f8111179f0c26fdf84ed7c3891982f2edaeae7422575f47e66b64"},
+    {file = "pyrsistent-0.19.3.tar.gz", hash = "sha256:1a2994773706bbb4995c31a97bc94f1418314923bd1048c6d964837040376440"},
+]
 
 [[package]]
 name = "pyyaml"
@@ -243,6 +644,70 @@ description = "YAML parser and emitter for Python"
 category = "main"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "PyYAML-6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4db7c7aef085872ef65a8fd7d6d09a14ae91f691dec3e87ee5ee0539d516f53"},
+    {file = "PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9df7ed3b3d2e0ecfe09e14741b857df43adb5a3ddadc919a2d94fbdf78fea53c"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77f396e6ef4c73fdc33a9157446466f1cff553d979bd00ecb64385760c6babdc"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a80a78046a72361de73f8f395f1f1e49f956c6be882eed58505a15f3e430962b"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f84fbc98b019fef2ee9a1cb3ce93e3187a6df0b2538a651bfb890254ba9f90b5"},
+    {file = "PyYAML-6.0-cp310-cp310-win32.whl", hash = "sha256:2cd5df3de48857ed0544b34e2d40e9fac445930039f3cfe4bcc592a1f836d513"},
+    {file = "PyYAML-6.0-cp310-cp310-win_amd64.whl", hash = "sha256:daf496c58a8c52083df09b80c860005194014c3698698d1a57cbcfa182142a3a"},
+    {file = "PyYAML-6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d4b0ba9512519522b118090257be113b9468d804b19d63c71dbcf4a48fa32358"},
+    {file = "PyYAML-6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:81957921f441d50af23654aa6c5e5eaf9b06aba7f0a19c18a538dc7ef291c5a1"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:afa17f5bc4d1b10afd4466fd3a44dc0e245382deca5b3c353d8b757f9e3ecb8d"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dbad0e9d368bb989f4515da330b88a057617d16b6a8245084f1b05400f24609f"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:432557aa2c09802be39460360ddffd48156e30721f5e8d917f01d31694216782"},
+    {file = "PyYAML-6.0-cp311-cp311-win32.whl", hash = "sha256:bfaef573a63ba8923503d27530362590ff4f576c626d86a9fed95822a8255fd7"},
+    {file = "PyYAML-6.0-cp311-cp311-win_amd64.whl", hash = "sha256:01b45c0191e6d66c470b6cf1b9531a771a83c1c4208272ead47a3ae4f2f603bf"},
+    {file = "PyYAML-6.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:897b80890765f037df3403d22bab41627ca8811ae55e9a722fd0392850ec4d86"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50602afada6d6cbfad699b0c7bb50d5ccffa7e46a3d738092afddc1f9758427f"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:48c346915c114f5fdb3ead70312bd042a953a8ce5c7106d5bfb1a5254e47da92"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:98c4d36e99714e55cfbaaee6dd5badbc9a1ec339ebfc3b1f52e293aee6bb71a4"},
+    {file = "PyYAML-6.0-cp36-cp36m-win32.whl", hash = "sha256:0283c35a6a9fbf047493e3a0ce8d79ef5030852c51e9d911a27badfde0605293"},
+    {file = "PyYAML-6.0-cp36-cp36m-win_amd64.whl", hash = "sha256:07751360502caac1c067a8132d150cf3d61339af5691fe9e87803040dbc5db57"},
+    {file = "PyYAML-6.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:819b3830a1543db06c4d4b865e70ded25be52a2e0631ccd2f6a47a2822f2fd7c"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:473f9edb243cb1935ab5a084eb238d842fb8f404ed2193a915d1784b5a6b5fc0"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0ce82d761c532fe4ec3f87fc45688bdd3a4c1dc5e0b4a19814b9009a29baefd4"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:231710d57adfd809ef5d34183b8ed1eeae3f76459c18fb4a0b373ad56bedcdd9"},
+    {file = "PyYAML-6.0-cp37-cp37m-win32.whl", hash = "sha256:c5687b8d43cf58545ade1fe3e055f70eac7a5a1a0bf42824308d868289a95737"},
+    {file = "PyYAML-6.0-cp37-cp37m-win_amd64.whl", hash = "sha256:d15a181d1ecd0d4270dc32edb46f7cb7733c7c508857278d3d378d14d606db2d"},
+    {file = "PyYAML-6.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0b4624f379dab24d3725ffde76559cff63d9ec94e1736b556dacdfebe5ab6d4b"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:213c60cd50106436cc818accf5baa1aba61c0189ff610f64f4a3e8c6726218ba"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9fa600030013c4de8165339db93d182b9431076eb98eb40ee068700c9c813e34"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:277a0ef2981ca40581a47093e9e2d13b3f1fbbeffae064c1d21bfceba2030287"},
+    {file = "PyYAML-6.0-cp38-cp38-win32.whl", hash = "sha256:d4eccecf9adf6fbcc6861a38015c2a64f38b9d94838ac1810a9023a0609e1b78"},
+    {file = "PyYAML-6.0-cp38-cp38-win_amd64.whl", hash = "sha256:1e4747bc279b4f613a09eb64bba2ba602d8a6664c6ce6396a4d0cd413a50ce07"},
+    {file = "PyYAML-6.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:055d937d65826939cb044fc8c9b08889e8c743fdc6a32b33e2390f66013e449b"},
+    {file = "PyYAML-6.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e61ceaab6f49fb8bdfaa0f92c4b57bcfbea54c09277b1b4f7ac376bfb7a7c174"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d67d839ede4ed1b28a4e8909735fc992a923cdb84e618544973d7dfc71540803"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cba8c411ef271aa037d7357a2bc8f9ee8b58b9965831d9e51baf703280dc73d3"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:40527857252b61eacd1d9af500c3337ba8deb8fc298940291486c465c8b46ec0"},
+    {file = "PyYAML-6.0-cp39-cp39-win32.whl", hash = "sha256:b5b9eccad747aabaaffbc6064800670f0c297e52c12754eb1d976c57e4f74dcb"},
+    {file = "PyYAML-6.0-cp39-cp39-win_amd64.whl", hash = "sha256:b3d267842bf12586ba6c734f89d1f5b871df0273157918b0ccefa29deb05c21c"},
+    {file = "PyYAML-6.0.tar.gz", hash = "sha256:68fb519c14306fec9720a2a5b45bc9f0c8d1b9c72adf45c37baedfcd949c35a2"},
+]
+
+[[package]]
+name = "requests"
+version = "2.28.2"
+description = "Python HTTP for Humans."
+category = "dev"
+optional = false
+python-versions = ">=3.7, <4"
+files = [
+    {file = "requests-2.28.2-py3-none-any.whl", hash = "sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa"},
+    {file = "requests-2.28.2.tar.gz", hash = "sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<1.27"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
 [[package]]
 name = "snowballstemmer"
@@ -251,6 +716,175 @@ description = "This package provides 29 stemmers for 28 languages generated from
 category = "dev"
 optional = false
 python-versions = "*"
+files = [
+    {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"},
+    {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
+]
+
+[[package]]
+name = "sphinx"
+version = "6.1.3"
+description = "Python documentation generator"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "Sphinx-6.1.3.tar.gz", hash = "sha256:0dac3b698538ffef41716cf97ba26c1c7788dba73ce6f150c1ff5b4720786dd2"},
+    {file = "sphinx-6.1.3-py3-none-any.whl", hash = "sha256:807d1cb3d6be87eb78a381c3e70ebd8d346b9a25f3753e9947e866b2786865fc"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.0"
+description = "Read the Docs theme for Sphinx"
+category = "dev"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+    {file = "sphinx_rtd_theme-1.2.0-py2.py3-none-any.whl", hash = "sha256:f823f7e71890abe0ac6aaa6013361ea2696fc8d3e1fa798f463e82bdb77eeff2"},
+    {file = "sphinx_rtd_theme-1.2.0.tar.gz", hash = "sha256:a0d8bd1a2ed52e0b338cbe19c4b2eef3c5e7a048769753dac6a9f059c7b641b8"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = {version = ">=2.0.0,<3.0.0 || >3.0.0", markers = "python_version > \"3\""}
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.4"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-applehelp-1.0.4.tar.gz", hash = "sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e"},
+    {file = "sphinxcontrib_applehelp-1.0.4-py3-none-any.whl", hash = "sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.2"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"},
+    {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.1"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-htmlhelp-2.0.1.tar.gz", hash = "sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff"},
+    {file = "sphinxcontrib_htmlhelp-2.0.1-py3-none-any.whl", hash = "sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+category = "dev"
+optional = false
+python-versions = ">=2.7"
+files = [
+    {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+    {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+    {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.3"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"},
+    {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.5"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"},
+    {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
 
 [[package]]
 name = "toml"
@@ -259,6 +893,10 @@ description = "Python Library for Tom's Obvious, Minimal Language"
 category = "dev"
 optional = false
 python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
+files = [
+    {file = "toml-0.10.2-py2.py3-none-any.whl", hash = "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b"},
+    {file = "toml-0.10.2.tar.gz", hash = "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"},
+]
 
 [[package]]
 name = "tomli"
@@ -267,22 +905,51 @@ description = "A lil' TOML parser"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"},
+    {file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"},
+]
 
 [[package]]
 name = "types-pyyaml"
-version = "6.0.12.1"
+version = "6.0.12.8"
 description = "Typing stubs for PyYAML"
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "types-PyYAML-6.0.12.8.tar.gz", hash = "sha256:19304869a89d49af00be681e7b267414df213f4eb89634c4495fa62e8f942b9f"},
+    {file = "types_PyYAML-6.0.12.8-py3-none-any.whl", hash = "sha256:5314a4b2580999b2ea06b2e5f9a7763d860d6e09cdf21c0e9561daa9cbd60178"},
+]
 
 [[package]]
 name = "typing-extensions"
-version = "4.4.0"
+version = "4.5.0"
 description = "Backported and Experimental Type Hints for Python 3.7+"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "typing_extensions-4.5.0-py3-none-any.whl", hash = "sha256:fb33085c39dd998ac16d1431ebc293a8b3eedd00fd4a32de0ff79002c19511b4"},
+    {file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"},
+]
+
+[[package]]
+name = "urllib3"
+version = "1.26.15"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
+files = [
+    {file = "urllib3-1.26.15-py2.py3-none-any.whl", hash = "sha256:aa751d169e23c7479ce47a0cb0da579e3ede798f994f5816a74e4f4500dcea42"},
+    {file = "urllib3-1.26.15.tar.gz", hash = "sha256:8a388717b9476f934a21484e8c8e61875ab60644d29b9b39e11e4b9dc1c6b305"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)", "brotlipy (>=0.6.0)"]
+secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)", "urllib3-secure-extra"]
+socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
 
 [[package]]
 name = "warlock"
@@ -291,47 +958,16 @@ description = "Python object model built on JSON schema and JSON patch."
 category = "main"
 optional = false
 python-versions = ">=3.7,<4.0"
+files = [
+    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
+    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
+]
 
 [package.dependencies]
 jsonpatch = ">=1,<2"
 jsonschema = ">=4,<5"
 
 [metadata]
-lock-version = "1.1"
+lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "a0f040b07fc6ce4deb0be078b9a88c2a465cb6bccb9e260a67e92c2403e2319f"
-
-[metadata.files]
-attrs = []
-black = []
-click = []
-colorama = []
-isort = []
-jsonpatch = []
-jsonpointer = []
-jsonschema = []
-mccabe = []
-mypy = []
-mypy-extensions = []
-pathspec = []
-pexpect = [
-    {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
-    {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
-]
-platformdirs = [
-    {file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
-    {file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
-]
-ptyprocess = []
-pycodestyle = []
-pydocstyle = []
-pyflakes = []
-pylama = []
-pyrsistent = []
-pyyaml = []
-snowballstemmer = []
-toml = []
-tomli = []
-types-pyyaml = []
-typing-extensions = []
-warlock = []
+content-hash = "b3f428e987713d7875434c4b43cadadcb7d77dd3d62fd6855fb8e77ec946f082"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index a136c91e5e..c0fe323272 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -22,6 +22,13 @@ pylama = "^8.4.1"
 pyflakes = "2.5.0"
 toml = "^0.10.2"
 
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+Sphinx = "^6.1.3"
+sphinx-rtd-theme = "^1.2.0"
+
 [tool.poetry.scripts]
 dts = "main:main"
 
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
  2023-05-04 12:37   ` [RFC PATCH v2 1/4] dts: code adjustments for sphinx Juraj Linkeš
  2023-05-04 12:37   ` [RFC PATCH v2 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2023-05-04 12:37   ` Juraj Linkeš
  2023-05-04 12:45     ` Bruce Richardson
  2023-05-05 10:56     ` Bruce Richardson
  2023-05-04 12:37   ` [RFC PATCH v2 4/4] dts: format docstrings to google format Juraj Linkeš
                     ` (2 subsequent siblings)
  5 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-04 12:37 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

The tool used to generate developer docs is sphinx, which is already
used in DPDK. The configuration is kept the same to preserve the style.

Sphinx generates the documentation from Python docstrings. The docstring
format most suitable for DTS seems to be the Google format [0] which
requires the sphinx.ext.napoleon extension.

There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
  code.
* The same Python packages as DTS, for the same reason.

[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
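
As a sketch of the Google docstring format that sphinx.ext.napoleon parses,
a hypothetical helper (not part of this patch; names are illustrative only)
might be documented like this:

```python
def filter_range(values: list[int], low: int, high: int) -> list[int]:
    """Return the values that fall within the given bounds.

    Args:
        values: The integers to filter.
        low: The inclusive lower bound.
        high: The inclusive upper bound.

    Returns:
        The subset of values where low <= value <= high.
    """
    # Napoleon turns the Args/Returns sections into reST field lists,
    # which Sphinx then renders in the generated API docs.
    return [v for v in values if low <= v <= high]
```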

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 doc/api/meson.build      |  1 +
 doc/guides/conf.py       | 22 ++++++++++++++----
 doc/guides/meson.build   |  1 +
 doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
 dts/doc/doc-index.rst    | 20 ++++++++++++++++
 dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
 dts/meson.build          | 16 +++++++++++++
 meson.build              |  1 +
 meson_options.txt        |  2 ++
 9 files changed, 137 insertions(+), 5 deletions(-)
 create mode 100644 dts/doc/doc-index.rst
 create mode 100644 dts/doc/meson.build
 create mode 100644 dts/meson.build

diff --git a/doc/api/meson.build b/doc/api/meson.build
index 2876a78a7e..1f0c725a94 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
 
+doc_api_build_dir = meson.current_build_dir()
 doxygen = find_program('doxygen', required: get_option('enable_docs'))
 
 if not doxygen.found()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index a55ce38800..04c842b67a 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
 from sphinx import __version__ as sphinx_version
 from os import listdir
 from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
 from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
 
 import configparser
 
@@ -24,6 +23,19 @@
           file=stderr)
     pass
 
+extensions = ['sphinx.ext.napoleon']
+
+# Python docstring options
+autodoc_member_order = 'bysource'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_use_ivar = True
+napoleon_use_rtype = False
+add_module_names = False
+toc_object_entries_show_parents = 'hide'
+
 stop_on_error = ('-W' in argv)
 
 project = 'Data Plane Development Kit'
@@ -35,8 +47,8 @@
 html_show_copyright = False
 highlight_language = 'none'
 
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+path.append(environ.setdefault('DTS_ROOT', '.'))
+version = environ.setdefault('DPDK_VERSION', "None")
 
 master_doc = 'index'
 
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Intel Corporation
 
+doc_guides_source_dir = meson.current_source_dir()
 sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
 
 if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index ebd6dceb6a..a547da2017 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -282,3 +282,32 @@ There are three tools used in DTS to help with code checking, style and formatti
 These three tools are all used in ``devtools/dts-check-format.sh``,
 the DTS code check and format script.
 Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+   poetry install --with docs
+   poetry shell
+
+
+Build commands
+~~~~~~~~~~~~~~
+
+The documentation is built using the standard DPDK build system.
+
+After entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+   ninja -C build dts/doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+   Make sure to fix any Sphinx warnings when adding or updating docstrings.
diff --git a/dts/doc/doc-index.rst b/dts/doc/doc-index.rst
new file mode 100644
index 0000000000..10151c6851
--- /dev/null
+++ b/dts/doc/doc-index.rst
@@ -0,0 +1,20 @@
+.. DPDK Test Suite documentation master file, created by
+   sphinx-quickstart on Tue Mar 14 12:23:52 2023.
+   You can adapt this file completely to your liking, but it should at least
+   contain the root `toctree` directive.
+
+Welcome to DPDK Test Suite's documentation!
+===========================================
+
+.. toctree::
+   :maxdepth: 4
+   :caption: Contents:
+
+   modules
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..db2bb0bed9
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,50 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
+sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
+
+if not sphinx.found() or not sphinx_apidoc.found()
+    subdir_done()
+endif
+
+dts_api_framework_dir = join_paths(dts_dir, 'framework')
+dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+dts_api_src = custom_target('dts_api_src',
+        output: 'modules.rst',
+        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
+            sphinx_apidoc, '--append-syspath', '--force',
+            '--module-first', '--separate',
+            '--doc-project', 'DTS', '-V', meson.project_version(),
+            '-o', dts_api_build_dir,
+            dts_api_framework_dir],
+        build_by_default: get_option('enable_dts_docs'))
+doc_targets += dts_api_src
+doc_target_names += 'DTS_API_sphinx_sources'
+
+cp = find_program('cp', required: get_option('enable_docs'))
+cp_index = custom_target('cp_index',
+        input: 'doc-index.rst',
+        output: 'index.rst',
+        depends: dts_api_src,
+        command: [cp, '@INPUT@', join_paths(dts_api_build_dir, 'index.rst')],
+        build_by_default: get_option('enable_dts_docs'))
+doc_targets += cp_index
+doc_target_names += 'DTS_API_sphinx_index'
+
+extra_sphinx_args = ['-a', '-c', doc_guides_source_dir]
+if get_option('werror')
+    extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
+dts_api_html = custom_target('dts_api_html',
+        output: 'html',
+        depends: cp_index,
+        command: ['DTS_ROOT=@0@'.format(dts_dir),
+            sphinx_wrapper, sphinx, meson.project_version(),
+            dts_api_build_dir, dts_api_build_dir, extra_sphinx_args],
+        build_by_default: get_option('enable_dts_docs'),
+        install: get_option('enable_dts_docs'),
+        install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..17bda07636
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+    message = 'No docs targets found'
+else
+    message = 'Built docs:'
+endif
+run_target('dts/doc', command: [echo, message, doc_target_names],
+    depends: doc_targets)
diff --git a/meson.build b/meson.build
index f91d652bc5..7820f334bb 100644
--- a/meson.build
+++ b/meson.build
@@ -84,6 +84,7 @@ subdir('app')
 
 # build docs
 subdir('doc')
+subdir('dts')
 
 # build any examples explicitly requested - useful for developers - and
 # install any example code into the appropriate install path
diff --git a/meson_options.txt b/meson_options.txt
index 82c8297065..267f1b3ef7 100644
--- a/meson_options.txt
+++ b/meson_options.txt
@@ -16,6 +16,8 @@ option('drivers_install_subdir', type: 'string', value: 'dpdk/pmds-<VERSION>', d
        'Subdirectory of libdir where to install PMDs. Defaults to using a versioned subdirectory.')
 option('enable_docs', type: 'boolean', value: false, description:
        'build documentation')
+option('enable_dts_docs', type: 'boolean', value: false, description:
+       'Build DTS API documentation.')
 option('enable_apps', type: 'string', value: '', description:
        'Comma-separated list of apps to build. If unspecified, build all apps.')
 option('enable_drivers', type: 'string', value: '', description:
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v2 4/4] dts: format docstrings to google format
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
                     ` (2 preceding siblings ...)
  2023-05-04 12:37   ` [RFC PATCH v2 3/4] dts: add doc generation Juraj Linkeš
@ 2023-05-04 12:37   ` Juraj Linkeš
  2023-05-05 14:06   ` [RFC PATCH v2 0/4] dts: add dts api docs Bruce Richardson
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-04 12:37 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

WIP: only one module is reformatted to serve as a demonstration.

The google format is documented here [0].

[0]: https://google.github.io/styleguide/pyguide.html
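
The module docstring reworked in this patch mentions a Node.skip_setup
decorator that skips method execution on the user's request. A minimal
standalone sketch of that pattern (the DTS_SKIP_SETUP variable and the
function names here are hypothetical, not the actual DTS implementation,
which consults the parsed configuration instead):

```python
import os
from typing import Any, Callable


def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
    """Skip the decorated method when the user asked to skip setup."""

    def wrapper(*args: Any, **kwargs: Any) -> Any:
        # Checked at call time; a set variable means setup is skipped.
        if os.environ.get("DTS_SKIP_SETUP"):
            return None
        return func(*args, **kwargs)

    return wrapper


@skip_setup
def set_up_node() -> str:
    return "node configured"
```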

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/testbed_model/node.py | 152 +++++++++++++++++++---------
 1 file changed, 103 insertions(+), 49 deletions(-)

diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 90467981c3..ad8ef442af 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+There's a base class, Node, that's supposed to be extended by other classes
+with functionality specific to that node type.
+The only part that can be used standalone is the Node.skip_setup static method,
+which is a decorator used to skip method execution
+if skip_setup is passed by the user on the cmdline or in an env variable.
 """
 
 from typing import Any, Callable
@@ -26,10 +31,25 @@
 
 
 class Node(object):
-    """
-    Basic class for node management. This class implements methods that
-    manage a node, such as information gathering (of CPU/PCI/NIC) and
-    environment setup.
+    """The base class for node management.
+
+    It shouldn't be instantiated, but rather extended.
+    It implements common methods to manage any node:
+
+       * connection to the node
+       * CPU information gathering
+       * hugepages setup
+
+    Arguments:
+        node_config: The config from the input configuration file.
+
+    Attributes:
+        main_session: The primary OS-agnostic remote session used
+            to communicate with the node.
+        config: The configuration used to create the node.
+        name: The name of the node.
+        lcores: The list of logical cores that DTS can use on the node.
+            It's derived from logical cores present on the node and user configuration.
     """
 
     main_session: OSSession
@@ -56,65 +76,89 @@ def __init__(self, node_config: NodeConfiguration):
         self._logger.info(f"Created node: {self.name}")
 
     def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        Perform the execution setup that will be done for each execution
-        this node is part of.
+        """Execution setup steps.
+
+        Configure hugepages and call self._set_up_execution where
+        the rest of the configuration steps (if any) are implemented.
+
+        Args:
+            execution_config: The execution configuration according to which
+                the setup steps will be taken.
         """
         self._setup_hugepages()
         self._set_up_execution(execution_config)
 
     def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution setup steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution setup steps.
         """
 
     def tear_down_execution(self) -> None:
-        """
-        Perform the execution teardown that will be done after each execution
-        this node is part of concludes.
+        """Execution teardown steps.
+
+        There are currently no execution teardown steps
+        common to all DTS node types.
         """
         self._tear_down_execution()
 
     def _tear_down_execution(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution teardown steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution teardown steps.
         """
 
     def set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        Perform the build target setup that will be done for each build target
-        tested on this node.
+        """Build target setup steps.
+
+        There are currently no build target setup steps
+        common to all DTS node types.
+
+        Args:
+            build_target_config: The build target configuration according to which
+                the setup steps will be taken.
         """
         self._set_up_build_target(build_target_config)
 
     def _set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target setup steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional build target setup steps.
         """
 
     def tear_down_build_target(self) -> None:
-        """
-        Perform the build target teardown that will be done after each build target
-        tested on this node.
+        """Build target teardown steps.
+
+        There are currently no build target teardown steps
+        common to all DTS node types.
         """
         self._tear_down_build_target()
 
     def _tear_down_build_target(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target teardown steps for derived classes.
+
+        Derived classes should override this method
+        if they want to add additional build target teardown steps.
         """
 
     def create_session(self, name: str) -> OSSession:
-        """
-        Create and return a new OSSession tailored to the remote OS.
+        """Create and return a new OS-agnostic remote session.
+
+        The returned session won't be used by the node creating it.
+        It will be cleaned up automatically when the node is closed.
+
+        Args:
+            name: The name of the session.
+
+        Returns:
+            A new OS-agnostic remote session.
         """
         session_name = f"{self.name} {name}"
         connection = create_session(
@@ -130,14 +174,24 @@ def filter_lcores(
         filter_specifier: LogicalCoreCount | LogicalCoreList,
         ascending: bool = True,
     ) -> list[LogicalCore]:
-        """
-        Filter the LogicalCores found on the Node according to
-        a LogicalCoreCount or a LogicalCoreList.
+        """Filter the node's logical cores that DTS can use.
 
-        If ascending is True, use cores with the lowest numerical id first
-        and continue in ascending order. If False, start with the highest
-        id and continue in descending order. This ordering affects which
-        sockets to consider first as well.
+        Logical cores that DTS can use are those present on the node
+        and allowed by the user configuration.
+        The filter_specifier then narrows down that set of logical cores.
+
+        Args:
+            filter_specifier: Two different filters can be used: one specifying
+                the number of logical cores per physical core, cores per socket
+                and the number of sockets; the other specifying an explicit
+                list of logical cores.
+            ascending: If True, use cores with the lowest numerical id first
+                and continue in ascending order. If False, start with the highest
+                id and continue in descending order. This ordering affects which
+                sockets to consider first as well.
+
+        Returns:
+            A list of logical cores.
         """
         self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
         return lcore_filter(
@@ -147,17 +201,14 @@ def filter_lcores(
         ).filter()
 
     def _get_remote_cpus(self) -> None:
-        """
-        Scan CPUs in the remote OS and store a list of LogicalCores.
-        """
+        """Scan CPUs in the remote OS and store a list of LogicalCores."""
         self._logger.info("Getting CPU information.")
         self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
 
     def _setup_hugepages(self):
-        """
-        Setup hugepages on the Node. Different architectures can supply different
-        amounts of memory for hugepages and numa-based hugepage allocation may need
-        to be considered.
+        """Set up hugepages on the Node.
+
+        Configure the hugepages only if they're specified in user configuration.
         """
         if self.config.hugepages:
             self.main_session.setup_hugepages(
@@ -165,9 +216,7 @@ def _setup_hugepages(self):
             )
 
     def close(self) -> None:
-        """
-        Close all connections and free other resources.
-        """
+        """Close all connections and free other resources."""
         if self.main_session:
             self.main_session.close()
         for session in self._other_sessions:
@@ -176,6 +225,11 @@ def close(self) -> None:
 
     @staticmethod
     def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+        """A decorator that skips the decorated function.
+
+        When used, the decorator executes an empty lambda function
+        instead of the decorated function.
+        """
         if SETTINGS.skip_setup:
             return lambda *args: None
         else:
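The skip pattern implemented by the decorator above can be sketched standalone; the `SKIP_SETUP` flag below is a hypothetical stand-in for `SETTINGS.skip_setup`:

```python
from typing import Any, Callable

SKIP_SETUP = True  # stands in for SETTINGS.skip_setup


def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
    """Return a no-op replacement for func when setup is skipped."""
    if SKIP_SETUP:
        return lambda *args: None
    return func


@skip_setup
def set_up_build_target(config: str) -> str:
    return f"built {config}"


# With SKIP_SETUP = True the decorated call is a no-op and returns None.
print(set_up_build_target("gcc-static"))
```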
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-04 12:37   ` [RFC PATCH v2 3/4] dts: add doc generation Juraj Linkeš
@ 2023-05-04 12:45     ` Bruce Richardson
  2023-05-05  7:53       ` Juraj Linkeš
  2023-05-05 10:56     ` Bruce Richardson
  1 sibling, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-05-04 12:45 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> The tool used to generate developer docs is sphinx, which is already
> used in DPDK. The configuration is kept the same to preserve the style.
> 
> Sphinx generates the documentation from Python docstrings. The docstring
> format most suitable for DTS seems to be the Google format [0] which
> requires the sphinx.ext.napoleon extension.
> 
> There are two requirements for building DTS docs:
> * The same Python version as DTS or higher, because Sphinx imports the
>   code.
> * Also the same Python packages as DTS, for the same reason.
> 
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> 
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
>  doc/api/meson.build      |  1 +
>  doc/guides/conf.py       | 22 ++++++++++++++----
>  doc/guides/meson.build   |  1 +
>  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
>  dts/doc/doc-index.rst    | 20 ++++++++++++++++
>  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
>  dts/meson.build          | 16 +++++++++++++
>  meson.build              |  1 +
>  meson_options.txt        |  2 ++
>  9 files changed, 137 insertions(+), 5 deletions(-)
>  create mode 100644 dts/doc/doc-index.rst
>  create mode 100644 dts/doc/meson.build
>  create mode 100644 dts/meson.build
> 

<snip>

> diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> new file mode 100644
> index 0000000000..db2bb0bed9
> --- /dev/null
> +++ b/dts/doc/meson.build
> @@ -0,0 +1,50 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> +
> +if sphinx.found() and sphinx_apidoc.found()
> +endif
> +
> +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> +dts_api_src = custom_target('dts_api_src',
> +        output: 'modules.rst',
> +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> +            sphinx_apidoc, '--append-syspath', '--force',
> +            '--module-first', '--separate',
> +            '--doc-project', 'DTS', '-V', meson.project_version(),
> +            '-o', dts_api_build_dir,
> +            dts_api_framework_dir],
> +        build_by_default: get_option('enable_dts_docs'))
> +doc_targets += dts_api_src
> +doc_target_names += 'DTS_API_sphinx_sources'
> +
> +cp = find_program('cp', required: get_option('enable_docs'))

This should probably be "enable_dts_docs"



* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-04 12:45     ` Bruce Richardson
@ 2023-05-05  7:53       ` Juraj Linkeš
  2023-05-05 10:24         ` Bruce Richardson
  0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-05  7:53 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Thu, May 4, 2023 at 2:45 PM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > The tool used to generate developer docs is sphinx, which is already
> > used in DPDK. The configuration is kept the same to preserve the style.
> >
> > Sphinx generates the documentation from Python docstrings. The docstring
> > format most suitable for DTS seems to be the Google format [0] which
> > requires the sphinx.ext.napoleon extension.
> >
> > There are two requirements for building DTS docs:
> > * The same Python version as DTS or higher, because Sphinx imports the
> >   code.
> > * Also the same Python packages as DTS, for the same reason.
> >
> > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> >  doc/api/meson.build      |  1 +
> >  doc/guides/conf.py       | 22 ++++++++++++++----
> >  doc/guides/meson.build   |  1 +
> >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> >  dts/meson.build          | 16 +++++++++++++
> >  meson.build              |  1 +
> >  meson_options.txt        |  2 ++
> >  9 files changed, 137 insertions(+), 5 deletions(-)
> >  create mode 100644 dts/doc/doc-index.rst
> >  create mode 100644 dts/doc/meson.build
> >  create mode 100644 dts/meson.build
> >
>
> <snip>
>
> > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > new file mode 100644
> > index 0000000000..db2bb0bed9
> > --- /dev/null
> > +++ b/dts/doc/meson.build
> > @@ -0,0 +1,50 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +
> > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > +
> > +if sphinx.found() and sphinx_apidoc.found()
> > +endif
> > +
> > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > +dts_api_src = custom_target('dts_api_src',
> > +        output: 'modules.rst',
> > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> > +            sphinx_apidoc, '--append-syspath', '--force',
> > +            '--module-first', '--separate',
> > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > +            '-o', dts_api_build_dir,
> > +            dts_api_framework_dir],
> > +        build_by_default: get_option('enable_dts_docs'))
> > +doc_targets += dts_api_src
> > +doc_target_names += 'DTS_API_sphinx_sources'
> > +
> > +cp = find_program('cp', required: get_option('enable_docs'))
>
> This should probably be "enable_dts_docs"
>

Right, I overlooked that.
What do you think of the implementation in general?


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-05  7:53       ` Juraj Linkeš
@ 2023-05-05 10:24         ` Bruce Richardson
  2023-05-05 10:41           ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-05-05 10:24 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Fri, May 05, 2023 at 09:53:50AM +0200, Juraj Linkeš wrote:
> On Thu, May 4, 2023 at 2:45 PM Bruce Richardson
> <bruce.richardson@intel.com> wrote:
> >
> > On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > > The tool used to generate developer docs is sphinx, which is already
> > > used in DPDK. The configuration is kept the same to preserve the style.
> > >
> > > Sphinx generates the documentation from Python docstrings. The docstring
> > > format most suitable for DTS seems to be the Google format [0] which
> > > requires the sphinx.ext.napoleon extension.
> > >
> > > There are two requirements for building DTS docs:
> > > * The same Python version as DTS or higher, because Sphinx imports the
> > >   code.
> > > * Also the same Python packages as DTS, for the same reason.
> > >
> > > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> > >
> > > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > > ---
> > >  doc/api/meson.build      |  1 +
> > >  doc/guides/conf.py       | 22 ++++++++++++++----
> > >  doc/guides/meson.build   |  1 +
> > >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> > >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> > >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> > >  dts/meson.build          | 16 +++++++++++++
> > >  meson.build              |  1 +
> > >  meson_options.txt        |  2 ++
> > >  9 files changed, 137 insertions(+), 5 deletions(-)
> > >  create mode 100644 dts/doc/doc-index.rst
> > >  create mode 100644 dts/doc/meson.build
> > >  create mode 100644 dts/meson.build
> > >
> >
> > <snip>
> >
> > > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > > new file mode 100644
> > > index 0000000000..db2bb0bed9
> > > --- /dev/null
> > > +++ b/dts/doc/meson.build
> > > @@ -0,0 +1,50 @@
> > > +# SPDX-License-Identifier: BSD-3-Clause
> > > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > > +
> > > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > > +
> > > +if sphinx.found() and sphinx_apidoc.found()
> > > +endif
> > > +
> > > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > > +dts_api_src = custom_target('dts_api_src',
> > > +        output: 'modules.rst',
> > > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> > > +            sphinx_apidoc, '--append-syspath', '--force',
> > > +            '--module-first', '--separate',
> > > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > > +            '-o', dts_api_build_dir,
> > > +            dts_api_framework_dir],
> > > +        build_by_default: get_option('enable_dts_docs'))
> > > +doc_targets += dts_api_src
> > > +doc_target_names += 'DTS_API_sphinx_sources'
> > > +
> > > +cp = find_program('cp', required: get_option('enable_docs'))
> >
> > This should probably be "enable_dts_docs"
> >
> 
> Right, I overlooked that.
> What do you think of the implementation in general?

I need to download and apply the patches to test out before I comment on
that. I only gave them a quick scan thus far. I'll try and test them today
if I can.

/Bruce


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-05 10:24         ` Bruce Richardson
@ 2023-05-05 10:41           ` Juraj Linkeš
  0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-05 10:41 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Fri, May 5, 2023 at 12:24 PM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Fri, May 05, 2023 at 09:53:50AM +0200, Juraj Linkeš wrote:
> > On Thu, May 4, 2023 at 2:45 PM Bruce Richardson
> > <bruce.richardson@intel.com> wrote:
> > >
> > > On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > > > The tool used to generate developer docs is sphinx, which is already
> > > > used in DPDK. The configuration is kept the same to preserve the style.
> > > >
> > > > Sphinx generates the documentation from Python docstrings. The docstring
> > > > format most suitable for DTS seems to be the Google format [0] which
> > > > requires the sphinx.ext.napoleon extension.
> > > >
> > > > There are two requirements for building DTS docs:
> > > > * The same Python version as DTS or higher, because Sphinx imports the
> > > >   code.
> > > > * Also the same Python packages as DTS, for the same reason.
> > > >
> > > > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> > > >
> > > > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > > > ---
> > > >  doc/api/meson.build      |  1 +
> > > >  doc/guides/conf.py       | 22 ++++++++++++++----
> > > >  doc/guides/meson.build   |  1 +
> > > >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> > > >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> > > >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> > > >  dts/meson.build          | 16 +++++++++++++
> > > >  meson.build              |  1 +
> > > >  meson_options.txt        |  2 ++
> > > >  9 files changed, 137 insertions(+), 5 deletions(-)
> > > >  create mode 100644 dts/doc/doc-index.rst
> > > >  create mode 100644 dts/doc/meson.build
> > > >  create mode 100644 dts/meson.build
> > > >
> > >
> > > <snip>
> > >
> > > > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > > > new file mode 100644
> > > > index 0000000000..db2bb0bed9
> > > > --- /dev/null
> > > > +++ b/dts/doc/meson.build
> > > > @@ -0,0 +1,50 @@
> > > > +# SPDX-License-Identifier: BSD-3-Clause
> > > > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > > > +
> > > > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > > > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > > > +
> > > > +if sphinx.found() and sphinx_apidoc.found()
> > > > +endif
> > > > +
> > > > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > > > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > > > +dts_api_src = custom_target('dts_api_src',
> > > > +        output: 'modules.rst',
> > > > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> > > > +            sphinx_apidoc, '--append-syspath', '--force',
> > > > +            '--module-first', '--separate',
> > > > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > > > +            '-o', dts_api_build_dir,
> > > > +            dts_api_framework_dir],
> > > > +        build_by_default: get_option('enable_dts_docs'))
> > > > +doc_targets += dts_api_src
> > > > +doc_target_names += 'DTS_API_sphinx_sources'
> > > > +
> > > > +cp = find_program('cp', required: get_option('enable_docs'))
> > >
> > > This should probably be "enable_dts_docs"
> > >
> >
> > Right, I overlooked that.
> > What do you think of the implementation in general?
>
> I need to download and apply the patches to test out before I comment on
> that. I only gave them a quick scan thus far. I'll try and test them today
> if I can.
>

Great, thanks.

Juraj

> /Bruce


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-04 12:37   ` [RFC PATCH v2 3/4] dts: add doc generation Juraj Linkeš
  2023-05-04 12:45     ` Bruce Richardson
@ 2023-05-05 10:56     ` Bruce Richardson
  2023-05-05 11:13       ` Juraj Linkeš
  1 sibling, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-05-05 10:56 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> The tool used to generate developer docs is sphinx, which is already
> used in DPDK. The configuration is kept the same to preserve the style.
> 
> Sphinx generates the documentation from Python docstrings. The docstring
> format most suitable for DTS seems to be the Google format [0] which
> requires the sphinx.ext.napoleon extension.
> 
> There are two requirements for building DTS docs:
> * The same Python version as DTS or higher, because Sphinx imports the
>   code.
> * Also the same Python packages as DTS, for the same reason.
> 
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> 
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
>  doc/api/meson.build      |  1 +
>  doc/guides/conf.py       | 22 ++++++++++++++----
>  doc/guides/meson.build   |  1 +
>  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
>  dts/doc/doc-index.rst    | 20 ++++++++++++++++
>  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
>  dts/meson.build          | 16 +++++++++++++
>  meson.build              |  1 +
>  meson_options.txt        |  2 ++
>  9 files changed, 137 insertions(+), 5 deletions(-)
>  create mode 100644 dts/doc/doc-index.rst
>  create mode 100644 dts/doc/meson.build
>  create mode 100644 dts/meson.build
> 
<snip>

> diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> new file mode 100644
> index 0000000000..db2bb0bed9
> --- /dev/null
> +++ b/dts/doc/meson.build
> @@ -0,0 +1,50 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> +
> +if sphinx.found() and sphinx_apidoc.found()
> +endif
> +
> +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> +dts_api_src = custom_target('dts_api_src',
> +        output: 'modules.rst',
> +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',

This gives errors when I try to configure a build, even without docs
enabled.

	~/dpdk.org$ meson setup build-test
	The Meson build system
	Version: 1.0.1
	Source dir: /home/bruce/dpdk.org
	...
	Program sphinx-build found: YES (/usr/bin/sphinx-build)
	Program sphinx-build found: YES (/usr/bin/sphinx-build)
	Program sphinx-apidoc found: YES (/usr/bin/sphinx-apidoc)

	dts/doc/meson.build:12:0: ERROR: Program 'SPHINX_APIDOC_OPTIONS=members,show-inheritance' not found or not executable

	A full log can be found at /home/bruce/dpdk.org/build-test/meson-logs/meson-log.txt

Assuming these need to be set in the environment, I think you can use the
"env" parameter to custom target instead.
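A minimal sketch of that suggestion, moving the variable out of the command line and into the `env` keyword (untested; assumes a meson version that supports `env` on custom_target):

```meson
dts_api_src = custom_target('dts_api_src',
        output: 'modules.rst',
        command: [sphinx_apidoc, '--append-syspath', '--force',
            '--module-first', '--separate',
            '--doc-project', 'DTS', '-V', meson.project_version(),
            '-o', dts_api_build_dir,
            dts_api_framework_dir],
        env: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance'],
        build_by_default: get_option('enable_dts_docs'))
```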

> +            sphinx_apidoc, '--append-syspath', '--force',
> +            '--module-first', '--separate',
> +            '--doc-project', 'DTS', '-V', meson.project_version(),
> +            '-o', dts_api_build_dir,
> +            dts_api_framework_dir],
> +        build_by_default: get_option('enable_dts_docs'))
> +doc_targets += dts_api_src
> +doc_target_names += 'DTS_API_sphinx_sources'
> +


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-05 10:56     ` Bruce Richardson
@ 2023-05-05 11:13       ` Juraj Linkeš
  2023-05-05 13:28         ` Bruce Richardson
  0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-05 11:13 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Fri, May 5, 2023 at 12:57 PM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > The tool used to generate developer docs is sphinx, which is already
> > used in DPDK. The configuration is kept the same to preserve the style.
> >
> > Sphinx generates the documentation from Python docstrings. The docstring
> > format most suitable for DTS seems to be the Google format [0] which
> > requires the sphinx.ext.napoleon extension.
> >
> > There are two requirements for building DTS docs:
> > * The same Python version as DTS or higher, because Sphinx imports the
> >   code.
> > * Also the same Python packages as DTS, for the same reason.
> >
> > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > ---
> >  doc/api/meson.build      |  1 +
> >  doc/guides/conf.py       | 22 ++++++++++++++----
> >  doc/guides/meson.build   |  1 +
> >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> >  dts/meson.build          | 16 +++++++++++++
> >  meson.build              |  1 +
> >  meson_options.txt        |  2 ++
> >  9 files changed, 137 insertions(+), 5 deletions(-)
> >  create mode 100644 dts/doc/doc-index.rst
> >  create mode 100644 dts/doc/meson.build
> >  create mode 100644 dts/meson.build
> >
> <snip>
>
> > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > new file mode 100644
> > index 0000000000..db2bb0bed9
> > --- /dev/null
> > +++ b/dts/doc/meson.build
> > @@ -0,0 +1,50 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +
> > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > +
> > +if sphinx.found() and sphinx_apidoc.found()
> > +endif
> > +
> > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > +dts_api_src = custom_target('dts_api_src',
> > +        output: 'modules.rst',
> > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
>
> This gives errors when I try to configure a build, even without docs
> enabled.
>
>         ~/dpdk.org$ meson setup build-test
>         The Meson build system
>         Version: 1.0.1
>         Source dir: /home/bruce/dpdk.org
>         ...
>         Program sphinx-build found: YES (/usr/bin/sphinx-build)
>         Program sphinx-build found: YES (/usr/bin/sphinx-build)
>         Program sphinx-apidoc found: YES (/usr/bin/sphinx-apidoc)
>
>         dts/doc/meson.build:12:0: ERROR: Program 'SPHINX_APIDOC_OPTIONS=members,show-inheritance' not found or not executable
>
>         A full log can be found at /home/bruce/dpdk.org/build-test/meson-logs/meson-log.txt
>
> Assuming these need to be set in the environment, I think you can use the
> "env" parameter to custom target instead.
>

I used meson 0.53.2 as that seemed to be the version I should target
(it's used in .ci/linux-setup.sh) which doesn't support the argument
(I originally wanted to use it, but they added it in 0.57.0). I didn't
see the error with 0.53.2.

Which version should I target? 1.0.1?

> > +            sphinx_apidoc, '--append-syspath', '--force',
> > +            '--module-first', '--separate',
> > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > +            '-o', dts_api_build_dir,
> > +            dts_api_framework_dir],
> > +        build_by_default: get_option('enable_dts_docs'))
> > +doc_targets += dts_api_src
> > +doc_target_names += 'DTS_API_sphinx_sources'
> > +


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-05 11:13       ` Juraj Linkeš
@ 2023-05-05 13:28         ` Bruce Richardson
  2023-05-09  9:23           ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-05-05 13:28 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Fri, May 05, 2023 at 01:13:34PM +0200, Juraj Linkeš wrote:
> On Fri, May 5, 2023 at 12:57 PM Bruce Richardson
> <bruce.richardson@intel.com> wrote:
> >
> > On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > > The tool used to generate developer docs is sphinx, which is already
> > > used in DPDK. The configuration is kept the same to preserve the style.
> > >
> > > Sphinx generates the documentation from Python docstrings. The docstring
> > > format most suitable for DTS seems to be the Google format [0] which
> > > requires the sphinx.ext.napoleon extension.
> > >
> > > There are two requirements for building DTS docs:
> > > * The same Python version as DTS or higher, because Sphinx imports the
> > >   code.
> > > * Also the same Python packages as DTS, for the same reason.
> > >
> > > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> > >
> > > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > > ---
> > >  doc/api/meson.build      |  1 +
> > >  doc/guides/conf.py       | 22 ++++++++++++++----
> > >  doc/guides/meson.build   |  1 +
> > >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> > >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> > >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> > >  dts/meson.build          | 16 +++++++++++++
> > >  meson.build              |  1 +
> > >  meson_options.txt        |  2 ++
> > >  9 files changed, 137 insertions(+), 5 deletions(-)
> > >  create mode 100644 dts/doc/doc-index.rst
> > >  create mode 100644 dts/doc/meson.build
> > >  create mode 100644 dts/meson.build
> > >
> > <snip>
> >
> > > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > > new file mode 100644
> > > index 0000000000..db2bb0bed9
> > > --- /dev/null
> > > +++ b/dts/doc/meson.build
> > > @@ -0,0 +1,50 @@
> > > +# SPDX-License-Identifier: BSD-3-Clause
> > > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > > +
> > > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > > +
> > > +if sphinx.found() and sphinx_apidoc.found()
> > > +endif
> > > +
> > > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > > +dts_api_src = custom_target('dts_api_src',
> > > +        output: 'modules.rst',
> > > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> >
> > This gives errors when I try to configure a build, even without docs
> > enabled.
> >
> >         ~/dpdk.org$ meson setup build-test
> >         The Meson build system
> >         Version: 1.0.1
> >         Source dir: /home/bruce/dpdk.org
> >         ...
> >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> >         Program sphinx-apidoc found: YES (/usr/bin/sphinx-apidoc)
> >
> >         dts/doc/meson.build:12:0: ERROR: Program 'SPHINX_APIDOC_OPTIONS=members,show-inheritance' not found or not executable
> >
> >         A full log can be found at /home/bruce/dpdk.org/build-test/meson-logs/meson-log.txt
> >
> > Assuming these need to be set in the environment, I think you can use the
> > "env" parameter to custom target instead.
> >
> 
> I used meson 0.53.2 as that seemed to be the version I should target
> (it's used in .ci/linux-setup.sh) which doesn't support the argument
> (I originally wanted to use it, but they added it in 0.57.0). I didn't
> see the error with 0.53.2.
> 
> Which version should I target? 1.0.1?
> 
> > > +            sphinx_apidoc, '--append-syspath', '--force',
> > > +            '--module-first', '--separate',
> > > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > > +            '-o', dts_api_build_dir,
> > > +            dts_api_framework_dir],
> > > +        build_by_default: get_option('enable_dts_docs'))
> > > +doc_targets += dts_api_src
> > > +doc_target_names += 'DTS_API_sphinx_sources'
> > > +

I didn't try with 0.53.2 - let me test that, see if the error goes away. We
may need different calls based on the meson version.

Is there no other way to pass this data rather than via the environment?
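For reference, `sphinx-apidoc` picks these options up from the `SPHINX_APIDOC_OPTIONS` environment variable rather than a command-line flag, so the variable does need to reach the process environment somehow. One possible workaround that avoids the `env` kwarg (and so should work with meson 0.53, on POSIX systems only) is to run the command through env(1):

```meson
# sketch: prefix the command with env(1) so the variable is set
# without relying on meson's 'env' keyword (meson >= 0.57 only)
dts_api_src = custom_target('dts_api_src',
        output: 'modules.rst',
        command: ['env', 'SPHINX_APIDOC_OPTIONS=members,show-inheritance',
            sphinx_apidoc, '--append-syspath', '--force',
            '--module-first', '--separate',
            '--doc-project', 'DTS', '-V', meson.project_version(),
            '-o', dts_api_build_dir,
            dts_api_framework_dir],
        build_by_default: get_option('enable_dts_docs'))
```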

/Bruce


* Re: [RFC PATCH v2 0/4] dts: add dts api docs
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
                     ` (3 preceding siblings ...)
  2023-05-04 12:37   ` [RFC PATCH v2 4/4] dts: format docstrings to google format Juraj Linkeš
@ 2023-05-05 14:06   ` Bruce Richardson
  2023-05-09 15:28     ` Juraj Linkeš
  2023-05-11  8:55     ` Juraj Linkeš
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
  5 siblings, 2 replies; 393+ messages in thread
From: Bruce Richardson @ 2023-05-05 14:06 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Thu, May 04, 2023 at 02:37:45PM +0200, Juraj Linkeš wrote:
> Augment the meson build system with dts api generation. The api docs are
> generated from Python docstrings in DTS using Sphinx. The format of
> choice is the Google format [0].
> 
> The guide html sphinx configuration is used to preserve the same style.
> 
> The build requires the same Python version and dependencies as DTS,
> because Sphinx imports the Python modules. Dependencies are installed
> using Poetry from the dts directory:
> 
> poetry install --with docs
> 
> After installing, enter the Poetry shell:
> 
> poetry shell
> 
> And then run the build:
> ninja -C <meson_build_dir> dts/doc
> 
> There's only one properly documented module that serves as a
> demonstration of the style - framework.testbed_model.node.
> 
> [0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
> 
> Juraj Linkeš (4):
>   dts: code adjustments for sphinx
>   dts: add doc generation dependencies
>   dts: add doc generation
>   dts: format docstrigs to google format
> 

I find the requirement to use poetry to build the docs, and the need to run
specific commands in specific directories quite awkward. With this patchset
there is no ability to just turn on the build option for the DTS doc and
have the docs built on the next rebuild. [Also, with every build I've tried
I can't get it to build without warnings about missing "warlock" module.]

From what I understand from the patchset, the doc building here using
sphinx is primarily concerned with building the API docs. The rest of DPDK
uses doxygen for this, and since doxygen supports python can the same
tooling be used for the DTS docs?

/Bruce


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-05 13:28         ` Bruce Richardson
@ 2023-05-09  9:23           ` Juraj Linkeš
  2023-05-09  9:40             ` Bruce Richardson
  0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-09  9:23 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Fri, May 5, 2023 at 3:29 PM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Fri, May 05, 2023 at 01:13:34PM +0200, Juraj Linkeš wrote:
> > On Fri, May 5, 2023 at 12:57 PM Bruce Richardson
> > <bruce.richardson@intel.com> wrote:
> > >
> > > On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > > > The tool used to generate developer docs is sphinx, which is already
> > > > used in DPDK. The configuration is kept the same to preserve the style.
> > > >
> > > > Sphinx generates the documentation from Python docstrings. The docstring
> > > > format most suitable for DTS seems to be the Google format [0] which
> > > > requires the sphinx.ext.napoleon extension.
> > > >
> > > > There are two requirements for building DTS docs:
> > > > * The same Python version as DTS or higher, because Sphinx imports the
> > > >   code.
> > > > * Also the same Python packages as DTS, for the same reason.
> > > >
> > > > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> > > >
> > > > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > > > ---
> > > >  doc/api/meson.build      |  1 +
> > > >  doc/guides/conf.py       | 22 ++++++++++++++----
> > > >  doc/guides/meson.build   |  1 +
> > > >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> > > >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> > > >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> > > >  dts/meson.build          | 16 +++++++++++++
> > > >  meson.build              |  1 +
> > > >  meson_options.txt        |  2 ++
> > > >  9 files changed, 137 insertions(+), 5 deletions(-)
> > > >  create mode 100644 dts/doc/doc-index.rst
> > > >  create mode 100644 dts/doc/meson.build
> > > >  create mode 100644 dts/meson.build
> > > >
> > > <snip>
> > >
> > > > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > > > new file mode 100644
> > > > index 0000000000..db2bb0bed9
> > > > --- /dev/null
> > > > +++ b/dts/doc/meson.build
> > > > @@ -0,0 +1,50 @@
> > > > +# SPDX-License-Identifier: BSD-3-Clause
> > > > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > > > +
> > > > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > > > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > > > +
> > > > +if sphinx.found() and sphinx_apidoc.found()
> > > > +endif
> > > > +
> > > > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > > > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > > > +dts_api_src = custom_target('dts_api_src',
> > > > +        output: 'modules.rst',
> > > > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> > >
> > > This gives errors when I try to configure a build, even without docs
> > > enabled.
> > >
> > >         ~/dpdk.org$ meson setup build-test
> > >         The Meson build system
> > >         Version: 1.0.1
> > >         Source dir: /home/bruce/dpdk.org
> > >         ...
> > >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> > >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> > >         Program sphinx-apidoc found: YES (/usr/bin/sphinx-apidoc)
> > >
> > >         dts/doc/meson.build:12:0: ERROR: Program 'SPHINX_APIDOC_OPTIONS=members,show-inheritance' not found or not executable
> > >
> > >         A full log can be found at /home/bruce/dpdk.org/build-test/meson-logs/meson-log.txt
> > >
> > > Assuming these need to be set in the environment, I think you can use the
> > > "env" parameter to custom target instead.
> > >
> >
> > I used meson 0.53.2 as that seemed to be the version I should target
> > (it's used in .ci/linux-setup.sh) which doesn't support the argument
> > (I originally wanted to use it, but they added it in 0.57.0). I didn't
> > see the error with 0.53.2.
> >
> > Which version should I target? 1.0.1?
> >
> > > > +            sphinx_apidoc, '--append-syspath', '--force',
> > > > +            '--module-first', '--separate',
> > > > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > > > +            '-o', dts_api_build_dir,
> > > > +            dts_api_framework_dir],
> > > > +        build_by_default: get_option('enable_dts_docs'))
> > > > +doc_targets += dts_api_src
> > > > +doc_target_names += 'DTS_API_sphinx_sources'
> > > > +
>
> I didn't try with 0.53.2 - let me test that, see if the error goes away. We
> may need different calls based on the meson version.
>
> Is there no other way to pass this data rather than via the environment?
>

Certainly. For background, I wanted to do the same thing we do for
DPDK_VERSION, but that would require adding an additional parameter to
call-sphinx-build.py, which wouldn't be used by the other call of
call-sphinx-build.py (the one that builds doc guides), so I skipped
the parameter and set the env var before the call.

There are a few options that come to mind:
1. Use the parameter. There are two sub-options here: either make the
parameter positional and mandatory, which leaves the guides build with
an awkward call, or make it optional, which makes the script a bit
more complex (some argparse logic would be needed).
2. Hard-code the path into conf.py.
3. Have separate conf.py files. Maybe we could make this work with symlinks.

There could be something else. I like adding the optional parameter. I
don't know the policy on buildtools script complexity so let me know
what you think.
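
For illustration, option 1 might look roughly like this. It is only a sketch: the flag name (--dts-root) and the positional arguments are assumptions, not the actual call-sphinx-build.py interface.

```python
import argparse


def parse_args(argv=None):
    # Hypothetical argument handling for buildtools/call-sphinx-build.py.
    parser = argparse.ArgumentParser()
    parser.add_argument('sphinx')     # path to sphinx-build
    parser.add_argument('version')    # project version passed to sphinx
    parser.add_argument('src')        # source directory
    parser.add_argument('dst')        # output directory
    # Optional: only the DTS API doc build supplies this.
    parser.add_argument('--dts-root', default=None,
                        help='path added to sys.path for the DTS API build')
    return parser.parse_args(argv)


# The guides build simply omits the flag and is unaffected.
args = parse_args(['sphinx-build', '23.07', 'doc/guides', 'build/doc/guides'])
print(args.dts_root)  # None for the guides build
```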

Juraj

> /Bruce


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-09  9:23           ` Juraj Linkeš
@ 2023-05-09  9:40             ` Bruce Richardson
  2023-05-10 12:19               ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-05-09  9:40 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Tue, May 09, 2023 at 11:23:50AM +0200, Juraj Linkeš wrote:
> On Fri, May 5, 2023 at 3:29 PM Bruce Richardson
> <bruce.richardson@intel.com> wrote:
> >
> > On Fri, May 05, 2023 at 01:13:34PM +0200, Juraj Linkeš wrote:
> > > On Fri, May 5, 2023 at 12:57 PM Bruce Richardson
> > > <bruce.richardson@intel.com> wrote:
> > > >
> > > > On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > > > > The tool used to generate developer docs is sphinx, which is already
> > > > > used in DPDK. The configuration is kept the same to preserve the style.
> > > > >
> > > > > Sphinx generates the documentation from Python docstrings. The docstring
> > > > > format most suitable for DTS seems to be the Google format [0] which
> > > > > requires the sphinx.ext.napoleon extension.
> > > > >
> > > > > There are two requirements for building DTS docs:
> > > > > * The same Python version as DTS or higher, because Sphinx imports the
> > > > >   code.
> > > > > * Also the same Python packages as DTS, for the same reason.
> > > > >
> > > > > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> > > > >
> > > > > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > > > > ---
> > > > >  doc/api/meson.build      |  1 +
> > > > >  doc/guides/conf.py       | 22 ++++++++++++++----
> > > > >  doc/guides/meson.build   |  1 +
> > > > >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> > > > >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> > > > >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> > > > >  dts/meson.build          | 16 +++++++++++++
> > > > >  meson.build              |  1 +
> > > > >  meson_options.txt        |  2 ++
> > > > >  9 files changed, 137 insertions(+), 5 deletions(-)
> > > > >  create mode 100644 dts/doc/doc-index.rst
> > > > >  create mode 100644 dts/doc/meson.build
> > > > >  create mode 100644 dts/meson.build
> > > > >
> > > > <snip>
> > > >
> > > > > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > > > > new file mode 100644
> > > > > index 0000000000..db2bb0bed9
> > > > > --- /dev/null
> > > > > +++ b/dts/doc/meson.build
> > > > > @@ -0,0 +1,50 @@
> > > > > +# SPDX-License-Identifier: BSD-3-Clause
> > > > > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > > > > +
> > > > > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > > > > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > > > > +
> > > > > +if sphinx.found() and sphinx_apidoc.found()
> > > > > +endif
> > > > > +
> > > > > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > > > > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > > > > +dts_api_src = custom_target('dts_api_src',
> > > > > +        output: 'modules.rst',
> > > > > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> > > >
> > > > This gives errors when I try to configure a build, even without docs
> > > > enabled.
> > > >
> > > >         ~/dpdk.org$ meson setup build-test
> > > >         The Meson build system
> > > >         Version: 1.0.1
> > > >         Source dir: /home/bruce/dpdk.org
> > > >         ...
> > > >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> > > >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> > > >         Program sphinx-apidoc found: YES (/usr/bin/sphinx-apidoc)
> > > >
> > > >         dts/doc/meson.build:12:0: ERROR: Program 'SPHINX_APIDOC_OPTIONS=members,show-inheritance' not found or not executable
> > > >
> > > >         A full log can be found at /home/bruce/dpdk.org/build-test/meson-logs/meson-log.txt
> > > >
> > > > Assuming these need to be set in the environment, I think you can use the
> > > > "env" parameter to custom target instead.
> > > >
> > >
> > > I used meson 0.53.2 as that seemed to be the version I should target
> > > (it's used in .ci/linux-setup.sh) which doesn't support the argument
> > > (I originally wanted to use it, but they added it in 0.57.0). I didn't
> > > see the error with 0.53.2.
> > >
> > > Which version should I target? 1.0.1?
> > >
> > > > > +            sphinx_apidoc, '--append-syspath', '--force',
> > > > > +            '--module-first', '--separate',
> > > > > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > > > > +            '-o', dts_api_build_dir,
> > > > > +            dts_api_framework_dir],
> > > > > +        build_by_default: get_option('enable_dts_docs'))
> > > > > +doc_targets += dts_api_src
> > > > > +doc_target_names += 'DTS_API_sphinx_sources'
> > > > > +
> >
> > I didn't try with 0.53.2 - let me test that, see if the error goes away. We
> > may need different calls based on the meson version.
> >
> > Is there no other way to pass this data rather than via the environment?
> >
> 
> Certainly. For background, I wanted to do the same thing we do for
> DPDK_VERSION, but that would require adding an additional parameter to
> call-sphinx-build.py, which wouldn't be used by the other call of
> call-sphinx-build.py (the one that builds doc guides), so I skipped
> the parameter and set the env var before the call.
> 
> There are a few options that come to mind:
> 1. Use the parameter. There are two sub-options here, either make the
> parameter positional and mandatory and then we'd have an awkward call
> that builds the guides or I could make it optional, but that would
> make the script a bit more complex (some argparse logic would be
> needed).
> 2. Hard-code the path into conf.py.
> 3. Have separate conf.py files. Maybe we could make this work with symlinks.
> 
> There could be something else. I like adding the optional parameter. I
> don't know the policy on buildtools script complexity so let me know
> what you think.
> 
If the parameter would just go unused for the main doc build and cause no
issues, I don't see why we can't put it into the main conf.py file. We
can add a comment explaining that it only applies to the DTS doc. However,
option 1, with an extra optional parameter, doesn't seem so bad to me
either. Using argparse in the build script doesn't seem like a problem
either, if it's necessary. Maybe others have other opinions, though.
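
A rough sketch of how conf.py could consult such a value without affecting the guides build (hypothetical; DTS_ROOT is an assumed variable name, not part of the patch):

```python
import os
import sys


def apply_dts_path(environ=os.environ):
    """Insert the DTS source root into sys.path when the doc build sets it."""
    root = environ.get('DTS_ROOT')
    if root is not None and root not in sys.path:
        sys.path.insert(0, root)
    return root


# The guides build leaves the variable unset, so conf.py behaves as before.
print(apply_dts_path({}))  # None
```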

/Bruce


* Re: [RFC PATCH v2 0/4] dts: add dts api docs
  2023-05-05 14:06   ` [RFC PATCH v2 0/4] dts: add dts api docs Bruce Richardson
@ 2023-05-09 15:28     ` Juraj Linkeš
  2023-05-11  8:55     ` Juraj Linkeš
  1 sibling, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-09 15:28 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Fri, May 5, 2023 at 4:07 PM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Thu, May 04, 2023 at 02:37:45PM +0200, Juraj Linkeš wrote:
> > Augment the meson build system with dts api generation. The api docs are
> > generated from Python docstrings in DTS using Sphinx. The format of
> > choice is the Google format [0].
> >
> > The guide html sphinx configuration is used to preserve the same style.
> >
> > The build requires the same Python version and dependencies as DTS,
> > because Sphinx imports the Python modules. Dependencies are installed
> > using Poetry from the dts directory:
> >
> > poetry install --with docs
> >
> > After installing, enter the Poetry shell:
> >
> > poetry shell
> >
> > And then run the build:
> > ninja -C <meson_build_dir> dts/doc
> >
> > There's only one properly documented module that serves as a
> > demonstration of the style - framework.testbed_model.node.
> >
> > [0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
> >
> > Juraj Linkeš (4):
> >   dts: code adjustments for sphinx
> >   dts: add doc generation dependencies
> >   dts: add doc generation
> >   dts: format docstrigs to google format
> >
>
> I find the requirement to use poetry to build the docs, and the need to run
> specific commands in specific directories quite awkward. With this patchset
> there is no ability to just turn on the build option for the DTS doc and
> have the docs built on the next rebuild. [Also, with every build I've tried
> I can't get it to build without warnings about missing "warlock" module.]
>
> From what I understand from the patchset, the doc building here using
> sphinx is primarily concerned with building the API docs. The rest of DPDK
> uses doxygen for this, and since doxygen supports python can the same
> tooling be used for the DTS docs?
>

I don't think any tool for Python API docs would be able to do it
without the dependencies. The standard way to document Python code is
in docstrings, which are accessible at runtime (which is why the
dependencies are needed). Doxygen's manual says as much:
"For Python there is a standard way of documenting the code using so
called documentation strings ("""). Such strings are stored in __doc__
and can be retrieved at runtime. Doxygen will extract such comments
and assume they have to be represented in a preformatted way."

There may be a tool that doesn't use the __doc__ attribute accessible
at runtime (I don't think anyone would implement something like
that though), but that would likely be much worse than Sphinx.
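
As a minimal, self-contained illustration of the point (not DTS code): the docstring is just a runtime attribute of the documented object, which is why Sphinx has to import the modules, and therefore needs their dependencies installed.

```python
def connect(host: str) -> None:
    """Open an SSH session to *host*."""


# Any docstring-based doc tool does the equivalent of:
print(connect.__doc__)  # Open an SSH session to *host*.
```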

Juraj

> /Bruce


* Re: [RFC PATCH v2 3/4] dts: add doc generation
  2023-05-09  9:40             ` Bruce Richardson
@ 2023-05-10 12:19               ` Juraj Linkeš
  0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-10 12:19 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Tue, May 9, 2023 at 11:40 AM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Tue, May 09, 2023 at 11:23:50AM +0200, Juraj Linkeš wrote:
> > On Fri, May 5, 2023 at 3:29 PM Bruce Richardson
> > <bruce.richardson@intel.com> wrote:
> > >
> > > On Fri, May 05, 2023 at 01:13:34PM +0200, Juraj Linkeš wrote:
> > > > On Fri, May 5, 2023 at 12:57 PM Bruce Richardson
> > > > <bruce.richardson@intel.com> wrote:
> > > > >
> > > > > On Thu, May 04, 2023 at 02:37:48PM +0200, Juraj Linkeš wrote:
> > > > > > The tool used to generate developer docs is sphinx, which is already
> > > > > > used in DPDK. The configuration is kept the same to preserve the style.
> > > > > >
> > > > > > Sphinx generates the documentation from Python docstrings. The docstring
> > > > > > format most suitable for DTS seems to be the Google format [0] which
> > > > > > requires the sphinx.ext.napoleon extension.
> > > > > >
> > > > > > There are two requirements for building DTS docs:
> > > > > > * The same Python version as DTS or higher, because Sphinx imports the
> > > > > >   code.
> > > > > > * Also the same Python packages as DTS, for the same reason.
> > > > > >
> > > > > > [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> > > > > >
> > > > > > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > > > > > ---
> > > > > >  doc/api/meson.build      |  1 +
> > > > > >  doc/guides/conf.py       | 22 ++++++++++++++----
> > > > > >  doc/guides/meson.build   |  1 +
> > > > > >  doc/guides/tools/dts.rst | 29 +++++++++++++++++++++++
> > > > > >  dts/doc/doc-index.rst    | 20 ++++++++++++++++
> > > > > >  dts/doc/meson.build      | 50 ++++++++++++++++++++++++++++++++++++++++
> > > > > >  dts/meson.build          | 16 +++++++++++++
> > > > > >  meson.build              |  1 +
> > > > > >  meson_options.txt        |  2 ++
> > > > > >  9 files changed, 137 insertions(+), 5 deletions(-)
> > > > > >  create mode 100644 dts/doc/doc-index.rst
> > > > > >  create mode 100644 dts/doc/meson.build
> > > > > >  create mode 100644 dts/meson.build
> > > > > >
> > > > > <snip>
> > > > >
> > > > > > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > > > > > new file mode 100644
> > > > > > index 0000000000..db2bb0bed9
> > > > > > --- /dev/null
> > > > > > +++ b/dts/doc/meson.build
> > > > > > @@ -0,0 +1,50 @@
> > > > > > +# SPDX-License-Identifier: BSD-3-Clause
> > > > > > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > > > > > +
> > > > > > +sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
> > > > > > +sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
> > > > > > +
> > > > > > +if sphinx.found() and sphinx_apidoc.found()
> > > > > > +endif
> > > > > > +
> > > > > > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > > > > > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > > > > > +dts_api_src = custom_target('dts_api_src',
> > > > > > +        output: 'modules.rst',
> > > > > > +        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
> > > > >
> > > > > This gives errors when I try to configure a build, even without docs
> > > > > enabled.
> > > > >
> > > > >         ~/dpdk.org$ meson setup build-test
> > > > >         The Meson build system
> > > > >         Version: 1.0.1
> > > > >         Source dir: /home/bruce/dpdk.org
> > > > >         ...
> > > > >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> > > > >         Program sphinx-build found: YES (/usr/bin/sphinx-build)
> > > > >         Program sphinx-apidoc found: YES (/usr/bin/sphinx-apidoc)
> > > > >
> > > > >         dts/doc/meson.build:12:0: ERROR: Program 'SPHINX_APIDOC_OPTIONS=members,show-inheritance' not found or not executable
> > > > >
> > > > >         A full log can be found at /home/bruce/dpdk.org/build-test/meson-logs/meson-log.txt
> > > > >
> > > > > Assuming these need to be set in the environment, I think you can use the
> > > > > "env" parameter to custom target instead.
> > > > >
> > > >
> > > > I used meson 0.53.2 as that seemed to be the version I should target
> > > > (it's used in .ci/linux-setup.sh) which doesn't support the argument
> > > > (I originally wanted to use it, but they added it in 0.57.0). I didn't
> > > > see the error with 0.53.2.
> > > >
> > > > Which version should I target? 1.0.1?
> > > >
> > > > > > +            sphinx_apidoc, '--append-syspath', '--force',
> > > > > > +            '--module-first', '--separate',
> > > > > > +            '--doc-project', 'DTS', '-V', meson.project_version(),
> > > > > > +            '-o', dts_api_build_dir,
> > > > > > +            dts_api_framework_dir],
> > > > > > +        build_by_default: get_option('enable_dts_docs'))
> > > > > > +doc_targets += dts_api_src
> > > > > > +doc_target_names += 'DTS_API_sphinx_sources'
> > > > > > +
> > >
> > > I didn't try with 0.53.2 - let me test that, see if the error goes away. We
> > > may need different calls based on the meson version.
> > >
> > > Is there no other way to pass this data rather than via the environment?
> > >
> >
> > Certainly. For background, I wanted to do the same thing we do for
> > DPDK_VERSION, but that would require adding an additional parameter to
> > call-sphinx-build.py, which wouldn't be used by the other call of
> > call-sphinx-build.py (the one that builds doc guides), so I skipped
> > the parameter and set the env var before the call.
> >
> > There are a few options that come to mind:
> > 1. Use the parameter. There are two sub-options here, either make the
> > parameter positional and mandatory and then we'd have an awkward call
> > that builds the guides or I could make it optional, but that would
> > make the script a bit more complex (some argparse logic would be
> > needed).
> > 2. Hard-code the path into conf.py.
> > 3. Have separate conf.py files. Maybe we could make this work with symlinks.
> >
> > There could be something else. I like adding the optional parameter. I
> > don't know the policy on buildtools script complexity so let me know
> > what you think.
> >
> If the parameter would be just unused for the main doc build, and cause no
> issues, I don't see why we can't just put it into the main conf.py file. We
> can add a comment explaining that it only applies for the DTS doc. However,
> option 1, with an extra optional parameter doesn't seem so bad to me
> either. Using argparse in the build script doesn't seem like a problem
> either, if it's necessary. Maybe others have other opinions, though.
>

I'll just submit the version with argparse and we'll see how it looks.

> /Bruce


* Re: [RFC PATCH v2 0/4] dts: add dts api docs
  2023-05-05 14:06   ` [RFC PATCH v2 0/4] dts: add dts api docs Bruce Richardson
  2023-05-09 15:28     ` Juraj Linkeš
@ 2023-05-11  8:55     ` Juraj Linkeš
  1 sibling, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-11  8:55 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Fri, May 5, 2023 at 4:07 PM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Thu, May 04, 2023 at 02:37:45PM +0200, Juraj Linkeš wrote:
> > Augment the meson build system with dts api generation. The api docs are
> > generated from Python docstrings in DTS using Sphinx. The format of
> > choice is the Google format [0].
> >
> > The guide html sphinx configuration is used to preserve the same style.
> >
> > The build requires the same Python version and dependencies as DTS,
> > because Sphinx imports the Python modules. Dependencies are installed
> > using Poetry from the dts directory:
> >
> > poetry install --with docs
> >
> > After installing, enter the Poetry shell:
> >
> > poetry shell
> >
> > And then run the build:
> > ninja -C <meson_build_dir> dts/doc
> >
> > There's only one properly documented module that serves as a
> > demonstration of the style - framework.testbed_model.node.
> >
> > [0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
> >
> > Juraj Linkeš (4):
> >   dts: code adjustments for sphinx
> >   dts: add doc generation dependencies
> >   dts: add doc generation
> >   dts: format docstrigs to google format
> >
>
> I find the requirement to use poetry to build the docs, and the need to run
> specific commands in specific directories quite awkward. With this patchset
> there is no ability to just turn on the build option for the DTS doc and
> have the docs built on the next rebuild. [Also, with every build I've tried
> I can't get it to build without warnings about missing "warlock" module.]
>

I want to ask about the warnings. They suggest a problem with
dependencies; have you entered the Poetry shell? We may need to add
some steps to the docs, which are currently:

poetry install --with docs
poetry shell

And then:
ninja -C build dts/doc

Maybe the problem is with the Poetry version (1.4.2 and higher should
work), which is not specified in the docs yet. I need to update
http://patches.dpdk.org/project/dpdk/patch/20230331091355.1224059-1-juraj.linkes@pantheon.tech/
with it.

Which are your exact steps for building the docs?

Juraj

> From what I understand from the patchset, the doc building here using
> sphinx is primarily concerned with building the API docs. The rest of DPDK
> uses doxygen for this, and since doxygen supports python can the same
> tooling be used for the DTS docs?
>
> /Bruce


* [RFC PATCH v3 0/4] dts: add dts api docs
  2023-05-04 12:37 ` [RFC PATCH v2 " Juraj Linkeš
                     ` (4 preceding siblings ...)
  2023-05-05 14:06   ` [RFC PATCH v2 0/4] dts: add dts api docs Bruce Richardson
@ 2023-05-11  9:14   ` Juraj Linkeš
  2023-05-11  9:14     ` [RFC PATCH v3 1/4] dts: code adjustments for sphinx Juraj Linkeš
                       ` (5 more replies)
  5 siblings, 6 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-11  9:14 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

Augment the meson build system with dts api generation. The api docs are
generated from Python docstrings in DTS using Sphinx. The format of
choice is the Google format [0].

The guides html sphinx configuration is used to preserve the same style.

The build requires the same Python version and dependencies as DTS,
because Sphinx imports the Python modules. Dependencies are installed
using Poetry from the dts directory:

poetry install --with docs

After installing, enter the Poetry shell:

poetry shell

And then run the build:
ninja -C <meson_build_dir> dts/doc

There's only one properly documented module that serves as a
demonstration of the style - framework.testbed_model.node. When we agree
on the docstring format, all docstrings will be reformatted.

[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes

Juraj Linkeš (4):
  dts: code adjustments for sphinx
  dts: add doc generation dependencies
  dts: add doc generation
  dts: format docstrigs to google format

 buildtools/call-sphinx-build.py               |  29 +-
 doc/api/meson.build                           |   1 +
 doc/guides/conf.py                            |  22 +-
 doc/guides/meson.build                        |   1 +
 doc/guides/tools/dts.rst                      |  29 +
 dts/doc/doc-index.rst                         |  20 +
 dts/doc/meson.build                           |  51 ++
 dts/framework/config/__init__.py              |  11 +
 .../{testbed_model/hw => config}/cpu.py       |  13 +
 dts/framework/dts.py                          |   8 +-
 dts/framework/remote_session/__init__.py      |   3 +-
 dts/framework/remote_session/linux_session.py |   2 +-
 dts/framework/remote_session/os_session.py    |  12 +-
 .../remote_session/remote/__init__.py         |  16 -
 .../{remote => }/remote_session.py            |   0
 .../{remote => }/ssh_session.py               |   0
 dts/framework/settings.py                     |  55 +-
 dts/framework/testbed_model/__init__.py       |  10 +-
 dts/framework/testbed_model/hw/__init__.py    |  27 -
 dts/framework/testbed_model/node.py           | 164 ++--
 dts/framework/testbed_model/sut_node.py       |   9 +-
 .../testbed_model/{hw => }/virtual_device.py  |   0
 dts/main.py                                   |   3 +-
 dts/meson.build                               |  16 +
 dts/poetry.lock                               | 770 ++++++++++++++++--
 dts/pyproject.toml                            |   7 +
 dts/tests/TestSuite_hello_world.py            |   6 +-
 meson.build                                   |   1 +
 meson_options.txt                             |   2 +
 29 files changed, 1058 insertions(+), 230 deletions(-)
 create mode 100644 dts/doc/doc-index.rst
 create mode 100644 dts/doc/meson.build
 rename dts/framework/{testbed_model/hw => config}/cpu.py (95%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
 create mode 100644 dts/meson.build

-- 
2.30.2



* [RFC PATCH v3 1/4] dts: code adjustments for sphinx
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
@ 2023-05-11  9:14     ` Juraj Linkeš
  2023-05-11  9:14     ` [RFC PATCH v3 2/4] dts: add doc generation dependencies Juraj Linkeš
                       ` (4 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-11  9:14 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

sphinx-build only imports the Python modules when building the
documentation; it doesn't run DTS. This requires changes that make the
code importable without running it. This means:
* properly guarding argument parsing in the if __name__ == '__main__'
  block,
* giving the logger used by the DTS runner the same treatment so that
  importing doesn't create unnecessary log files,
* moving the defaults for the global variables out of argument parsing,
  since DTS uses the parsed arguments to construct the object holding
  those variables, and
* making the imports in the remote_session package more granular,
  because importing it from framework previously resulted in circular
  imports between modules.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/config/__init__.py              | 11 ++++
 .../{testbed_model/hw => config}/cpu.py       | 13 +++++
 dts/framework/dts.py                          |  8 ++-
 dts/framework/remote_session/__init__.py      |  3 +-
 dts/framework/remote_session/linux_session.py |  2 +-
 dts/framework/remote_session/os_session.py    | 12 +++-
 .../remote_session/remote/__init__.py         | 16 ------
 .../{remote => }/remote_session.py            |  0
 .../{remote => }/ssh_session.py               |  0
 dts/framework/settings.py                     | 55 ++++++++++---------
 dts/framework/testbed_model/__init__.py       | 10 +---
 dts/framework/testbed_model/hw/__init__.py    | 27 ---------
 dts/framework/testbed_model/node.py           | 12 ++--
 dts/framework/testbed_model/sut_node.py       |  9 ++-
 .../testbed_model/{hw => }/virtual_device.py  |  0
 dts/main.py                                   |  3 +-
 dts/tests/TestSuite_hello_world.py            |  6 +-
 17 files changed, 88 insertions(+), 99 deletions(-)
 rename dts/framework/{testbed_model/hw => config}/cpu.py (95%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index ebb0823ff5..293c4cb15b 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -7,6 +7,8 @@
 Yaml config parsing methods
 """
 
+# pylama:ignore=W0611
+
 import json
 import os.path
 import pathlib
@@ -19,6 +21,15 @@
 
 from framework.settings import SETTINGS
 
+from .cpu import (
+    LogicalCore,
+    LogicalCoreCount,
+    LogicalCoreCountFilter,
+    LogicalCoreList,
+    LogicalCoreListFilter,
+    lcore_filter,
+)
+
 
 class StrEnum(Enum):
     @staticmethod
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/config/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/config/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/config/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
             )
 
         return filtered_lcores
+
+
+def lcore_filter(
+    core_list: list[LogicalCore],
+    filter_specifier: LogicalCoreCount | LogicalCoreList,
+    ascending: bool,
+) -> LogicalCoreFilter:
+    if isinstance(filter_specifier, LogicalCoreList):
+        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+    elif isinstance(filter_specifier, LogicalCoreCount):
+        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+    else:
+        raise ValueError(f"Unsupported filter: {filter_specifier}")
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 0502284580..22a09b7e34 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,7 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
+import logging
 import sys
 
 from .config import CONFIGURATION, BuildTargetConfiguration, ExecutionConfiguration
@@ -12,7 +13,8 @@
 from .testbed_model import SutNode
 from .utils import check_dts_python_version
 
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG | logging.Logger = logging.getLogger("DTSRunner")
 result: DTSResult = DTSResult(dts_logger)
 
 
@@ -24,6 +26,10 @@ def run_all() -> None:
     global dts_logger
     global result
 
+    # create a regular DTS logger and create a new result with it
+    dts_logger = getLogger("DTSRunner")
+    result = DTSResult(dts_logger)
+
     # check the python version of the server that run dts
     check_dts_python_version()
 
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index ee221503df..17ca1459f7 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -17,7 +17,8 @@
 
 from .linux_session import LinuxSession
 from .os_session import OSSession
-from .remote import CommandResult, RemoteSession, SSHSession
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
 
 
 def create_session(
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/remote_session/linux_session.py
index a1e3bc3a92..c8ce5fe6da 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/remote_session/linux_session.py
@@ -2,8 +2,8 @@
 # Copyright(c) 2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2023 University of New Hampshire
 
+from framework.config import LogicalCore
 from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
 from framework.utils import expand_range
 
 from .posix_session import PosixSession
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/remote_session/os_session.py
index 4c48ae2567..246f0358ea 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/remote_session/os_session.py
@@ -6,13 +6,13 @@
 from collections.abc import Iterable
 from pathlib import PurePath
 
-from framework.config import Architecture, NodeConfiguration
+from framework.config import Architecture, LogicalCore, NodeConfiguration
 from framework.logger import DTSLOG
 from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
 from framework.utils import EnvVarsDict, MesonArgs
 
-from .remote import CommandResult, RemoteSession, create_remote_session
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
 
 
 class OSSession(ABC):
@@ -173,3 +173,9 @@ def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
         if needed and mount the hugepages if needed.
         If force_first_numa is True, configure hugepages just on the first socket.
         """
+
+
+def create_remote_session(
+    node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> RemoteSession:
+    return SSHSession(node_config, name, logger)
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 8a1512210a..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,16 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-
-
-def create_remote_session(
-    node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
-    return SSHSession(node_config, name, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 71955f4581..144f9dea62 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
 import argparse
 import os
 from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Any, TypeVar
 
@@ -59,15 +59,18 @@ def __call__(
 
 @dataclass(slots=True, frozen=True)
 class _Settings:
-    config_file_path: str
-    output_dir: str
-    timeout: float
-    verbose: bool
-    skip_setup: bool
-    dpdk_tarball_path: Path
-    compile_timeout: float
-    test_cases: list
-    re_run: int
+    config_file_path: Path = Path(Path(__file__).parent.parent, "conf.yaml")
+    output_dir: str = "output"
+    timeout: float = 15
+    verbose: bool = False
+    skip_setup: bool = False
+    dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+    compile_timeout: float = 1200
+    test_cases: list[str] = field(default_factory=list)
+    re_run: int = 0
+
+
+SETTINGS: _Settings = _Settings()
 
 
 def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +84,8 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--config-file",
         action=_env_arg("DTS_CFG_FILE"),
-        default="conf.yaml",
+        default=SETTINGS.config_file_path,
+        type=Path,
         help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
         "and targets.",
     )
@@ -90,7 +94,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--output-dir",
         "--output",
         action=_env_arg("DTS_OUTPUT_DIR"),
-        default="output",
+        default=SETTINGS.output_dir,
         help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
     )
 
@@ -98,7 +102,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-t",
         "--timeout",
         action=_env_arg("DTS_TIMEOUT"),
-        default=15,
+        default=SETTINGS.timeout,
         type=float,
         help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
         "compiling DPDK.",
@@ -108,7 +112,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-v",
         "--verbose",
         action=_env_arg("DTS_VERBOSE"),
-        default="N",
+        default=SETTINGS.verbose,
         help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
         "to the console.",
     )
@@ -117,7 +121,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-s",
         "--skip-setup",
         action=_env_arg("DTS_SKIP_SETUP"),
-        default="N",
+        default=SETTINGS.skip_setup,
         help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
     )
 
@@ -125,7 +129,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--tarball",
         "--snapshot",
         action=_env_arg("DTS_DPDK_TARBALL"),
-        default="dpdk.tar.xz",
+        default=SETTINGS.dpdk_tarball_path,
         type=Path,
         help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball "
         "which will be used in testing.",
@@ -134,7 +138,7 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--compile-timeout",
         action=_env_arg("DTS_COMPILE_TIMEOUT"),
-        default=1200,
+        default=SETTINGS.compile_timeout,
         type=float,
         help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
     )
@@ -142,8 +146,9 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--test-cases",
         action=_env_arg("DTS_TESTCASES"),
-        default="",
-        help="[DTS_TESTCASES] Comma-separated list of test cases to execute. "
+        nargs="*",
+        default=SETTINGS.test_cases,
+        help="[DTS_TESTCASES] A list of test cases to execute. "
         "Unknown test cases will be silently ignored.",
     )
 
@@ -151,7 +156,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--re-run",
         "--re_run",
         action=_env_arg("DTS_RERUN"),
-        default=0,
+        default=SETTINGS.re_run,
         type=int,
         help="[DTS_RERUN] Re-run each test case the specified amount of times "
         "if a test failure occurs",
@@ -165,10 +170,11 @@ def _check_tarball_path(parsed_args: argparse.Namespace) -> None:
         raise ConfigurationError(f"DPDK tarball '{parsed_args.tarball}' doesn't exist.")
 
 
-def _get_settings() -> _Settings:
+def set_settings() -> None:
+    global SETTINGS
     parsed_args = _get_parser().parse_args()
     _check_tarball_path(parsed_args)
-    return _Settings(
+    SETTINGS = _Settings(
         config_file_path=parsed_args.config_file,
         output_dir=parsed_args.output_dir,
         timeout=parsed_args.timeout,
@@ -176,9 +182,6 @@ def _get_settings() -> _Settings:
         skip_setup=(parsed_args.skip_setup == "Y"),
         dpdk_tarball_path=parsed_args.tarball,
         compile_timeout=parsed_args.compile_timeout,
-        test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+        test_cases=parsed_args.test_cases,
         re_run=parsed_args.re_run,
     )
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index f54a947051..148f81993d 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,14 +9,6 @@
 
 # pylama:ignore=W0611
 
-from .hw import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-    VirtualDevice,
-    lcore_filter,
-)
 from .node import Node
 from .sut_node import SutNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
-    core_list: list[LogicalCore],
-    filter_specifier: LogicalCoreCount | LogicalCoreList,
-    ascending: bool,
-) -> LogicalCoreFilter:
-    if isinstance(filter_specifier, LogicalCoreList):
-        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
-    elif isinstance(filter_specifier, LogicalCoreCount):
-        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
-    else:
-        raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index d48fafe65d..90467981c3 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,19 +12,17 @@
 from framework.config import (
     BuildTargetConfiguration,
     ExecutionConfiguration,
-    NodeConfiguration,
-)
-from framework.logger import DTSLOG, getLogger
-from framework.remote_session import OSSession, create_session
-from framework.settings import SETTINGS
-
-from .hw import (
     LogicalCore,
     LogicalCoreCount,
     LogicalCoreList,
     LogicalCoreListFilter,
+    NodeConfiguration,
     lcore_filter,
 )
+from framework.logger import DTSLOG, getLogger
+from framework.remote_session import create_session
+from framework.remote_session.os_session import OSSession
+from framework.settings import SETTINGS
 
 
 class Node(object):
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 2b2b50d982..6db4a505bb 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -7,13 +7,18 @@
 import time
 from pathlib import PurePath
 
-from framework.config import BuildTargetConfiguration, NodeConfiguration
+from framework.config import (
+    BuildTargetConfiguration,
+    LogicalCoreCount,
+    LogicalCoreList,
+    NodeConfiguration,
+)
 from framework.remote_session import CommandResult, OSSession
 from framework.settings import SETTINGS
 from framework.utils import EnvVarsDict, MesonArgs
 
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
 from .node import Node
+from .virtual_device import VirtualDevice
 
 
 class SutNode(Node):
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..060ff1b19a 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,11 @@
 
 import logging
 
-from framework import dts
+from framework import dts, settings
 
 
 def main() -> None:
+    settings.set_settings()
     dts.run_all()
 
 
diff --git a/dts/tests/TestSuite_hello_world.py b/dts/tests/TestSuite_hello_world.py
index 7e3d95c0cf..96c31a6c8c 100644
--- a/dts/tests/TestSuite_hello_world.py
+++ b/dts/tests/TestSuite_hello_world.py
@@ -6,12 +6,8 @@
 No other EAL parameters apart from cores are used.
 """
 
+from framework.config import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
 from framework.test_suite import TestSuite
-from framework.testbed_model import (
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-)
 
 
 class TestHelloWorld(TestSuite):
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v3 2/4] dts: add doc generation dependencies
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
  2023-05-11  9:14     ` [RFC PATCH v3 1/4] dts: code adjustments for sphinx Juraj Linkeš
@ 2023-05-11  9:14     ` Juraj Linkeš
  2023-05-11  9:14     ` [RFC PATCH v3 3/4] dts: add doc generation Juraj Linkeš
                       ` (3 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-11  9:14 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

Sphinx imports every Python module when generating documentation from
docstrings, so all DTS dependencies, including the Python version,
must be satisfied.
Adding Sphinx to the DTS dependencies ensures that the proper Python
version and dependencies are used when Sphinx is executed.
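
Declaring the docs tooling as an optional Poetry dependency group might
look roughly like this (an illustrative sketch; the group name and
version constraints are assumptions, not the actual patch contents):

```toml
# Hypothetical pyproject.toml fragment: an optional "docs" group so that
# Sphinx is only installed when documentation is being built.
[tool.poetry.group.docs]
optional = true

[tool.poetry.group.docs.dependencies]
sphinx = "*"
```

With such a group, `poetry install --with docs` pulls Sphinx into the
same virtual environment as DTS, so the docs build sees the same Python
version and packages that DTS itself requires.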

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/poetry.lock    | 770 +++++++++++++++++++++++++++++++++++++++++----
 dts/pyproject.toml |   7 +
 2 files changed, 710 insertions(+), 67 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index 0b2a007d4d..500f89dac1 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,24 +1,69 @@
+# This file is automatically @generated by Poetry and should not be changed by hand.
+
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+    {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
 [[package]]
 name = "attrs"
-version = "22.1.0"
+version = "22.2.0"
 description = "Classes Without Boilerplate"
 category = "main"
 optional = false
-python-versions = ">=3.5"
+python-versions = ">=3.6"
+files = [
+    {file = "attrs-22.2.0-py3-none-any.whl", hash = "sha256:29e95c7f6778868dbd49170f98f8818f78f3dc5e0e37c0b1f474e3561b240836"},
+    {file = "attrs-22.2.0.tar.gz", hash = "sha256:c9227bfc2f01993c03f68db37d1d15c9690188323c067c641f1a35ca58185f99"},
+]
 
 [package.extras]
-dev = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "furo", "sphinx", "sphinx-notfound-page", "pre-commit", "cloudpickle"]
-docs = ["furo", "sphinx", "zope.interface", "sphinx-notfound-page"]
-tests = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "zope.interface", "cloudpickle"]
-tests_no_zope = ["coverage[toml] (>=5.0.2)", "hypothesis", "pympler", "pytest (>=4.3.0)", "mypy (>=0.900,!=0.940)", "pytest-mypy-plugins", "cloudpickle"]
+cov = ["attrs[tests]", "coverage-enable-subprocess", "coverage[toml] (>=5.3)"]
+dev = ["attrs[docs,tests]"]
+docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope.interface"]
+tests = ["attrs[tests-no-zope]", "zope.interface"]
+tests-no-zope = ["cloudpickle", "cloudpickle", "hypothesis", "hypothesis", "mypy (>=0.971,<0.990)", "mypy (>=0.971,<0.990)", "pympler", "pympler", "pytest (>=4.3.0)", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-mypy-plugins", "pytest-xdist[psutil]", "pytest-xdist[psutil]"]
+
+[[package]]
+name = "babel"
+version = "2.12.1"
+description = "Internationalization utilities"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"},
+    {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"},
+]
 
 [[package]]
 name = "black"
-version = "22.10.0"
+version = "22.12.0"
 description = "The uncompromising code formatter."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "black-22.12.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9eedd20838bd5d75b80c9f5487dbcb06836a43833a37846cf1d8c1cc01cef59d"},
+    {file = "black-22.12.0-cp310-cp310-win_amd64.whl", hash = "sha256:159a46a4947f73387b4d83e87ea006dbb2337eab6c879620a3ba52699b1f4351"},
+    {file = "black-22.12.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d30b212bffeb1e252b31dd269dfae69dd17e06d92b87ad26e23890f3efea366f"},
+    {file = "black-22.12.0-cp311-cp311-win_amd64.whl", hash = "sha256:7412e75863aa5c5411886804678b7d083c7c28421210180d67dfd8cf1221e1f4"},
+    {file = "black-22.12.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c116eed0efb9ff870ded8b62fe9f28dd61ef6e9ddd28d83d7d264a38417dcee2"},
+    {file = "black-22.12.0-cp37-cp37m-win_amd64.whl", hash = "sha256:1f58cbe16dfe8c12b7434e50ff889fa479072096d79f0a7f25e4ab8e94cd8350"},
+    {file = "black-22.12.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:77d86c9f3db9b1bf6761244bc0b3572a546f5fe37917a044e02f3166d5aafa7d"},
+    {file = "black-22.12.0-cp38-cp38-win_amd64.whl", hash = "sha256:82d9fe8fee3401e02e79767016b4907820a7dc28d70d137eb397b92ef3cc5bfc"},
+    {file = "black-22.12.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:101c69b23df9b44247bd88e1d7e90154336ac4992502d4197bdac35dd7ee3320"},
+    {file = "black-22.12.0-cp39-cp39-win_amd64.whl", hash = "sha256:559c7a1ba9a006226f09e4916060982fd27334ae1998e7a38b3f33a37f7a2148"},
+    {file = "black-22.12.0-py3-none-any.whl", hash = "sha256:436cc9167dd28040ad90d3b404aec22cedf24a6e4d7de221bec2730ec0c97bcf"},
+    {file = "black-22.12.0.tar.gz", hash = "sha256:229351e5a18ca30f447bf724d007f890f97e13af070bb6ad4c0a441cd7596a2f"},
+]
 
 [package.dependencies]
 click = ">=8.0.0"
@@ -33,6 +78,103 @@ d = ["aiohttp (>=3.7.4)"]
 jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
 uvloop = ["uvloop (>=0.15.2)"]
 
+[[package]]
+name = "certifi"
+version = "2022.12.7"
+description = "Python package for providing Mozilla's CA Bundle."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "certifi-2022.12.7-py3-none-any.whl", hash = "sha256:4ad3232f5e926d6718ec31cfc1fcadfde020920e278684144551c91769c7bc18"},
+    {file = "certifi-2022.12.7.tar.gz", hash = "sha256:35824b4c3a97115964b408844d64aa14db1cc518f6562e8d7261699d1350a9e3"},
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.1.0"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+category = "dev"
+optional = false
+python-versions = ">=3.7.0"
+files = [
+    {file = "charset-normalizer-3.1.0.tar.gz", hash = "sha256:34e0a2f9c370eb95597aae63bf85eb5e96826d81e3dcf88b8886012906f509b5"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e0ac8959c929593fee38da1c2b64ee9778733cdf03c482c9ff1d508b6b593b2b"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d7fc3fca01da18fbabe4625d64bb612b533533ed10045a2ac3dd194bfa656b60"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:04eefcee095f58eaabe6dc3cc2262f3bcd776d2c67005880894f447b3f2cb9c1"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20064ead0717cf9a73a6d1e779b23d149b53daf971169289ed2ed43a71e8d3b0"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1435ae15108b1cb6fffbcea2af3d468683b7afed0169ad718451f8db5d1aff6f"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c84132a54c750fda57729d1e2599bb598f5fa0344085dbde5003ba429a4798c0"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f2568b4189dda1c567339b48cba4ac7384accb9c2a7ed655cd86b04055c795"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:11d3bcb7be35e7b1bba2c23beedac81ee893ac9871d0ba79effc7fc01167db6c"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:891cf9b48776b5c61c700b55a598621fdb7b1e301a550365571e9624f270c203"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:5f008525e02908b20e04707a4f704cd286d94718f48bb33edddc7d7b584dddc1"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:b06f0d3bf045158d2fb8837c5785fe9ff9b8c93358be64461a1089f5da983137"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:49919f8400b5e49e961f320c735388ee686a62327e773fa5b3ce6721f7e785ce"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:22908891a380d50738e1f978667536f6c6b526a2064156203d418f4856d6e86a"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-win32.whl", hash = "sha256:12d1a39aa6b8c6f6248bb54550efcc1c38ce0d8096a146638fd4738e42284448"},
+    {file = "charset_normalizer-3.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:65ed923f84a6844de5fd29726b888e58c62820e0769b76565480e1fdc3d062f8"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9a3267620866c9d17b959a84dd0bd2d45719b817245e49371ead79ed4f710d19"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6734e606355834f13445b6adc38b53c0fd45f1a56a9ba06c2058f86893ae8017"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f8303414c7b03f794347ad062c0516cee0e15f7a612abd0ce1e25caf6ceb47df"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aaf53a6cebad0eae578f062c7d462155eada9c172bd8c4d250b8c1d8eb7f916a"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3dc5b6a8ecfdc5748a7e429782598e4f17ef378e3e272eeb1340ea57c9109f41"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e1b25e3ad6c909f398df8921780d6a3d120d8c09466720226fc621605b6f92b1"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0ca564606d2caafb0abe6d1b5311c2649e8071eb241b2d64e75a0d0065107e62"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b82fab78e0b1329e183a65260581de4375f619167478dddab510c6c6fb04d9b6"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:bd7163182133c0c7701b25e604cf1611c0d87712e56e88e7ee5d72deab3e76b5"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:11d117e6c63e8f495412d37e7dc2e2fff09c34b2d09dbe2bee3c6229577818be"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:cf6511efa4801b9b38dc5546d7547d5b5c6ef4b081c60b23e4d941d0eba9cbeb"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:abc1185d79f47c0a7aaf7e2412a0eb2c03b724581139193d2d82b3ad8cbb00ac"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:cb7b2ab0188829593b9de646545175547a70d9a6e2b63bf2cd87a0a391599324"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-win32.whl", hash = "sha256:c36bcbc0d5174a80d6cccf43a0ecaca44e81d25be4b7f90f0ed7bcfbb5a00909"},
+    {file = "charset_normalizer-3.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:cca4def576f47a09a943666b8f829606bcb17e2bc2d5911a46c8f8da45f56755"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:0c95f12b74681e9ae127728f7e5409cbbef9cd914d5896ef238cc779b8152373"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fca62a8301b605b954ad2e9c3666f9d97f63872aa4efcae5492baca2056b74ab"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ac0aa6cd53ab9a31d397f8303f92c42f534693528fafbdb997c82bae6e477ad9"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c3af8e0f07399d3176b179f2e2634c3ce9c1301379a6b8c9c9aeecd481da494f"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a5fc78f9e3f501a1614a98f7c54d3969f3ad9bba8ba3d9b438c3bc5d047dd28"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:628c985afb2c7d27a4800bfb609e03985aaecb42f955049957814e0491d4006d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:74db0052d985cf37fa111828d0dd230776ac99c740e1a758ad99094be4f1803d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1e8fcdd8f672a1c4fc8d0bd3a2b576b152d2a349782d1eb0f6b8e52e9954731d"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:04afa6387e2b282cf78ff3dbce20f0cc071c12dc8f685bd40960cc68644cfea6"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:dd5653e67b149503c68c4018bf07e42eeed6b4e956b24c00ccdf93ac79cdff84"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:d2686f91611f9e17f4548dbf050e75b079bbc2a82be565832bc8ea9047b61c8c"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-win32.whl", hash = "sha256:4155b51ae05ed47199dc5b2a4e62abccb274cee6b01da5b895099b61b1982974"},
+    {file = "charset_normalizer-3.1.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322102cdf1ab682ecc7d9b1c5eed4ec59657a65e1c146a0da342b78f4112db23"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e633940f28c1e913615fd624fcdd72fdba807bf53ea6925d6a588e84e1151531"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3a06f32c9634a8705f4ca9946d667609f52cf130d5548881401f1eb2c39b1e2c"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:7381c66e0561c5757ffe616af869b916c8b4e42b367ab29fedc98481d1e74e14"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3573d376454d956553c356df45bb824262c397c6e26ce43e8203c4c540ee0acb"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e89df2958e5159b811af9ff0f92614dabf4ff617c03a4c1c6ff53bf1c399e0e1"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:78cacd03e79d009d95635e7d6ff12c21eb89b894c354bd2b2ed0b4763373693b"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de5695a6f1d8340b12a5d6d4484290ee74d61e467c39ff03b39e30df62cf83a0"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1c60b9c202d00052183c9be85e5eaf18a4ada0a47d188a83c8f5c5b23252f649"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f645caaf0008bacf349875a974220f1f1da349c5dbe7c4ec93048cdc785a3326"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ea9f9c6034ea2d93d9147818f17c2a0860d41b71c38b9ce4d55f21b6f9165a11"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:80d1543d58bd3d6c271b66abf454d437a438dff01c3e62fdbcd68f2a11310d4b"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:73dc03a6a7e30b7edc5b01b601e53e7fc924b04e1835e8e407c12c037e81adbd"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f5c2e7bc8a4bf7c426599765b1bd33217ec84023033672c1e9a8b35eaeaaaf8"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-win32.whl", hash = "sha256:12a2b561af122e3d94cdb97fe6fb2bb2b82cef0cdca131646fdb940a1eda04f0"},
+    {file = "charset_normalizer-3.1.0-cp38-cp38-win_amd64.whl", hash = "sha256:3160a0fd9754aab7d47f95a6b63ab355388d890163eb03b2d2b87ab0a30cfa59"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:38e812a197bf8e71a59fe55b757a84c1f946d0ac114acafaafaf21667a7e169e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6baf0baf0d5d265fa7944feb9f7451cc316bfe30e8df1a61b1bb08577c554f31"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:8f25e17ab3039b05f762b0a55ae0b3632b2e073d9c8fc88e89aca31a6198e88f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3747443b6a904001473370d7810aa19c3a180ccd52a7157aacc264a5ac79265e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b116502087ce8a6b7a5f1814568ccbd0e9f6cfd99948aa59b0e241dc57cf739f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d16fd5252f883eb074ca55cb622bc0bee49b979ae4e8639fff6ca3ff44f9f854"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21fa558996782fc226b529fdd2ed7866c2c6ec91cee82735c98a197fae39f706"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6f6c7a8a57e9405cad7485f4c9d3172ae486cfef1344b5ddd8e5239582d7355e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ac3775e3311661d4adace3697a52ac0bab17edd166087d493b52d4f4f553f9f0"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:10c93628d7497c81686e8e5e557aafa78f230cd9e77dd0c40032ef90c18f2230"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:6f4f4668e1831850ebcc2fd0b1cd11721947b6dc7c00bf1c6bd3c929ae14f2c7"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:0be65ccf618c1e7ac9b849c315cc2e8a8751d9cfdaa43027d4f6624bd587ab7e"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:53d0a3fa5f8af98a1e261de6a3943ca631c526635eb5817a87a59d9a57ebf48f"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-win32.whl", hash = "sha256:a04f86f41a8916fe45ac5024ec477f41f886b3c435da2d4e3d2709b22ab02af1"},
+    {file = "charset_normalizer-3.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:830d2948a5ec37c386d3170c483063798d7879037492540f10a475e3fd6f244b"},
+    {file = "charset_normalizer-3.1.0-py3-none-any.whl", hash = "sha256:3d9098b479e78c85080c98e1e35ff40b4a31d8953102bb0fd7d1b6f8a2111a3d"},
+]
+
 [[package]]
 name = "click"
 version = "8.1.3"
@@ -40,6 +182,10 @@ description = "Composable command line interface toolkit"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "click-8.1.3-py3-none-any.whl", hash = "sha256:bb4d8133cb15a609f44e8213d9b391b0809795062913b383c62be0ee95b1db48"},
+    {file = "click-8.1.3.tar.gz", hash = "sha256:7682dc8afb30297001674575ea00d1814d808d6a36af415a82bd481d37ba7b8e"},
+]
 
 [package.dependencies]
 colorama = {version = "*", markers = "platform_system == \"Windows\""}
@@ -51,20 +197,82 @@ description = "Cross-platform colored terminal text."
 category = "dev"
 optional = false
 python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
+files = [
+    {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
+    {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
+]
+
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+    {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+    {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+    {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+    {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
 
 [[package]]
 name = "isort"
-version = "5.10.1"
+version = "5.12.0"
 description = "A Python utility / library to sort Python imports."
 category = "dev"
 optional = false
-python-versions = ">=3.6.1,<4.0"
+python-versions = ">=3.8.0"
+files = [
+    {file = "isort-5.12.0-py3-none-any.whl", hash = "sha256:f84c2818376e66cf843d497486ea8fed8700b340f308f076c6fb1229dff318b6"},
+    {file = "isort-5.12.0.tar.gz", hash = "sha256:8bef7dde241278824a6d83f44a544709b065191b95b6e50894bdc722fcba0504"},
+]
 
 [package.extras]
-pipfile_deprecated_finder = ["pipreqs", "requirementslib"]
-requirements_deprecated_finder = ["pipreqs", "pip-api"]
-colors = ["colorama (>=0.4.3,<0.5.0)"]
+colors = ["colorama (>=0.4.3)"]
+pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"]
 plugins = ["setuptools"]
+requirements-deprecated-finder = ["pip-api", "pipreqs"]
+
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+    {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
 
 [[package]]
 name = "jsonpatch"
@@ -73,6 +281,10 @@ description = "Apply JSON-Patches (RFC 6902)"
 category = "main"
 optional = false
 python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "jsonpatch-1.32-py2.py3-none-any.whl", hash = "sha256:26ac385719ac9f54df8a2f0827bb8253aa3ea8ab7b3368457bcdb8c14595a397"},
+    {file = "jsonpatch-1.32.tar.gz", hash = "sha256:b6ddfe6c3db30d81a96aaeceb6baf916094ffa23d7dd5fa2c13e13f8b6e600c2"},
+]
 
 [package.dependencies]
 jsonpointer = ">=1.9"
@@ -84,14 +296,22 @@ description = "Identify specific nodes in a JSON document (RFC 6901)"
 category = "main"
 optional = false
 python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+    {file = "jsonpointer-2.3-py2.py3-none-any.whl", hash = "sha256:51801e558539b4e9cd268638c078c6c5746c9ac96bc38152d443400e4f3793e9"},
+    {file = "jsonpointer-2.3.tar.gz", hash = "sha256:97cba51526c829282218feb99dab1b1e6bdf8efd1c43dc9d57be093c0d69c99a"},
+]
 
 [[package]]
 name = "jsonschema"
-version = "4.17.0"
+version = "4.17.3"
 description = "An implementation of JSON Schema validation for Python"
 category = "main"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "jsonschema-4.17.3-py3-none-any.whl", hash = "sha256:a870ad254da1a8ca84b6a2905cac29d265f805acc57af304784962a2aa6508f6"},
+    {file = "jsonschema-4.17.3.tar.gz", hash = "sha256:0f864437ab8b6076ba6707453ef8f98a6a0d512a80e93f8abdb676f737ecb60d"},
+]
 
 [package.dependencies]
 attrs = ">=17.4.0"
@@ -101,6 +321,66 @@ pyrsistent = ">=0.14.0,<0.17.0 || >0.17.0,<0.17.1 || >0.17.1,<0.17.2 || >0.17.2"
 format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"]
 format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=1.11)"]
 
+[[package]]
+name = "markupsafe"
+version = "2.1.2"
+description = "Safely add untrusted strings to HTML/XML markup."
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:665a36ae6f8f20a4676b53224e33d456a6f5a72657d9c83c2aa00765072f31f7"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:340bea174e9761308703ae988e982005aedf427de816d1afe98147668cc03036"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22152d00bf4a9c7c83960521fc558f55a1adbc0631fbb00a9471e097b19d72e1"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28057e985dace2f478e042eaa15606c7efccb700797660629da387eb289b9323"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca244fa73f50a800cf8c3ebf7fd93149ec37f5cb9596aa8873ae2c1d23498601"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d9d971ec1e79906046aa3ca266de79eac42f1dbf3612a05dc9368125952bd1a1"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:7e007132af78ea9df29495dbf7b5824cb71648d7133cf7848a2a5dd00d36f9ff"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:7313ce6a199651c4ed9d7e4cfb4aa56fe923b1adf9af3b420ee14e6d9a73df65"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-win32.whl", hash = "sha256:c4a549890a45f57f1ebf99c067a4ad0cb423a05544accaf2b065246827ed9603"},
+    {file = "MarkupSafe-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:835fb5e38fd89328e9c81067fd642b3593c33e1e17e2fdbf77f5676abb14a156"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:2ec4f2d48ae59bbb9d1f9d7efb9236ab81429a764dedca114f5fdabbc3788013"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:608e7073dfa9e38a85d38474c082d4281f4ce276ac0010224eaba11e929dd53a"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:65608c35bfb8a76763f37036547f7adfd09270fbdbf96608be2bead319728fcd"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2bfb563d0211ce16b63c7cb9395d2c682a23187f54c3d79bfec33e6705473c6"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:da25303d91526aac3672ee6d49a2f3db2d9502a4a60b55519feb1a4c7714e07d"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:9cad97ab29dfc3f0249b483412c85c8ef4766d96cdf9dcf5a1e3caa3f3661cf1"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:085fd3201e7b12809f9e6e9bc1e5c96a368c8523fad5afb02afe3c051ae4afcc"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1bea30e9bf331f3fef67e0a3877b2288593c98a21ccb2cf29b74c581a4eb3af0"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-win32.whl", hash = "sha256:7df70907e00c970c60b9ef2938d894a9381f38e6b9db73c5be35e59d92e06625"},
+    {file = "MarkupSafe-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:e55e40ff0cc8cc5c07996915ad367fa47da6b3fc091fdadca7f5403239c5fec3"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:a6e40afa7f45939ca356f348c8e23048e02cb109ced1eb8420961b2f40fb373a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cf877ab4ed6e302ec1d04952ca358b381a882fbd9d1b07cccbfd61783561f98a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63ba06c9941e46fa389d389644e2d8225e0e3e5ebcc4ff1ea8506dce646f8c8a"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f1cd098434e83e656abf198f103a8207a8187c0fc110306691a2e94a78d0abb2"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:55f44b440d491028addb3b88f72207d71eeebfb7b5dbf0643f7c023ae1fba619"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:a6f2fcca746e8d5910e18782f976489939d54a91f9411c32051b4aab2bd7c513"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:0b462104ba25f1ac006fdab8b6a01ebbfbce9ed37fd37fd4acd70c67c973e460"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-win32.whl", hash = "sha256:7668b52e102d0ed87cb082380a7e2e1e78737ddecdde129acadb0eccc5423859"},
+    {file = "MarkupSafe-2.1.2-cp37-cp37m-win_amd64.whl", hash = "sha256:6d6607f98fcf17e534162f0709aaad3ab7a96032723d8ac8750ffe17ae5a0666"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a806db027852538d2ad7555b203300173dd1b77ba116de92da9afbc3a3be3eed"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:a4abaec6ca3ad8660690236d11bfe28dfd707778e2442b45addd2f086d6ef094"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f03a532d7dee1bed20bc4884194a16160a2de9ffc6354b3878ec9682bb623c54"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4cf06cdc1dda95223e9d2d3c58d3b178aa5dacb35ee7e3bbac10e4e1faacb419"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:22731d79ed2eb25059ae3df1dfc9cb1546691cc41f4e3130fe6bfbc3ecbbecfa"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:f8ffb705ffcf5ddd0e80b65ddf7bed7ee4f5a441ea7d3419e861a12eaf41af58"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8db032bf0ce9022a8e41a22598eefc802314e81b879ae093f36ce9ddf39ab1ba"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:2298c859cfc5463f1b64bd55cb3e602528db6fa0f3cfd568d3605c50678f8f03"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-win32.whl", hash = "sha256:50c42830a633fa0cf9e7d27664637532791bfc31c731a87b202d2d8ac40c3ea2"},
+    {file = "MarkupSafe-2.1.2-cp38-cp38-win_amd64.whl", hash = "sha256:bb06feb762bade6bf3c8b844462274db0c76acc95c52abe8dbed28ae3d44a147"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:99625a92da8229df6d44335e6fcc558a5037dd0a760e11d84be2260e6f37002f"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8bca7e26c1dd751236cfb0c6c72d4ad61d986e9a41bbf76cb445f69488b2a2bd"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40627dcf047dadb22cd25ea7ecfe9cbf3bbbad0482ee5920b582f3809c97654f"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:40dfd3fefbef579ee058f139733ac336312663c6706d1163b82b3003fb1925c4"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:090376d812fb6ac5f171e5938e82e7f2d7adc2b629101cec0db8b267815c85e2"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:2e7821bffe00aa6bd07a23913b7f4e01328c3d5cc0b40b36c0bd81d362faeb65"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:c0a33bc9f02c2b17c3ea382f91b4db0e6cde90b63b296422a939886a7a80de1c"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:b8526c6d437855442cdd3d87eede9c425c4445ea011ca38d937db299382e6fa3"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-win32.whl", hash = "sha256:137678c63c977754abe9086a3ec011e8fd985ab90631145dfb9294ad09c102a7"},
+    {file = "MarkupSafe-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:0576fe974b40a400449768941d5d0858cc624e3249dfd1e0c33674e5c7ca7aed"},
+    {file = "MarkupSafe-2.1.2.tar.gz", hash = "sha256:abcabc8c2b26036d62d4c746381a6f7cf60aafcc653198ad678306986b09450d"},
+]
+
 [[package]]
 name = "mccabe"
 version = "0.7.0"
@@ -108,6 +388,10 @@ description = "McCabe checker, plugin for flake8"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e"},
+    {file = "mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325"},
+]
 
 [[package]]
 name = "mypy"
@@ -116,6 +400,31 @@ description = "Optional static typing for Python"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "mypy-0.961-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:697540876638ce349b01b6786bc6094ccdaba88af446a9abb967293ce6eaa2b0"},
+    {file = "mypy-0.961-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b117650592e1782819829605a193360a08aa99f1fc23d1d71e1a75a142dc7e15"},
+    {file = "mypy-0.961-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bdd5ca340beffb8c44cb9dc26697628d1b88c6bddf5c2f6eb308c46f269bb6f3"},
+    {file = "mypy-0.961-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:3e09f1f983a71d0672bbc97ae33ee3709d10c779beb613febc36805a6e28bb4e"},
+    {file = "mypy-0.961-cp310-cp310-win_amd64.whl", hash = "sha256:e999229b9f3198c0c880d5e269f9f8129c8862451ce53a011326cad38b9ccd24"},
+    {file = "mypy-0.961-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:b24be97351084b11582fef18d79004b3e4db572219deee0212078f7cf6352723"},
+    {file = "mypy-0.961-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f4a21d01fc0ba4e31d82f0fff195682e29f9401a8bdb7173891070eb260aeb3b"},
+    {file = "mypy-0.961-cp36-cp36m-win_amd64.whl", hash = "sha256:439c726a3b3da7ca84a0199a8ab444cd8896d95012c4a6c4a0d808e3147abf5d"},
+    {file = "mypy-0.961-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:5a0b53747f713f490affdceef835d8f0cb7285187a6a44c33821b6d1f46ed813"},
+    {file = "mypy-0.961-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:0e9f70df36405c25cc530a86eeda1e0867863d9471fe76d1273c783df3d35c2e"},
+    {file = "mypy-0.961-cp37-cp37m-win_amd64.whl", hash = "sha256:b88f784e9e35dcaa075519096dc947a388319cb86811b6af621e3523980f1c8a"},
+    {file = "mypy-0.961-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:d5aaf1edaa7692490f72bdb9fbd941fbf2e201713523bdb3f4038be0af8846c6"},
+    {file = "mypy-0.961-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:9f5f5a74085d9a81a1f9c78081d60a0040c3efb3f28e5c9912b900adf59a16e6"},
+    {file = "mypy-0.961-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f4b794db44168a4fc886e3450201365c9526a522c46ba089b55e1f11c163750d"},
+    {file = "mypy-0.961-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:64759a273d590040a592e0f4186539858c948302c653c2eac840c7a3cd29e51b"},
+    {file = "mypy-0.961-cp38-cp38-win_amd64.whl", hash = "sha256:63e85a03770ebf403291ec50097954cc5caf2a9205c888ce3a61bd3f82e17569"},
+    {file = "mypy-0.961-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:5f1332964963d4832a94bebc10f13d3279be3ce8f6c64da563d6ee6e2eeda932"},
+    {file = "mypy-0.961-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:006be38474216b833eca29ff6b73e143386f352e10e9c2fbe76aa8549e5554f5"},
+    {file = "mypy-0.961-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:9940e6916ed9371809b35b2154baf1f684acba935cd09928952310fbddaba648"},
+    {file = "mypy-0.961-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:a5ea0875a049de1b63b972456542f04643daf320d27dc592d7c3d9cd5d9bf950"},
+    {file = "mypy-0.961-cp39-cp39-win_amd64.whl", hash = "sha256:1ece702f29270ec6af25db8cf6185c04c02311c6bb21a69f423d40e527b75c56"},
+    {file = "mypy-0.961-py3-none-any.whl", hash = "sha256:03c6cc893e7563e7b2949b969e63f02c000b32502a1b4d1314cabe391aa87d66"},
+    {file = "mypy-0.961.tar.gz", hash = "sha256:f730d56cb924d371c26b8eaddeea3cc07d78ff51c521c6d04899ac6904b75492"},
+]
 
 [package.dependencies]
 mypy-extensions = ">=0.4.3"
@@ -129,19 +438,39 @@ reports = ["lxml"]
 
 [[package]]
 name = "mypy-extensions"
-version = "0.4.3"
-description = "Experimental type system extensions for programs checked with the mypy typechecker."
+version = "1.0.0"
+description = "Type system extensions for programs checked with the mypy type checker."
 category = "dev"
 optional = false
-python-versions = "*"
+python-versions = ">=3.5"
+files = [
+    {file = "mypy_extensions-1.0.0-py3-none-any.whl", hash = "sha256:4392f6c0eb8a5668a69e23d168ffa70f0be9ccfd32b5cc2d26a34ae5b844552d"},
+    {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
+]
+
+[[package]]
+name = "packaging"
+version = "23.0"
+description = "Core utilities for Python packages"
+category = "dev"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "packaging-23.0-py3-none-any.whl", hash = "sha256:714ac14496c3e68c99c29b00845f7a2b85f3bb6f1078fd9f72fd20f0570002b2"},
+    {file = "packaging-23.0.tar.gz", hash = "sha256:b6ad297f8907de0fa2fe1ccbd26fdaf387f5f47c7275fedf8cce89f99446cf97"},
+]
 
 [[package]]
 name = "pathspec"
-version = "0.10.1"
+version = "0.11.1"
 description = "Utility library for gitignore style pattern matching of file paths."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pathspec-0.11.1-py3-none-any.whl", hash = "sha256:d8af70af76652554bd134c22b3e8a1cc46ed7d91edcdd721ef1a0c51a84a5293"},
+    {file = "pathspec-0.11.1.tar.gz", hash = "sha256:2798de800fa92780e33acca925945e9a19a133b715067cf165b8866c15a31687"},
+]
 
 [[package]]
 name = "pexpect"
@@ -150,21 +479,29 @@ description = "Pexpect allows easy control of interactive console applications."
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
+    {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
+]
 
 [package.dependencies]
 ptyprocess = ">=0.5"
 
 [[package]]
 name = "platformdirs"
-version = "2.5.2"
-description = "A small Python module for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
+version = "3.1.1"
+description = "A small Python package for determining appropriate platform-specific dirs, e.g. a \"user data dir\"."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "platformdirs-3.1.1-py3-none-any.whl", hash = "sha256:e5986afb596e4bb5bde29a79ac9061aa955b94fca2399b7aaac4090860920dd8"},
+    {file = "platformdirs-3.1.1.tar.gz", hash = "sha256:024996549ee88ec1a9aa99ff7f8fc819bb59e2c3477b410d90a16d32d6e707aa"},
+]
 
 [package.extras]
-docs = ["furo (>=2021.7.5b38)", "proselint (>=0.10.2)", "sphinx-autodoc-typehints (>=1.12)", "sphinx (>=4)"]
-test = ["appdirs (==1.4.4)", "pytest-cov (>=2.7)", "pytest-mock (>=3.6)", "pytest (>=6)"]
+docs = ["furo (>=2022.12.7)", "proselint (>=0.13)", "sphinx (>=6.1.3)", "sphinx-autodoc-typehints (>=1.22,!=1.23.4)"]
+test = ["appdirs (==1.4.4)", "covdefaults (>=2.2.2)", "pytest (>=7.2.1)", "pytest-cov (>=4)", "pytest-mock (>=3.10)"]
 
 [[package]]
 name = "ptyprocess"
@@ -173,28 +510,40 @@ description = "Run a subprocess in a pseudo terminal"
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35"},
+    {file = "ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220"},
+]
 
 [[package]]
 name = "pycodestyle"
-version = "2.9.1"
+version = "2.10.0"
 description = "Python style guide checker"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pycodestyle-2.10.0-py2.py3-none-any.whl", hash = "sha256:8a4eaf0d0495c7395bdab3589ac2db602797d76207242c17d470186815706610"},
+    {file = "pycodestyle-2.10.0.tar.gz", hash = "sha256:347187bdb476329d98f695c213d7295a846d1152ff4fe9bacb8a9590b8ee7053"},
+]
 
 [[package]]
 name = "pydocstyle"
-version = "6.1.1"
+version = "6.3.0"
 description = "Python docstring style checker"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
+    {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+]
 
 [package.dependencies]
-snowballstemmer = "*"
+snowballstemmer = ">=2.2.0"
 
 [package.extras]
-toml = ["toml"]
+toml = ["tomli (>=1.2.3)"]
 
 [[package]]
 name = "pyflakes"
@@ -203,6 +552,25 @@ description = "passive checker of Python programs"
 category = "dev"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "pyflakes-2.5.0-py2.py3-none-any.whl", hash = "sha256:4579f67d887f804e67edb544428f264b7b24f435b263c4614f384135cea553d2"},
+    {file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
+]
+
+[[package]]
+name = "pygments"
+version = "2.14.0"
+description = "Pygments is a syntax highlighting package written in Python."
+category = "dev"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "Pygments-2.14.0-py3-none-any.whl", hash = "sha256:fa7bd7bd2771287c0de303af8bfdfc731f51bd2c6a47ab69d117138893b82717"},
+    {file = "Pygments-2.14.0.tar.gz", hash = "sha256:b3ed06a9e8ac9a9aae5a6f5dbe78a8a58655d17b43b93c078f094ddc476ae297"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
 
 [[package]]
 name = "pylama"
@@ -211,6 +579,10 @@ description = "Code audit tool for python"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pylama-8.4.1-py3-none-any.whl", hash = "sha256:5bbdbf5b620aba7206d688ed9fc917ecd3d73e15ec1a89647037a09fa3a86e60"},
+    {file = "pylama-8.4.1.tar.gz", hash = "sha256:2d4f7aecfb5b7466216d48610c7d6bad1c3990c29cdd392ad08259b161e486f6"},
+]
 
 [package.dependencies]
 mccabe = ">=0.7.0"
@@ -219,22 +591,51 @@ pydocstyle = ">=6.1.1"
 pyflakes = ">=2.5.0"
 
 [package.extras]
-all = ["pylint", "eradicate", "radon", "mypy", "vulture"]
+all = ["eradicate", "mypy", "pylint", "radon", "vulture"]
 eradicate = ["eradicate"]
 mypy = ["mypy"]
 pylint = ["pylint"]
 radon = ["radon"]
-tests = ["pytest (>=7.1.2)", "pytest-mypy", "eradicate (>=2.0.0)", "radon (>=5.1.0)", "mypy", "pylint (>=2.11.1)", "pylama-quotes", "toml", "vulture", "types-setuptools", "types-toml"]
+tests = ["eradicate (>=2.0.0)", "mypy", "pylama-quotes", "pylint (>=2.11.1)", "pytest (>=7.1.2)", "pytest-mypy", "radon (>=5.1.0)", "toml", "types-setuptools", "types-toml", "vulture"]
 toml = ["toml (>=0.10.2)"]
 vulture = ["vulture"]
 
 [[package]]
 name = "pyrsistent"
-version = "0.19.1"
+version = "0.19.3"
 description = "Persistent/Functional/Immutable data structures"
 category = "main"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "pyrsistent-0.19.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:20460ac0ea439a3e79caa1dbd560344b64ed75e85d8703943e0b66c2a6150e4a"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4c18264cb84b5e68e7085a43723f9e4c1fd1d935ab240ce02c0324a8e01ccb64"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4b774f9288dda8d425adb6544e5903f1fb6c273ab3128a355c6b972b7df39dcf"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-win32.whl", hash = "sha256:5a474fb80f5e0d6c9394d8db0fc19e90fa540b82ee52dba7d246a7791712f74a"},
+    {file = "pyrsistent-0.19.3-cp310-cp310-win_amd64.whl", hash = "sha256:49c32f216c17148695ca0e02a5c521e28a4ee6c5089f97e34fe24163113722da"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:f0774bf48631f3a20471dd7c5989657b639fd2d285b861237ea9e82c36a415a9"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ab2204234c0ecd8b9368dbd6a53e83c3d4f3cab10ecaf6d0e772f456c442393"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e42296a09e83028b3476f7073fcb69ffebac0e66dbbfd1bd847d61f74db30f19"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-win32.whl", hash = "sha256:64220c429e42a7150f4bfd280f6f4bb2850f95956bde93c6fda1b70507af6ef3"},
+    {file = "pyrsistent-0.19.3-cp311-cp311-win_amd64.whl", hash = "sha256:016ad1afadf318eb7911baa24b049909f7f3bb2c5b1ed7b6a8f21db21ea3faa8"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c4db1bd596fefd66b296a3d5d943c94f4fac5bcd13e99bffe2ba6a759d959a28"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:aeda827381f5e5d65cced3024126529ddc4289d944f75e090572c77ceb19adbf"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:42ac0b2f44607eb92ae88609eda931a4f0dfa03038c44c772e07f43e738bcac9"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-win32.whl", hash = "sha256:e8f2b814a3dc6225964fa03d8582c6e0b6650d68a232df41e3cc1b66a5d2f8d1"},
+    {file = "pyrsistent-0.19.3-cp37-cp37m-win_amd64.whl", hash = "sha256:c9bb60a40a0ab9aba40a59f68214eed5a29c6274c83b2cc206a359c4a89fa41b"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:a2471f3f8693101975b1ff85ffd19bb7ca7dd7c38f8a81701f67d6b4f97b87d8"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc5d149f31706762c1f8bda2e8c4f8fead6e80312e3692619a75301d3dbb819a"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3311cb4237a341aa52ab8448c27e3a9931e2ee09561ad150ba94e4cfd3fc888c"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-win32.whl", hash = "sha256:f0e7c4b2f77593871e918be000b96c8107da48444d57005b6a6bc61fb4331b2c"},
+    {file = "pyrsistent-0.19.3-cp38-cp38-win_amd64.whl", hash = "sha256:c147257a92374fde8498491f53ffa8f4822cd70c0d85037e09028e478cababb7"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b735e538f74ec31378f5a1e3886a26d2ca6351106b4dfde376a26fc32a044edc"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99abb85579e2165bd8522f0c0138864da97847875ecbd45f3e7e2af569bfc6f2"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3a8cb235fa6d3fd7aae6a4f1429bbb1fec1577d978098da1252f0489937786f3"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-win32.whl", hash = "sha256:c74bed51f9b41c48366a286395c67f4e894374306b197e62810e0fdaf2364da2"},
+    {file = "pyrsistent-0.19.3-cp39-cp39-win_amd64.whl", hash = "sha256:878433581fc23e906d947a6814336eee031a00e6defba224234169ae3d3d6a98"},
+    {file = "pyrsistent-0.19.3-py3-none-any.whl", hash = "sha256:ccf0d6bd208f8111179f0c26fdf84ed7c3891982f2edaeae7422575f47e66b64"},
+    {file = "pyrsistent-0.19.3.tar.gz", hash = "sha256:1a2994773706bbb4995c31a97bc94f1418314923bd1048c6d964837040376440"},
+]
 
 [[package]]
 name = "pyyaml"
@@ -243,6 +644,70 @@ description = "YAML parser and emitter for Python"
 category = "main"
 optional = false
 python-versions = ">=3.6"
+files = [
+    {file = "PyYAML-6.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:d4db7c7aef085872ef65a8fd7d6d09a14ae91f691dec3e87ee5ee0539d516f53"},
+    {file = "PyYAML-6.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9df7ed3b3d2e0ecfe09e14741b857df43adb5a3ddadc919a2d94fbdf78fea53c"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77f396e6ef4c73fdc33a9157446466f1cff553d979bd00ecb64385760c6babdc"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a80a78046a72361de73f8f395f1f1e49f956c6be882eed58505a15f3e430962b"},
+    {file = "PyYAML-6.0-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:f84fbc98b019fef2ee9a1cb3ce93e3187a6df0b2538a651bfb890254ba9f90b5"},
+    {file = "PyYAML-6.0-cp310-cp310-win32.whl", hash = "sha256:2cd5df3de48857ed0544b34e2d40e9fac445930039f3cfe4bcc592a1f836d513"},
+    {file = "PyYAML-6.0-cp310-cp310-win_amd64.whl", hash = "sha256:daf496c58a8c52083df09b80c860005194014c3698698d1a57cbcfa182142a3a"},
+    {file = "PyYAML-6.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d4b0ba9512519522b118090257be113b9468d804b19d63c71dbcf4a48fa32358"},
+    {file = "PyYAML-6.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:81957921f441d50af23654aa6c5e5eaf9b06aba7f0a19c18a538dc7ef291c5a1"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:afa17f5bc4d1b10afd4466fd3a44dc0e245382deca5b3c353d8b757f9e3ecb8d"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dbad0e9d368bb989f4515da330b88a057617d16b6a8245084f1b05400f24609f"},
+    {file = "PyYAML-6.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:432557aa2c09802be39460360ddffd48156e30721f5e8d917f01d31694216782"},
+    {file = "PyYAML-6.0-cp311-cp311-win32.whl", hash = "sha256:bfaef573a63ba8923503d27530362590ff4f576c626d86a9fed95822a8255fd7"},
+    {file = "PyYAML-6.0-cp311-cp311-win_amd64.whl", hash = "sha256:01b45c0191e6d66c470b6cf1b9531a771a83c1c4208272ead47a3ae4f2f603bf"},
+    {file = "PyYAML-6.0-cp36-cp36m-macosx_10_9_x86_64.whl", hash = "sha256:897b80890765f037df3403d22bab41627ca8811ae55e9a722fd0392850ec4d86"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50602afada6d6cbfad699b0c7bb50d5ccffa7e46a3d738092afddc1f9758427f"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:48c346915c114f5fdb3ead70312bd042a953a8ce5c7106d5bfb1a5254e47da92"},
+    {file = "PyYAML-6.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:98c4d36e99714e55cfbaaee6dd5badbc9a1ec339ebfc3b1f52e293aee6bb71a4"},
+    {file = "PyYAML-6.0-cp36-cp36m-win32.whl", hash = "sha256:0283c35a6a9fbf047493e3a0ce8d79ef5030852c51e9d911a27badfde0605293"},
+    {file = "PyYAML-6.0-cp36-cp36m-win_amd64.whl", hash = "sha256:07751360502caac1c067a8132d150cf3d61339af5691fe9e87803040dbc5db57"},
+    {file = "PyYAML-6.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:819b3830a1543db06c4d4b865e70ded25be52a2e0631ccd2f6a47a2822f2fd7c"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:473f9edb243cb1935ab5a084eb238d842fb8f404ed2193a915d1784b5a6b5fc0"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0ce82d761c532fe4ec3f87fc45688bdd3a4c1dc5e0b4a19814b9009a29baefd4"},
+    {file = "PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:231710d57adfd809ef5d34183b8ed1eeae3f76459c18fb4a0b373ad56bedcdd9"},
+    {file = "PyYAML-6.0-cp37-cp37m-win32.whl", hash = "sha256:c5687b8d43cf58545ade1fe3e055f70eac7a5a1a0bf42824308d868289a95737"},
+    {file = "PyYAML-6.0-cp37-cp37m-win_amd64.whl", hash = "sha256:d15a181d1ecd0d4270dc32edb46f7cb7733c7c508857278d3d378d14d606db2d"},
+    {file = "PyYAML-6.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0b4624f379dab24d3725ffde76559cff63d9ec94e1736b556dacdfebe5ab6d4b"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:213c60cd50106436cc818accf5baa1aba61c0189ff610f64f4a3e8c6726218ba"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9fa600030013c4de8165339db93d182b9431076eb98eb40ee068700c9c813e34"},
+    {file = "PyYAML-6.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:277a0ef2981ca40581a47093e9e2d13b3f1fbbeffae064c1d21bfceba2030287"},
+    {file = "PyYAML-6.0-cp38-cp38-win32.whl", hash = "sha256:d4eccecf9adf6fbcc6861a38015c2a64f38b9d94838ac1810a9023a0609e1b78"},
+    {file = "PyYAML-6.0-cp38-cp38-win_amd64.whl", hash = "sha256:1e4747bc279b4f613a09eb64bba2ba602d8a6664c6ce6396a4d0cd413a50ce07"},
+    {file = "PyYAML-6.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:055d937d65826939cb044fc8c9b08889e8c743fdc6a32b33e2390f66013e449b"},
+    {file = "PyYAML-6.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e61ceaab6f49fb8bdfaa0f92c4b57bcfbea54c09277b1b4f7ac376bfb7a7c174"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d67d839ede4ed1b28a4e8909735fc992a923cdb84e618544973d7dfc71540803"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cba8c411ef271aa037d7357a2bc8f9ee8b58b9965831d9e51baf703280dc73d3"},
+    {file = "PyYAML-6.0-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:40527857252b61eacd1d9af500c3337ba8deb8fc298940291486c465c8b46ec0"},
+    {file = "PyYAML-6.0-cp39-cp39-win32.whl", hash = "sha256:b5b9eccad747aabaaffbc6064800670f0c297e52c12754eb1d976c57e4f74dcb"},
+    {file = "PyYAML-6.0-cp39-cp39-win_amd64.whl", hash = "sha256:b3d267842bf12586ba6c734f89d1f5b871df0273157918b0ccefa29deb05c21c"},
+    {file = "PyYAML-6.0.tar.gz", hash = "sha256:68fb519c14306fec9720a2a5b45bc9f0c8d1b9c72adf45c37baedfcd949c35a2"},
+]
+
+[[package]]
+name = "requests"
+version = "2.28.2"
+description = "Python HTTP for Humans."
+category = "dev"
+optional = false
+python-versions = ">=3.7, <4"
+files = [
+    {file = "requests-2.28.2-py3-none-any.whl", hash = "sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa"},
+    {file = "requests-2.28.2.tar.gz", hash = "sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<1.27"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
 
 [[package]]
 name = "snowballstemmer"
@@ -251,6 +716,175 @@ description = "This package provides 29 stemmers for 28 languages generated from
 category = "dev"
 optional = false
 python-versions = "*"
+files = [
+    {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"},
+    {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
+]
+
+[[package]]
+name = "sphinx"
+version = "6.1.3"
+description = "Python documentation generator"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "Sphinx-6.1.3.tar.gz", hash = "sha256:0dac3b698538ffef41716cf97ba26c1c7788dba73ce6f150c1ff5b4720786dd2"},
+    {file = "sphinx-6.1.3-py3-none-any.whl", hash = "sha256:807d1cb3d6be87eb78a381c3e70ebd8d346b9a25f3753e9947e866b2786865fc"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.0"
+description = "Read the Docs theme for Sphinx"
+category = "dev"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+    {file = "sphinx_rtd_theme-1.2.0-py2.py3-none-any.whl", hash = "sha256:f823f7e71890abe0ac6aaa6013361ea2696fc8d3e1fa798f463e82bdb77eeff2"},
+    {file = "sphinx_rtd_theme-1.2.0.tar.gz", hash = "sha256:a0d8bd1a2ed52e0b338cbe19c4b2eef3c5e7a048769753dac6a9f059c7b641b8"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = {version = ">=2.0.0,<3.0.0 || >3.0.0", markers = "python_version > \"3\""}
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.4"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-applehelp-1.0.4.tar.gz", hash = "sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e"},
+    {file = "sphinxcontrib_applehelp-1.0.4-py3-none-any.whl", hash = "sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.2"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"},
+    {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.1"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+category = "dev"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-htmlhelp-2.0.1.tar.gz", hash = "sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff"},
+    {file = "sphinxcontrib_htmlhelp-2.0.1-py3-none-any.whl", hash = "sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+category = "dev"
+optional = false
+python-versions = ">=2.7"
+files = [
+    {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+    {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+    {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.3"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"},
+    {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.5"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
+category = "dev"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"},
+    {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
 
 [[package]]
 name = "toml"
@@ -259,6 +893,10 @@ description = "Python Library for Tom's Obvious, Minimal Language"
 category = "dev"
 optional = false
 python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
+files = [
+    {file = "toml-0.10.2-py2.py3-none-any.whl", hash = "sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b"},
+    {file = "toml-0.10.2.tar.gz", hash = "sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"},
+]
 
 [[package]]
 name = "tomli"
@@ -267,22 +905,51 @@ description = "A lil' TOML parser"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "tomli-2.0.1-py3-none-any.whl", hash = "sha256:939de3e7a6161af0c887ef91b7d41a53e7c5a1ca976325f429cb46ea9bc30ecc"},
+    {file = "tomli-2.0.1.tar.gz", hash = "sha256:de526c12914f0c550d15924c62d72abc48d6fe7364aa87328337a31007fe8a4f"},
+]
 
 [[package]]
 name = "types-pyyaml"
-version = "6.0.12.1"
+version = "6.0.12.8"
 description = "Typing stubs for PyYAML"
 category = "main"
 optional = false
 python-versions = "*"
+files = [
+    {file = "types-PyYAML-6.0.12.8.tar.gz", hash = "sha256:19304869a89d49af00be681e7b267414df213f4eb89634c4495fa62e8f942b9f"},
+    {file = "types_PyYAML-6.0.12.8-py3-none-any.whl", hash = "sha256:5314a4b2580999b2ea06b2e5f9a7763d860d6e09cdf21c0e9561daa9cbd60178"},
+]
 
 [[package]]
 name = "typing-extensions"
-version = "4.4.0"
+version = "4.5.0"
 description = "Backported and Experimental Type Hints for Python 3.7+"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
+files = [
+    {file = "typing_extensions-4.5.0-py3-none-any.whl", hash = "sha256:fb33085c39dd998ac16d1431ebc293a8b3eedd00fd4a32de0ff79002c19511b4"},
+    {file = "typing_extensions-4.5.0.tar.gz", hash = "sha256:5cb5f4a79139d699607b3ef622a1dedafa84e115ab0024e0d9c044a9479ca7cb"},
+]
+
+[[package]]
+name = "urllib3"
+version = "1.26.15"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+category = "dev"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*"
+files = [
+    {file = "urllib3-1.26.15-py2.py3-none-any.whl", hash = "sha256:aa751d169e23c7479ce47a0cb0da579e3ede798f994f5816a74e4f4500dcea42"},
+    {file = "urllib3-1.26.15.tar.gz", hash = "sha256:8a388717b9476f934a21484e8c8e61875ab60644d29b9b39e11e4b9dc1c6b305"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)", "brotlipy (>=0.6.0)"]
+secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress", "pyOpenSSL (>=0.14)", "urllib3-secure-extra"]
+socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
 
 [[package]]
 name = "warlock"
@@ -291,47 +958,16 @@ description = "Python object model built on JSON schema and JSON patch."
 category = "main"
 optional = false
 python-versions = ">=3.7,<4.0"
+files = [
+    {file = "warlock-2.0.1-py3-none-any.whl", hash = "sha256:448df959cec31904f686ac8c6b1dfab80f0cdabce3d303be517dd433eeebf012"},
+    {file = "warlock-2.0.1.tar.gz", hash = "sha256:99abbf9525b2a77f2cde896d3a9f18a5b4590db063db65e08207694d2e0137fc"},
+]
 
 [package.dependencies]
 jsonpatch = ">=1,<2"
 jsonschema = ">=4,<5"
 
 [metadata]
-lock-version = "1.1"
+lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "a0f040b07fc6ce4deb0be078b9a88c2a465cb6bccb9e260a67e92c2403e2319f"
-
-[metadata.files]
-attrs = []
-black = []
-click = []
-colorama = []
-isort = []
-jsonpatch = []
-jsonpointer = []
-jsonschema = []
-mccabe = []
-mypy = []
-mypy-extensions = []
-pathspec = []
-pexpect = [
-    {file = "pexpect-4.8.0-py2.py3-none-any.whl", hash = "sha256:0b48a55dcb3c05f3329815901ea4fc1537514d6ba867a152b581d69ae3710937"},
-    {file = "pexpect-4.8.0.tar.gz", hash = "sha256:fc65a43959d153d0114afe13997d439c22823a27cefceb5ff35c2178c6784c0c"},
-]
-platformdirs = [
-    {file = "platformdirs-2.5.2-py3-none-any.whl", hash = "sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788"},
-    {file = "platformdirs-2.5.2.tar.gz", hash = "sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"},
-]
-ptyprocess = []
-pycodestyle = []
-pydocstyle = []
-pyflakes = []
-pylama = []
-pyrsistent = []
-pyyaml = []
-snowballstemmer = []
-toml = []
-tomli = []
-types-pyyaml = []
-typing-extensions = []
-warlock = []
+content-hash = "b3f428e987713d7875434c4b43cadadcb7d77dd3d62fd6855fb8e77ec946f082"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index a136c91e5e..c0fe323272 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -22,6 +22,13 @@ pylama = "^8.4.1"
 pyflakes = "2.5.0"
 toml = "^0.10.2"
 
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+Sphinx = "^6.1.3"
+sphinx-rtd-theme = "^1.2.0"
+
 [tool.poetry.scripts]
 dts = "main:main"
 
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v3 3/4] dts: add doc generation
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
  2023-05-11  9:14     ` [RFC PATCH v3 1/4] dts: code adjustments for sphinx Juraj Linkeš
  2023-05-11  9:14     ` [RFC PATCH v3 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2023-05-11  9:14     ` Juraj Linkeš
  2023-05-11  9:14     ` [RFC PATCH v3 4/4] dts: format docstrigs to google format Juraj Linkeš
                       ` (2 subsequent siblings)
  5 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-11  9:14 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

The tool used to generate developer docs is Sphinx, which is already
used in DPDK. The configuration is kept the same to preserve the style.

Sphinx generates the documentation from Python docstrings. The docstring
format most suitable for DTS seems to be the Google format [0], which
requires the sphinx.ext.napoleon extension.

There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
  code.
* The same Python packages as DTS, for the same reason.

[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
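
As a quick illustration, a docstring in the Google format that
sphinx.ext.napoleon can parse might look like this (a hypothetical
function, not taken from the DTS code):

```python
# A minimal sketch of a Google-format docstring. The function name and
# parameters are illustrative only; napoleon turns the "Args:" and
# "Returns:" sections into structured API documentation.
def attach_port(port_id: int, driver: str = "vfio-pci") -> bool:
    """Bind a port to a DPDK-compatible driver.

    Args:
        port_id: The numeric identifier of the port.
        driver: The kernel driver to bind the port to.

    Returns:
        True if the bind succeeded, False otherwise.
    """
    return driver.startswith("vfio")
```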

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 buildtools/call-sphinx-build.py | 29 ++++++++++++-------
 doc/api/meson.build             |  1 +
 doc/guides/conf.py              | 22 ++++++++++----
 doc/guides/meson.build          |  1 +
 doc/guides/tools/dts.rst        | 29 +++++++++++++++++++
 dts/doc/doc-index.rst           | 20 +++++++++++++
 dts/doc/meson.build             | 51 +++++++++++++++++++++++++++++++++
 dts/meson.build                 | 16 +++++++++++
 meson.build                     |  1 +
 meson_options.txt               |  2 ++
 10 files changed, 157 insertions(+), 15 deletions(-)
 create mode 100644 dts/doc/doc-index.rst
 create mode 100644 dts/doc/meson.build
 create mode 100644 dts/meson.build

diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..c2f3acfb1d 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,46 @@
 # Copyright(c) 2019 Intel Corporation
 #
 
+import argparse
 import sys
 import os
 from os.path import join
 from subprocess import run, PIPE, STDOUT
 from packaging.version import Version
 
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default='.')
+args, extra_args = parser.parse_known_args()
 
 # set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+os.environ['DTS_ROOT'] = args.dts_root
 
 # for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
           stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
 if Version(ver) >= Version('1.7'):
     sphinx_cmd += ['-j', 'auto']
 
 # find all the files sphinx will process so we can write them as dependencies
 srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
     srcfiles.extend([join(root, f) for f in files])
 
 # run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
-    process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
-                  stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+    process = run(
+        sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+        stdout=out
+    )
 
 # create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
     d.write('html: ' + ' '.join(srcfiles) + '\n')
 
 sys.exit(process.returncode)
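
The argparse change above keeps the original positional interface while
letting unrecognized flags flow through to sphinx via parse_known_args.
A minimal sketch of that parsing behavior, with illustrative argument
values:

```python
import argparse

# Known options are consumed into the namespace; anything argparse does
# not recognize (e.g. extra sphinx flags such as -W or -a) is returned
# separately and can be forwarded to the sphinx command line.
parser = argparse.ArgumentParser()
parser.add_argument('sphinx')
parser.add_argument('version')
parser.add_argument('src')
parser.add_argument('dst')
parser.add_argument('--dts-root', default='.')

args, extra = parser.parse_known_args(
    ['sphinx-build', '23.07', 'doc/src', 'doc/dst',
     '--dts-root', 'dts', '-W', '-a'])
```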
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 2876a78a7e..1f0c725a94 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
 
+doc_api_build_dir = meson.current_build_dir()
 doxygen = find_program('doxygen', required: get_option('enable_docs'))
 
 if not doxygen.found()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index a55ce38800..3f11cbc8fc 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
 from sphinx import __version__ as sphinx_version
 from os import listdir
 from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
 from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
 
 import configparser
 
@@ -24,6 +23,19 @@
           file=stderr)
     pass
 
+extensions = ['sphinx.ext.napoleon']
+
+# Python docstring options
+autodoc_member_order = 'bysource'
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_use_ivar = True
+napoleon_use_rtype = False
+add_module_names = False
+toc_object_entries_show_parents = 'hide'
+
 stop_on_error = ('-W' in argv)
 
 project = 'Data Plane Development Kit'
@@ -35,8 +47,8 @@
 html_show_copyright = False
 highlight_language = 'none'
 
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+path.append(environ.get('DTS_ROOT'))
+version = environ.setdefault('DPDK_VERSION', "None")
 
 master_doc = 'index'
 
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Intel Corporation
 
+doc_guides_source_dir = meson.current_source_dir()
 sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
 
 if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index ebd6dceb6a..a547da2017 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -282,3 +282,32 @@ There are three tools used in DTS to help with code checking, style and formatti
 These three tools are all used in ``devtools/dts-check-format.sh``,
 the DTS code check and format script.
 Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+.. code-block:: console
+
+   poetry install --with docs
+   poetry shell
+
+
+Build commands
+~~~~~~~~~~~~~~
+
+The documentation is built using the standard DPDK build system.
+
+After entering Poetry's shell, build the documentation with:
+
+.. code-block:: console
+
+   ninja -C build dts/doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. Note::
+
+   Make sure to fix any Sphinx warnings when adding or updating docstrings.
diff --git a/dts/doc/doc-index.rst b/dts/doc/doc-index.rst
new file mode 100644
index 0000000000..10151c6851
--- /dev/null
+++ b/dts/doc/doc-index.rst
@@ -0,0 +1,20 @@
+.. DPDK Test Suite documentation master file, created by
+   sphinx-quickstart on Tue Mar 14 12:23:52 2023.
+   You can adapt this file completely to your liking, but it should at least
+   contain the root `toctree` directive.
+
+Welcome to DPDK Test Suite's documentation!
+===========================================
+
+.. toctree::
+   :maxdepth: 4
+   :caption: Contents:
+
+   modules
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..8c1416296b
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,51 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build', required: get_option('enable_dts_docs'))
+sphinx_apidoc = find_program('sphinx-apidoc', required: get_option('enable_dts_docs'))
+
+if not sphinx.found() or not sphinx_apidoc.found()
+    subdir_done()
+endif
+
+dts_api_framework_dir = join_paths(dts_dir, 'framework')
+dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+dts_api_src = custom_target('dts_api_src',
+        output: 'modules.rst',
+        command: ['SPHINX_APIDOC_OPTIONS=members,show-inheritance',
+            sphinx_apidoc, '--append-syspath', '--force',
+            '--module-first', '--separate',
+            '--doc-project', 'DTS', '-V', meson.project_version(),
+            '-o', dts_api_build_dir,
+            dts_api_framework_dir],
+        build_by_default: get_option('enable_dts_docs'))
+doc_targets += dts_api_src
+doc_target_names += 'DTS_API_sphinx_sources'
+
+cp = find_program('cp', required: get_option('enable_dts_docs'))
+cp_index = custom_target('cp_index',
+        input: 'doc-index.rst',
+        output: 'index.rst',
+        depends: dts_api_src,
+        command: [cp, '@INPUT@', join_paths(dts_api_build_dir, 'index.rst')],
+        build_by_default: get_option('enable_dts_docs'))
+doc_targets += cp_index
+doc_target_names += 'DTS_API_sphinx_index'
+
+extra_sphinx_args = ['-a', '-c', doc_guides_source_dir]
+if get_option('werror')
+    extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
+dts_api_html = custom_target('dts_api_html',
+        output: 'html',
+        depends: cp_index,
+        command: ['DTS_ROOT=@0@'.format(dts_dir),
+            sphinx_wrapper, sphinx, meson.project_version(),
+            dts_api_build_dir, dts_api_build_dir,
+            '--dts-root', dts_dir, extra_sphinx_args],
+        build_by_default: get_option('enable_dts_docs'),
+        install: get_option('enable_dts_docs'),
+        install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..17bda07636
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+    message = 'No docs targets found'
+else
+    message = 'Built docs:'
+endif
+run_target('dts/doc', command: [echo, message, doc_target_names],
+    depends: doc_targets)
diff --git a/meson.build b/meson.build
index f91d652bc5..7820f334bb 100644
--- a/meson.build
+++ b/meson.build
@@ -84,6 +84,7 @@ subdir('app')
 
 # build docs
 subdir('doc')
+subdir('dts')
 
 # build any examples explicitly requested - useful for developers - and
 # install any example code into the appropriate install path
diff --git a/meson_options.txt b/meson_options.txt
index 82c8297065..267f1b3ef7 100644
--- a/meson_options.txt
+++ b/meson_options.txt
@@ -16,6 +16,8 @@ option('drivers_install_subdir', type: 'string', value: 'dpdk/pmds-<VERSION>', d
        'Subdirectory of libdir where to install PMDs. Defaults to using a versioned subdirectory.')
 option('enable_docs', type: 'boolean', value: false, description:
        'build documentation')
+option('enable_dts_docs', type: 'boolean', value: false, description:
+       'Build DTS API documentation.')
 option('enable_apps', type: 'string', value: '', description:
        'Comma-separated list of apps to build. If unspecified, build all apps.')
 option('enable_drivers', type: 'string', value: '', description:
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v3 4/4] dts: format docstrings to google format
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
                       ` (2 preceding siblings ...)
  2023-05-11  9:14     ` [RFC PATCH v3 3/4] dts: add doc generation Juraj Linkeš
@ 2023-05-11  9:14     ` Juraj Linkeš
  2023-06-21 18:27       ` Jeremy Spewock
  2023-05-17 16:56     ` [RFC PATCH v3 0/4] dts: add dts api docs Bruce Richardson
  2023-08-31 10:04     ` [RFC PATCH v4 " Juraj Linkeš
  5 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-11  9:14 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, jspewock, probb
  Cc: dev, Juraj Linkeš

WIP: only one module is reformatted to serve as a demonstration.

The Google format is documented here [0].

[0]: https://google.github.io/styleguide/pyguide.html

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/testbed_model/node.py | 152 +++++++++++++++++++---------
 1 file changed, 103 insertions(+), 49 deletions(-)

diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 90467981c3..ad8ef442af 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+There's a base class, Node, that's supposed to be extended by other classes
+with functionality specific to that node type.
+The only part that can be used standalone is the Node.skip_setup static method,
+which is a decorator used to skip method execution
+if skip_setup is passed by the user on the cmdline or in an env variable.
 """
 
 from typing import Any, Callable
@@ -26,10 +31,25 @@
 
 
 class Node(object):
-    """
-    Basic class for node management. This class implements methods that
-    manage a node, such as information gathering (of CPU/PCI/NIC) and
-    environment setup.
+    """The base class for node management.
+
+    It shouldn't be instantiated, but rather extended.
+    It implements common methods to manage any node:
+
+       * connection to the node
+       * information gathering of CPU
+       * hugepages setup
+
+    Arguments:
+        node_config: The config from the input configuration file.
+
+    Attributes:
+        main_session: The primary OS-agnostic remote session used
+            to communicate with the node.
+        config: The configuration used to create the node.
+        name: The name of the node.
+        lcores: The list of logical cores that DTS can use on the node.
+            It's derived from logical cores present on the node and user configuration.
     """
 
     main_session: OSSession
@@ -56,65 +76,89 @@ def __init__(self, node_config: NodeConfiguration):
         self._logger.info(f"Created node: {self.name}")
 
     def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        Perform the execution setup that will be done for each execution
-        this node is part of.
+        """Execution setup steps.
+
+        Configure hugepages and call self._set_up_execution where
+        the rest of the configuration steps (if any) are implemented.
+
+        Args:
+            execution_config: The execution configuration according to which
+                the setup steps will be taken.
         """
         self._setup_hugepages()
         self._set_up_execution(execution_config)
 
     def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution setup steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution setup steps.
         """
 
     def tear_down_execution(self) -> None:
-        """
-        Perform the execution teardown that will be done after each execution
-        this node is part of concludes.
+        """Execution teardown steps.
+
+        There are currently no execution teardown steps
+        common to all DTS node types.
         """
         self._tear_down_execution()
 
     def _tear_down_execution(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution teardown steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution teardown steps.
         """
 
     def set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        Perform the build target setup that will be done for each build target
-        tested on this node.
+        """Build target setup steps.
+
+        There are currently no build target setup steps
+        common to all DTS node types.
+
+        Args:
+            build_target_config: The build target configuration according to which
+                the setup steps will be taken.
         """
         self._set_up_build_target(build_target_config)
 
     def _set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target setup steps for derived classes.
+
+        Derived classes should optionally overwrite this
+        if they want to add additional build target setup steps.
         """
 
     def tear_down_build_target(self) -> None:
-        """
-        Perform the build target teardown that will be done after each build target
-        tested on this node.
+        """Build target teardown steps.
+
+        There are currently no build target teardown steps
+        common to all DTS node types.
         """
         self._tear_down_build_target()
 
     def _tear_down_build_target(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target teardown steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional build target teardown steps.
         """
 
     def create_session(self, name: str) -> OSSession:
-        """
-        Create and return a new OSSession tailored to the remote OS.
+        """Create and return a new OS-agnostic remote session.
+
+        The returned session won't be used by the object creating it.
+        It will be cleaned up automatically.
+
+        Args:
+            name: The name of the session.
+
+        Returns:
+            A new OS-agnostic remote session.
         """
         session_name = f"{self.name} {name}"
         connection = create_session(
@@ -130,14 +174,24 @@ def filter_lcores(
         filter_specifier: LogicalCoreCount | LogicalCoreList,
         ascending: bool = True,
     ) -> list[LogicalCore]:
-        """
-        Filter the LogicalCores found on the Node according to
-        a LogicalCoreCount or a LogicalCoreList.
+        """Filter the node's logical cores that DTS can use.
 
-        If ascending is True, use cores with the lowest numerical id first
-        and continue in ascending order. If False, start with the highest
-        id and continue in descending order. This ordering affects which
-        sockets to consider first as well.
+        Logical cores that DTS can use are ones that are present on the node,
+        but filtered according to user config.
+        The filter_specifier will filter cores from those logical cores.
+
+        Args:
+            filter_specifier: Two different filters can be used, one that specifies
+                the number of logical cores per core, cores per socket and
+                the number of sockets,
+                the other that specifies a logical core list.
+            ascending: If True, use cores with the lowest numerical id first
+                and continue in ascending order. If False, start with the highest
+                id and continue in descending order. This ordering affects which
+                sockets to consider first as well.
+
+        Returns:
+            A list of logical cores.
         """
         self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
         return lcore_filter(
@@ -147,17 +201,14 @@ def filter_lcores(
         ).filter()
 
     def _get_remote_cpus(self) -> None:
-        """
-        Scan CPUs in the remote OS and store a list of LogicalCores.
-        """
+        """Scan CPUs in the remote OS and store a list of LogicalCores."""
         self._logger.info("Getting CPU information.")
         self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
 
     def _setup_hugepages(self):
-        """
-        Setup hugepages on the Node. Different architectures can supply different
-        amounts of memory for hugepages and numa-based hugepage allocation may need
-        to be considered.
+        """Setup hugepages on the Node.
+
+        Configure the hugepages only if they're specified in user configuration.
         """
         if self.config.hugepages:
             self.main_session.setup_hugepages(
@@ -165,9 +216,7 @@ def _setup_hugepages(self):
             )
 
     def close(self) -> None:
-        """
-        Close all connections and free other resources.
-        """
+        """Close all connections and free other resources."""
         if self.main_session:
             self.main_session.close()
         for session in self._other_sessions:
@@ -176,6 +225,11 @@ def close(self) -> None:
 
     @staticmethod
     def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+        """A decorator that skips the decorated function.
+
+        When used, the decorator executes an empty lambda function
+        instead of the decorated function.
+        """
         if SETTINGS.skip_setup:
             return lambda *args: None
         else:
-- 
2.30.2


^ permalink raw reply	[flat|nested] 393+ messages in thread
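
The skip_setup decorator documented in the patch above replaces the decorated function with a no-op when setup is skipped. A minimal standalone sketch of that pattern (the SKIP_SETUP flag and function names below are illustrative stand-ins, not DTS's actual API):

```python
from typing import Any, Callable

# Illustrative stand-in for DTS's SETTINGS.skip_setup flag.
SKIP_SETUP = True


def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
    """Return a no-op in place of func when setup is skipped.

    The check happens at decoration time, so the decorated function
    is replaced once, when the module defining it is imported.
    """
    if SKIP_SETUP:
        # The decorated function is never called; an empty lambda runs instead.
        return lambda *args, **kwargs: None
    return func


@skip_setup
def set_up_testbed(name: str) -> str:
    return f"configured {name}"


print(set_up_testbed("node0"))  # None when SKIP_SETUP is True
```

Note that because the flag is read at decoration time, flipping it after import has no effect on already-decorated functions, which matches the "executes an empty lambda function instead of the decorated function" wording in the docstring.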

* Re: [RFC PATCH v3 0/4] dts: add dts api docs
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
                       ` (3 preceding siblings ...)
  2023-05-11  9:14     ` [RFC PATCH v3 4/4] dts: format docstrings to google format Juraj Linkeš
@ 2023-05-17 16:56     ` Bruce Richardson
  2023-05-22  9:17       ` Juraj Linkeš
  2023-08-31 10:04     ` [RFC PATCH v4 " Juraj Linkeš
  5 siblings, 1 reply; 393+ messages in thread
From: Bruce Richardson @ 2023-05-17 16:56 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Thu, May 11, 2023 at 11:14:04AM +0200, Juraj Linkeš wrote:
> Augment the meson build system with dts api generation. The api docs are
> generated from Python docstrings in DTS using Sphinx. The format of
> choice is the Google format [0].
> 
> The guides html sphinx configuration is used to preserve the same style.
> 
> The build requires the same Python version and dependencies as DTS,
> because Sphinx imports the Python modules. Dependencies are installed
> using Poetry from the dts directory:
> 
> poetry install --with docs
> 
> After installing, enter the Poetry shell:
> 
> poetry shell
> 
> And then run the build:
> ninja -C <meson_build_dir> dts/doc
> 
> There's only one properly documented module that serves as a
> demonstration of the style - framework.testbed_model.node. When we agree
> on the docstring format, all docstrings will be reformatted.
> 
> [0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
> 
> Juraj Linkeš (4):
>   dts: code adjustments for sphinx
>   dts: add doc generation dependencies
>   dts: add doc generation
>   dts: format docstrings to google format
> 
Given that building the DTS docs requires a special set of commands to set
things up and then to run the build through poetry, I think you should just
drop the option in meson_options.txt. I think it's better if building the
DTS docs follows the steps that you outline here, and we don't try to integrate
it into the main DPDK build.

With that change:

Series-acked-by: Bruce Richardson <bruce.richardson@intel.com>

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v3 0/4] dts: add dts api docs
  2023-05-17 16:56     ` [RFC PATCH v3 0/4] dts: add dts api docs Bruce Richardson
@ 2023-05-22  9:17       ` Juraj Linkeš
  0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-05-22  9:17 UTC (permalink / raw)
  To: Bruce Richardson
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, wathsala.vithanage,
	jspewock, probb, dev

On Wed, May 17, 2023 at 6:57 PM Bruce Richardson
<bruce.richardson@intel.com> wrote:
>
> On Thu, May 11, 2023 at 11:14:04AM +0200, Juraj Linkeš wrote:
> > Augment the meson build system with dts api generation. The api docs are
> > generated from Python docstrings in DTS using Sphinx. The format of
> > choice is the Google format [0].
> >
> > The guides html sphinx configuration is used to preserve the same style.
> >
> > The build requires the same Python version and dependencies as DTS,
> > because Sphinx imports the Python modules. Dependencies are installed
> > using Poetry from the dts directory:
> >
> > poetry install --with docs
> >
> > After installing, enter the Poetry shell:
> >
> > poetry shell
> >
> > And then run the build:
> > ninja -C <meson_build_dir> dts/doc
> >
> > There's only one properly documented module that serves as a
> > demonstration of the style - framework.testbed_model.node. When we agree
> > on the docstring format, all docstrings will be reformatted.
> >
> > [0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
> >
> > Juraj Linkeš (4):
> >   dts: code adjustments for sphinx
> >   dts: add doc generation dependencies
> >   dts: add doc generation
> >   dts: format docstrings to google format
> >
> Given that building the DTS docs requires a special set of commands to set
> things up and then to run the build through poetry, I think you should just
> drop the option in meson_options.txt. I think it's better if building the
> DTS docs follows the steps that you outline here, and we don't try to integrate
> it into the main DPDK build.
>

That makes a lot of sense. I'll make the change.

> With that change:
>
> Series-acked-by: Bruce Richardson <bruce.richardson@intel.com>

Thanks.

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v3 4/4] dts: format docstrings to google format
  2023-05-11  9:14     ` [RFC PATCH v3 4/4] dts: format docstrings to google format Juraj Linkeš
@ 2023-06-21 18:27       ` Jeremy Spewock
  0 siblings, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2023-06-21 18:27 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	wathsala.vithanage, probb, dev

Acked-by: Jeremy Spewock <jspewock@iol.unh.edu>

^ permalink raw reply	[flat|nested] 393+ messages in thread
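
For readers unfamiliar with the Google docstring style acked in this sub-thread, here is a minimal illustrative function (not taken from DTS) showing the Args, Returns, and Raises sections that Sphinx's napoleon extension parses:

```python
def filter_even(numbers: list[int], ascending: bool = True) -> list[int]:
    """Return the even numbers from the input list.

    A minimal example of a Google-format docstring with the
    sections used throughout this series.

    Args:
        numbers: The integers to filter.
        ascending: If True, sort the result in ascending order;
            otherwise sort in descending order.

    Returns:
        The even numbers, sorted according to ascending.

    Raises:
        ValueError: If numbers is empty.
    """
    if not numbers:
        raise ValueError("numbers must not be empty")
    evens = [n for n in numbers if n % 2 == 0]
    return sorted(evens, reverse=not ascending)


print(filter_even([5, 2, 8, 3]))  # [2, 8]
```

Sections are introduced by a header line ending in a colon, with each entry indented underneath; continuation lines of a single entry get one extra level of indentation, as seen in the ascending description above.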

* [RFC PATCH v4 0/4] dts: add dts api docs
  2023-05-11  9:14   ` [RFC PATCH v3 " Juraj Linkeš
                       ` (4 preceding siblings ...)
  2023-05-17 16:56     ` [RFC PATCH v3 0/4] dts: add dts api docs Bruce Richardson
@ 2023-08-31 10:04     ` Juraj Linkeš
  2023-08-31 10:04       ` [RFC PATCH v4 1/4] dts: code adjustments for sphinx Juraj Linkeš
                         ` (4 more replies)
  5 siblings, 5 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-08-31 10:04 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev, Juraj Linkeš

Augment the meson build system with dts api generation. The api docs are
generated from Python docstrings in DTS using Sphinx. The format of
choice is the Google format [0].

The guides html sphinx configuration is used to preserve the same style,
except the sidebar is configured to allow unlimited depth and better
collapsing.

The build requires the same Python version and dependencies as DTS,
because Sphinx imports the Python modules. The modules are imported
individually, requiring code refactoring. Dependencies are installed
using Poetry from the dts directory:

poetry install --with docs

After installing, enter the Poetry shell:

poetry shell

And then run the build:
ninja -C <meson_build_dir> dts/doc

There's only one properly documented module that serves as a
demonstration of the style - framework.testbed_model.node. When we agree
on the docstring format, all docstrings will be reformatted.

[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes

Juraj Linkeš (4):
  dts: code adjustments for sphinx
  dts: add doc generation dependencies
  dts: add doc generation
  dts: format docstrings to google format

 buildtools/call-sphinx-build.py               |  29 +-
 doc/api/meson.build                           |   1 +
 doc/guides/conf.py                            |  32 +-
 doc/guides/meson.build                        |   1 +
 doc/guides/tools/dts.rst                      |  29 ++
 dts/doc/doc-index.rst                         |  17 +
 dts/doc/meson.build                           |  50 ++
 dts/framework/config/__init__.py              |   3 -
 dts/framework/dts.py                          |  34 +-
 dts/framework/remote_session/__init__.py      |  41 +-
 .../interactive_remote_session.py             |   0
 .../{remote => }/interactive_shell.py         |   0
 .../{remote => }/python_shell.py              |   0
 .../remote_session/remote/__init__.py         |  27 --
 .../{remote => }/remote_session.py            |   0
 .../{remote => }/ssh_session.py               |   0
 .../{remote => }/testpmd_shell.py             |   0
 dts/framework/settings.py                     |  92 ++--
 dts/framework/test_suite.py                   |   3 +-
 dts/framework/testbed_model/__init__.py       |  12 +-
 dts/framework/testbed_model/common.py         |  29 ++
 dts/framework/testbed_model/{hw => }/cpu.py   |  13 +
 dts/framework/testbed_model/hw/__init__.py    |  27 --
 .../linux_session.py                          |   4 +-
 dts/framework/testbed_model/node.py           | 193 +++++---
 .../os_session.py                             |  14 +-
 dts/framework/testbed_model/{hw => }/port.py  |   0
 .../posix_session.py                          |   2 +-
 dts/framework/testbed_model/sut_node.py       |   8 +-
 dts/framework/testbed_model/tg_node.py        |  30 +-
 .../traffic_generator/__init__.py             |  24 +
 .../capturing_traffic_generator.py            |   2 +-
 .../{ => traffic_generator}/scapy.py          |  17 +-
 .../traffic_generator.py                      |  16 +-
 .../testbed_model/{hw => }/virtual_device.py  |   0
 dts/framework/utils.py                        |  53 +--
 dts/main.py                                   |   3 +-
 dts/meson.build                               |  16 +
 dts/poetry.lock                               | 447 +++++++++++++++++-
 dts/pyproject.toml                            |   7 +
 meson.build                                   |   1 +
 41 files changed, 961 insertions(+), 316 deletions(-)
 create mode 100644 dts/doc/doc-index.rst
 create mode 100644 dts/doc/meson.build
 rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
 rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
 create mode 100644 dts/framework/testbed_model/common.py
 rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/{remote_session => testbed_model}/linux_session.py (98%)
 rename dts/framework/{remote_session => testbed_model}/os_session.py (97%)
 rename dts/framework/testbed_model/{hw => }/port.py (100%)
 rename dts/framework/{remote_session => testbed_model}/posix_session.py (99%)
 create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
 rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (99%)
 rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (96%)
 rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
 create mode 100644 dts/meson.build

-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v4 1/4] dts: code adjustments for sphinx
  2023-08-31 10:04     ` [RFC PATCH v4 " Juraj Linkeš
@ 2023-08-31 10:04       ` Juraj Linkeš
  2023-10-22 14:30         ` Yoan Picchi
  2023-08-31 10:04       ` [RFC PATCH v4 2/4] dts: add doc generation dependencies Juraj Linkeš
                         ` (3 subsequent siblings)
  4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-08-31 10:04 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev, Juraj Linkeš

sphinx-build only imports the Python modules when building the
documentation; it doesn't run DTS. This requires changes that make the
code importable without running it. This means:
* argument parsing is now properly guarded by the
  if __name__ == '__main__' block,
* the logger used by the DTS runner received the same treatment so that
  it doesn't create unnecessary log files on import,
* DTS uses the parsed arguments to construct an object holding global
  variables, so the defaults for those variables had to be moved out of
  argument parsing,
* importing the remote_session module from framework resulted in
  circular imports (one module importing another back), which is fixed
  by more granular imports.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/config/__init__.py              |  3 -
 dts/framework/dts.py                          | 34 ++++++-
 dts/framework/remote_session/__init__.py      | 41 ++++-----
 .../interactive_remote_session.py             |  0
 .../{remote => }/interactive_shell.py         |  0
 .../{remote => }/python_shell.py              |  0
 .../remote_session/remote/__init__.py         | 27 ------
 .../{remote => }/remote_session.py            |  0
 .../{remote => }/ssh_session.py               |  0
 .../{remote => }/testpmd_shell.py             |  0
 dts/framework/settings.py                     | 92 +++++++++++--------
 dts/framework/test_suite.py                   |  3 +-
 dts/framework/testbed_model/__init__.py       | 12 +--
 dts/framework/testbed_model/common.py         | 29 ++++++
 dts/framework/testbed_model/{hw => }/cpu.py   | 13 +++
 dts/framework/testbed_model/hw/__init__.py    | 27 ------
 .../linux_session.py                          |  4 +-
 dts/framework/testbed_model/node.py           | 22 ++++-
 .../os_session.py                             | 14 +--
 dts/framework/testbed_model/{hw => }/port.py  |  0
 .../posix_session.py                          |  2 +-
 dts/framework/testbed_model/sut_node.py       |  8 +-
 dts/framework/testbed_model/tg_node.py        | 30 +-----
 .../traffic_generator/__init__.py             | 24 +++++
 .../capturing_traffic_generator.py            |  2 +-
 .../{ => traffic_generator}/scapy.py          | 17 +---
 .../traffic_generator.py                      | 16 +++-
 .../testbed_model/{hw => }/virtual_device.py  |  0
 dts/framework/utils.py                        | 53 +----------
 dts/main.py                                   |  3 +-
 30 files changed, 229 insertions(+), 247 deletions(-)
 rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
 rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
 rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
 create mode 100644 dts/framework/testbed_model/common.py
 rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/{remote_session => testbed_model}/linux_session.py (98%)
 rename dts/framework/{remote_session => testbed_model}/os_session.py (97%)
 rename dts/framework/testbed_model/{hw => }/port.py (100%)
 rename dts/framework/{remote_session => testbed_model}/posix_session.py (99%)
 create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
 rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (99%)
 rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (96%)
 rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index cb7e00ba34..5de8b54bcf 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -324,6 +324,3 @@ def load_config() -> Configuration:
     config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
     config_obj: Configuration = Configuration.from_dict(dict(config))
     return config_obj
-
-
-CONFIGURATION = load_config()
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index f773f0c38d..925a212210 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,22 +3,23 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
+import logging
 import sys
 
 from .config import (
-    CONFIGURATION,
     BuildTargetConfiguration,
     ExecutionConfiguration,
     TestSuiteConfig,
+    load_config,
 )
 from .exception import BlockingTestSuiteError
 from .logger import DTSLOG, getLogger
 from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
 from .test_suite import get_test_suites
 from .testbed_model import SutNode, TGNode
-from .utils import check_dts_python_version
 
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG | logging.Logger = logging.getLogger("DTSRunner")
 result: DTSResult = DTSResult(dts_logger)
 
 
@@ -30,14 +31,18 @@ def run_all() -> None:
     global dts_logger
     global result
 
+    # create a regular DTS logger and create a new result with it
+    dts_logger = getLogger("DTSRunner")
+    result = DTSResult(dts_logger)
+
     # check the python version of the server that run dts
-    check_dts_python_version()
+    _check_dts_python_version()
 
     sut_nodes: dict[str, SutNode] = {}
     tg_nodes: dict[str, TGNode] = {}
     try:
         # for all Execution sections
-        for execution in CONFIGURATION.executions:
+        for execution in load_config().executions:
             sut_node = sut_nodes.get(execution.system_under_test_node.name)
             tg_node = tg_nodes.get(execution.traffic_generator_node.name)
 
@@ -82,6 +87,25 @@ def run_all() -> None:
     _exit_dts()
 
 
+def _check_dts_python_version() -> None:
+    def RED(text: str) -> str:
+        return f"\u001B[31;1m{str(text)}\u001B[0m"
+
+    if sys.version_info.major < 3 or (
+        sys.version_info.major == 3 and sys.version_info.minor < 10
+    ):
+        print(
+            RED(
+                (
+                    "WARNING: DTS execution node's Python version is lower than "
+                    "Python 3.10; this is deprecated and will not work in future releases."
+                )
+            ),
+            file=sys.stderr,
+        )
+        print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+
+
 def _run_execution(
     sut_node: SutNode,
     tg_node: TGNode,
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 00b6d1f03a..5e7ddb2b05 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -12,29 +12,24 @@
 
 # pylama:ignore=W0611
 
-from framework.config import OS, NodeConfiguration
-from framework.exception import ConfigurationError
+from framework.config import NodeConfiguration
 from framework.logger import DTSLOG
 
-from .linux_session import LinuxSession
-from .os_session import InteractiveShellType, OSSession
-from .remote import (
-    CommandResult,
-    InteractiveRemoteSession,
-    InteractiveShell,
-    PythonShell,
-    RemoteSession,
-    SSHSession,
-    TestPmdDevice,
-    TestPmdShell,
-)
-
-
-def create_session(
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
+from .python_shell import PythonShell
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
+
+
+def create_remote_session(
     node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> OSSession:
-    match node_config.os:
-        case OS.linux:
-            return LinuxSession(node_config, name, logger)
-        case _:
-            raise ConfigurationError(f"Unsupported OS {node_config.os}")
+) -> RemoteSession:
+    return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+    node_config: NodeConfiguration, logger: DTSLOG
+) -> InteractiveRemoteSession:
+    return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_remote_session.py
rename to dts/framework/remote_session/interactive_remote_session.py
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_shell.py
rename to dts/framework/remote_session/interactive_shell.py
diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/python_shell.py
rename to dts/framework/remote_session/python_shell.py
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 06403691a5..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-# Copyright(c) 2023 University of New Hampshire
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .interactive_remote_session import InteractiveRemoteSession
-from .interactive_shell import InteractiveShell
-from .python_shell import PythonShell
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-from .testpmd_shell import TestPmdDevice, TestPmdShell
-
-
-def create_remote_session(
-    node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
-    return SSHSession(node_config, name, logger)
-
-
-def create_interactive_session(
-    node_config: NodeConfiguration, logger: DTSLOG
-) -> InteractiveRemoteSession:
-    return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/testpmd_shell.py
rename to dts/framework/remote_session/testpmd_shell.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index cfa39d011b..bf86861efb 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
 import argparse
 import os
 from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Any, TypeVar
 
@@ -22,7 +22,7 @@ def __init__(
             option_strings: Sequence[str],
             dest: str,
             nargs: str | int | None = None,
-            const: str | None = None,
+            const: bool | None = None,
             default: str = None,
             type: Callable[[str], _T | argparse.FileType | None] = None,
             choices: Iterable[_T] | None = None,
@@ -32,6 +32,12 @@ def __init__(
         ) -> None:
             env_var_value = os.environ.get(env_var)
             default = env_var_value or default
+            if const is not None:
+                nargs = 0
+                default = const if env_var_value else default
+                type = None
+                choices = None
+                metavar = None
             super(_EnvironmentArgument, self).__init__(
                 option_strings,
                 dest,
@@ -52,22 +58,28 @@ def __call__(
             values: Any,
             option_string: str = None,
         ) -> None:
-            setattr(namespace, self.dest, values)
+            if self.const is not None:
+                setattr(namespace, self.dest, self.const)
+            else:
+                setattr(namespace, self.dest, values)
 
     return _EnvironmentArgument
 
 
-@dataclass(slots=True, frozen=True)
+@dataclass(slots=True)
 class _Settings:
-    config_file_path: str
-    output_dir: str
-    timeout: float
-    verbose: bool
-    skip_setup: bool
-    dpdk_tarball_path: Path
-    compile_timeout: float
-    test_cases: list
-    re_run: int
+    config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+    output_dir: str = "output"
+    timeout: float = 15
+    verbose: bool = False
+    skip_setup: bool = False
+    dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+    compile_timeout: float = 1200
+    test_cases: list[str] = field(default_factory=list)
+    re_run: int = 0
+
+
+SETTINGS: _Settings = _Settings()
 
 
 def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--config-file",
         action=_env_arg("DTS_CFG_FILE"),
-        default="conf.yaml",
+        default=SETTINGS.config_file_path,
+        type=Path,
         help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
         "and targets.",
     )
@@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--output-dir",
         "--output",
         action=_env_arg("DTS_OUTPUT_DIR"),
-        default="output",
+        default=SETTINGS.output_dir,
         help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
     )
 
@@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-t",
         "--timeout",
         action=_env_arg("DTS_TIMEOUT"),
-        default=15,
+        default=SETTINGS.timeout,
         type=float,
         help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
         "compiling DPDK.",
@@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
         "-v",
         "--verbose",
         action=_env_arg("DTS_VERBOSE"),
-        default="N",
-        help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
+        default=SETTINGS.verbose,
+        const=True,
+        help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
         "to the console.",
     )
 
@@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
         "-s",
         "--skip-setup",
         action=_env_arg("DTS_SKIP_SETUP"),
-        default="N",
-        help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
+        const=True,
+        help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
     )
 
     parser.add_argument(
@@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--snapshot",
         "--git-ref",
         action=_env_arg("DTS_DPDK_TARBALL"),
-        default="dpdk.tar.xz",
+        default=SETTINGS.dpdk_tarball_path,
         type=Path,
         help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
         "tag ID or tree ID to test. To test local changes, first commit them, "
@@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--compile-timeout",
         action=_env_arg("DTS_COMPILE_TIMEOUT"),
-        default=1200,
+        default=SETTINGS.compile_timeout,
         type=float,
         help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
     )
@@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--re-run",
         "--re_run",
         action=_env_arg("DTS_RERUN"),
-        default=0,
+        default=SETTINGS.re_run,
         type=int,
         help="[DTS_RERUN] Re-run each test case the specified amount of times "
         "if a test failure occurs",
@@ -162,23 +176,21 @@ def _get_parser() -> argparse.ArgumentParser:
     return parser
 
 
-def _get_settings() -> _Settings:
+def set_settings() -> None:
     parsed_args = _get_parser().parse_args()
-    return _Settings(
-        config_file_path=parsed_args.config_file,
-        output_dir=parsed_args.output_dir,
-        timeout=parsed_args.timeout,
-        verbose=(parsed_args.verbose == "Y"),
-        skip_setup=(parsed_args.skip_setup == "Y"),
-        dpdk_tarball_path=Path(
-            DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
-        )
+    global SETTINGS
+    SETTINGS.config_file_path = parsed_args.config_file
+    SETTINGS.output_dir = parsed_args.output_dir
+    SETTINGS.timeout = parsed_args.timeout
+    SETTINGS.verbose = parsed_args.verbose
+    SETTINGS.skip_setup = parsed_args.skip_setup
+    SETTINGS.dpdk_tarball_path = (
+        Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
         if not os.path.exists(parsed_args.tarball)
-        else Path(parsed_args.tarball),
-        compile_timeout=parsed_args.compile_timeout,
-        test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
-        re_run=parsed_args.re_run,
+        else Path(parsed_args.tarball)
     )
-
-
-SETTINGS: _Settings = _get_settings()
+    SETTINGS.compile_timeout = parsed_args.compile_timeout
+    SETTINGS.test_cases = (
+        parsed_args.test_cases.split(",") if parsed_args.test_cases else []
+    )
+    SETTINGS.re_run = parsed_args.re_run
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 3b890c0451..b381990d98 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -26,8 +26,7 @@
 from .logger import DTSLOG, getLogger
 from .settings import SETTINGS
 from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
-from .testbed_model import SutNode, TGNode
-from .testbed_model.hw.port import Port, PortLink
+from .testbed_model import Port, PortLink, SutNode, TGNode
 from .utils import get_packet_summaries
 
 
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 5cbb859e47..8ced05653b 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,15 +9,9 @@
 
 # pylama:ignore=W0611
 
-from .hw import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-    VirtualDevice,
-    lcore_filter,
-)
+from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
 from .node import Node
+from .port import Port, PortLink
 from .sut_node import SutNode
 from .tg_node import TGNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/common.py b/dts/framework/testbed_model/common.py
new file mode 100644
index 0000000000..9222f57847
--- /dev/null
+++ b/dts/framework/testbed_model/common.py
@@ -0,0 +1,29 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+
+class MesonArgs(object):
+    """
+    Aggregate the arguments needed to build DPDK:
+    default_library: Default library type, Meson allows "shared", "static" and "both".
+               Defaults to None, in which case the argument won't be used.
+    Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
+               Do not use -D with them, for example:
+               meson_args = MesonArgs(enable_kmods=True).
+    """
+
+    _default_library: str
+
+    def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
+        self._default_library = (
+            f"--default-library={default_library}" if default_library else ""
+        )
+        self._dpdk_args = " ".join(
+            (
+                f"-D{dpdk_arg_name}={dpdk_arg_value}"
+                for dpdk_arg_name, dpdk_arg_value in dpdk_args.items()
+            )
+        )
+
+    def __str__(self) -> str:
+        return " ".join(f"{self._default_library} {self._dpdk_args}".split())
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/testbed_model/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
             )
 
         return filtered_lcores
+
+
+def lcore_filter(
+    core_list: list[LogicalCore],
+    filter_specifier: LogicalCoreCount | LogicalCoreList,
+    ascending: bool,
+) -> LogicalCoreFilter:
+    if isinstance(filter_specifier, LogicalCoreList):
+        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+    elif isinstance(filter_specifier, LogicalCoreCount):
+        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+    else:
+        raise ValueError(f"Unsupported filter {filter_specifier}")
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
-    core_list: list[LogicalCore],
-    filter_specifier: LogicalCoreCount | LogicalCoreList,
-    ascending: bool,
-) -> LogicalCoreFilter:
-    if isinstance(filter_specifier, LogicalCoreList):
-        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
-    elif isinstance(filter_specifier, LogicalCoreCount):
-        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
-    else:
-        raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
similarity index 98%
rename from dts/framework/remote_session/linux_session.py
rename to dts/framework/testbed_model/linux_session.py
index a3f1a6bf3b..7b60b5353f 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -9,10 +9,10 @@
 from typing_extensions import NotRequired
 
 from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
 from framework.utils import expand_range
 
+from .cpu import LogicalCore
+from .port import Port
 from .posix_session import PosixSession
 
 
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index fc01e0bf8e..23efa79c50 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,23 +12,26 @@
 from typing import Any, Callable, Type, Union
 
 from framework.config import (
+    OS,
     BuildTargetConfiguration,
     ExecutionConfiguration,
     NodeConfiguration,
 )
+from framework.exception import ConfigurationError
 from framework.logger import DTSLOG, getLogger
-from framework.remote_session import InteractiveShellType, OSSession, create_session
 from framework.settings import SETTINGS
 
-from .hw import (
+from .cpu import (
     LogicalCore,
     LogicalCoreCount,
     LogicalCoreList,
     LogicalCoreListFilter,
-    VirtualDevice,
     lcore_filter,
 )
-from .hw.port import Port
+from .linux_session import LinuxSession
+from .os_session import InteractiveShellType, OSSession
+from .port import Port
+from .virtual_device import VirtualDevice
 
 
 class Node(ABC):
@@ -69,6 +72,7 @@ def __init__(self, node_config: NodeConfiguration):
     def _init_ports(self) -> None:
         self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
+
         for port in self.ports:
             self.configure_port_state(port)
 
@@ -249,3 +253,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
             return lambda *args: None
         else:
             return func
+
+
+def create_session(
+    node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> OSSession:
+    match node_config.os:
+        case OS.linux:
+            return LinuxSession(node_config, name, logger)
+        case _:
+            raise ConfigurationError(f"Unsupported OS {node_config.os}")
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
similarity index 97%
rename from dts/framework/remote_session/os_session.py
rename to dts/framework/testbed_model/os_session.py
index 8a709eac1c..19ba9a69d5 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -10,19 +10,19 @@
 
 from framework.config import Architecture, NodeConfiguration, NodeInfo
 from framework.logger import DTSLOG
-from framework.remote_session.remote import InteractiveShell
-from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
-from framework.utils import MesonArgs
-
-from .remote import (
+from framework.remote_session import (
     CommandResult,
     InteractiveRemoteSession,
+    InteractiveShell,
     RemoteSession,
     create_interactive_session,
     create_remote_session,
 )
+from framework.settings import SETTINGS
+
+from .common import MesonArgs
+from .cpu import LogicalCore
+from .port import Port
 
 InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
 
diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
similarity index 100%
rename from dts/framework/testbed_model/hw/port.py
rename to dts/framework/testbed_model/port.py
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
similarity index 99%
rename from dts/framework/remote_session/posix_session.py
rename to dts/framework/testbed_model/posix_session.py
index 5da0516e05..9a95baa353 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -9,8 +9,8 @@
 from framework.config import Architecture, NodeInfo
 from framework.exception import DPDKBuildError, RemoteCommandExecutionError
 from framework.settings import SETTINGS
-from framework.utils import MesonArgs
 
+from .common import MesonArgs
 from .os_session import OSSession
 
 
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 202aebfd06..2b7e104dcd 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,12 +15,14 @@
     NodeInfo,
     SutNodeConfiguration,
 )
-from framework.remote_session import CommandResult, InteractiveShellType, OSSession
+from framework.remote_session import CommandResult
 from framework.settings import SETTINGS
-from framework.utils import MesonArgs
 
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
+from .common import MesonArgs
+from .cpu import LogicalCoreCount, LogicalCoreList
 from .node import Node
+from .os_session import InteractiveShellType, OSSession
+from .virtual_device import VirtualDevice
 
 
 class EalParameters(object):
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 27025cfa31..166eb8430e 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -16,16 +16,11 @@
 
 from scapy.packet import Packet  # type: ignore[import]
 
-from framework.config import (
-    ScapyTrafficGeneratorConfig,
-    TGNodeConfiguration,
-    TrafficGeneratorType,
-)
-from framework.exception import ConfigurationError
-
-from .capturing_traffic_generator import CapturingTrafficGenerator
-from .hw.port import Port
+from framework.config import TGNodeConfiguration
+
 from .node import Node
+from .port import Port
+from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
 
 
 class TGNode(Node):
@@ -80,20 +75,3 @@ def close(self) -> None:
         """Free all resources used by the node"""
         self.traffic_generator.close()
         super(TGNode, self).close()
-
-
-def create_traffic_generator(
-    tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
-) -> CapturingTrafficGenerator:
-    """A factory function for creating traffic generator object from user config."""
-
-    from .scapy import ScapyTrafficGenerator
-
-    match traffic_generator_config.traffic_generator_type:
-        case TrafficGeneratorType.SCAPY:
-            return ScapyTrafficGenerator(tg_node, traffic_generator_config)
-        case _:
-            raise ConfigurationError(
-                "Unknown traffic generator: "
-                f"{traffic_generator_config.traffic_generator_type}"
-            )
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
new file mode 100644
index 0000000000..11bfa1ee0f
--- /dev/null
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -0,0 +1,24 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
+from framework.exception import ConfigurationError
+from framework.testbed_model.node import Node
+
+from .capturing_traffic_generator import CapturingTrafficGenerator
+from .scapy import ScapyTrafficGenerator
+
+
+def create_traffic_generator(
+    tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
+) -> CapturingTrafficGenerator:
+    """A factory function for creating traffic generator object from user config."""
+
+    match traffic_generator_config.traffic_generator_type:
+        case TrafficGeneratorType.SCAPY:
+            return ScapyTrafficGenerator(tg_node, traffic_generator_config)
+        case _:
+            raise ConfigurationError(
+                "Unknown traffic generator: "
+                f"{traffic_generator_config.traffic_generator_type}"
+            )
diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
similarity index 99%
rename from dts/framework/testbed_model/capturing_traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index ab98987f8e..765378fb4a 100644
--- a/dts/framework/testbed_model/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -16,9 +16,9 @@
 from scapy.packet import Packet  # type: ignore[import]
 
 from framework.settings import SETTINGS
+from framework.testbed_model.port import Port
 from framework.utils import get_packet_summaries
 
-from .hw.port import Port
 from .traffic_generator import TrafficGenerator
 
 
diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
similarity index 96%
rename from dts/framework/testbed_model/scapy.py
rename to dts/framework/testbed_model/traffic_generator/scapy.py
index af0d4dbb25..395d7e9bc0 100644
--- a/dts/framework/testbed_model/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -24,16 +24,15 @@
 from scapy.packet import Packet  # type: ignore[import]
 
 from framework.config import OS, ScapyTrafficGeneratorConfig
-from framework.logger import DTSLOG, getLogger
 from framework.remote_session import PythonShell
 from framework.settings import SETTINGS
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
 
 from .capturing_traffic_generator import (
     CapturingTrafficGenerator,
     _get_default_capture_name,
 )
-from .hw.port import Port
-from .tg_node import TGNode
 
 """
 ========= BEGIN RPC FUNCTIONS =========
@@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
     session: PythonShell
     rpc_server_proxy: xmlrpc.client.ServerProxy
     _config: ScapyTrafficGeneratorConfig
-    _tg_node: TGNode
-    _logger: DTSLOG
-
-    def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
-        self._config = config
-        self._tg_node = tg_node
-        self._logger = getLogger(
-            f"{self._tg_node.name} {self._config.traffic_generator_type}"
-        )
+
+    def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+        super().__init__(tg_node, config)
 
         assert (
             self._tg_node.config.os == OS.linux
diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
similarity index 80%
rename from dts/framework/testbed_model/traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 28c35d3ce4..ea7c3963da 100644
--- a/dts/framework/testbed_model/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -12,11 +12,12 @@
 
 from scapy.packet import Packet  # type: ignore[import]
 
-from framework.logger import DTSLOG
+from framework.config import TrafficGeneratorConfig
+from framework.logger import DTSLOG, getLogger
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
 from framework.utils import get_packet_summaries
 
-from .hw.port import Port
-
 
 class TrafficGenerator(ABC):
     """The base traffic generator.
@@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
     Defines the few basic methods that each traffic generator must implement.
     """
 
+    _config: TrafficGeneratorConfig
+    _tg_node: Node
     _logger: DTSLOG
 
+    def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+        self._config = config
+        self._tg_node = tg_node
+        self._logger = getLogger(
+            f"{self._tg_node.name} {self._config.traffic_generator_type}"
+        )
+
     def send_packet(self, packet: Packet, port: Port) -> None:
         """Send a packet and block until it is fully sent.
 
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index d27c2c5b5f..07e2d1c076 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -7,7 +7,6 @@
 import json
 import os
 import subprocess
-import sys
 from enum import Enum
 from pathlib import Path
 from subprocess import SubprocessError
@@ -16,6 +15,8 @@
 
 from .exception import ConfigurationError
 
+REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
+
 
 class StrEnum(Enum):
     @staticmethod
@@ -28,25 +29,6 @@ def __str__(self) -> str:
         return self.name
 
 
-REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
-
-
-def check_dts_python_version() -> None:
-    if sys.version_info.major < 3 or (
-        sys.version_info.major == 3 and sys.version_info.minor < 10
-    ):
-        print(
-            RED(
-                (
-                    "WARNING: DTS execution node's python version is lower than"
-                    "python 3.10, is deprecated and will not work in future releases."
-                )
-            ),
-            file=sys.stderr,
-        )
-        print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
-
-
 def expand_range(range_str: str) -> list[int]:
     """
     Process range string into a list of integers. There are two possible formats:
@@ -77,37 +59,6 @@ def get_packet_summaries(packets: list[Packet]):
     return f"Packet contents: \n{packet_summaries}"
 
 
-def RED(text: str) -> str:
-    return f"\u001B[31;1m{str(text)}\u001B[0m"
-
-
-class MesonArgs(object):
-    """
-    Aggregate the arguments needed to build DPDK:
-    default_library: Default library type, Meson allows "shared", "static" and "both".
-               Defaults to None, in which case the argument won't be used.
-    Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
-               Do not use -D with them, for example:
-               meson_args = MesonArgs(enable_kmods=True).
-    """
-
-    _default_library: str
-
-    def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
-        self._default_library = (
-            f"--default-library={default_library}" if default_library else ""
-        )
-        self._dpdk_args = " ".join(
-            (
-                f"-D{dpdk_arg_name}={dpdk_arg_value}"
-                for dpdk_arg_name, dpdk_arg_value in dpdk_args.items()
-            )
-        )
-
-    def __str__(self) -> str:
-        return " ".join(f"{self._default_library} {self._dpdk_args}".split())
-
-
 class _TarCompressionFormat(StrEnum):
     """Compression formats that tar can use.
 
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..060ff1b19a 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,11 @@
 
 import logging
 
-from framework import dts
+from framework import dts, settings
 
 
 def main() -> None:
+    settings.set_settings()
     dts.run_all()
 
 
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v4 2/4] dts: add doc generation dependencies
  2023-08-31 10:04     ` [RFC PATCH v4 " Juraj Linkeš
  2023-08-31 10:04       ` [RFC PATCH v4 1/4] dts: code adjustments for sphinx Juraj Linkeš
@ 2023-08-31 10:04       ` Juraj Linkeš
  2023-10-27 15:27         ` Yoan Picchi
  2023-08-31 10:04       ` [RFC PATCH v4 3/4] dts: add doc generation Juraj Linkeš
                         ` (2 subsequent siblings)
  4 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-08-31 10:04 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev, Juraj Linkeš

Sphinx imports every Python module when generating documentation from
docstrings, which means all dts dependencies, including the required
Python version, must be satisfied.
Adding Sphinx to the dts dependencies ensures the proper Python version
and dependencies are used when Sphinx is executed.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/poetry.lock    | 447 ++++++++++++++++++++++++++++++++++++++++++++-
 dts/pyproject.toml |   7 +
 2 files changed, 453 insertions(+), 1 deletion(-)

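To illustrate why the full dts environment is needed: autodoc actually
imports each documented module and reads its docstrings via
introspection, roughly like this minimal sketch (not Sphinx's real
code). The import step executes the module's top-level code, so any
unmet dependency makes the build fail.

```python
import importlib
import inspect


def collect_docstrings(module_name: str) -> dict[str, str]:
    """Gather docstrings the way an autodoc-style tool would."""
    # Importing the module runs its top-level code, so every package it
    # imports must be installed -- this is why Sphinx needs the same
    # environment (and Python version) as dts itself.
    module = importlib.import_module(module_name)
    docs = {module_name: inspect.getdoc(module) or ""}
    for name, obj in inspect.getmembers(module, inspect.isclass):
        docs[f"{module_name}.{name}"] = inspect.getdoc(obj) or ""
    return docs
```

For example, ``collect_docstrings("json")`` returns the stdlib ``json``
module's docstring plus those of its public classes; pointed at a dts
module, it would fail in exactly the situations where the Sphinx build
would fail.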
diff --git a/dts/poetry.lock b/dts/poetry.lock
index f7b3b6d602..91afe5231a 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -1,5 +1,16 @@
 # This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
 
+[[package]]
+name = "alabaster"
+version = "0.7.13"
+description = "A configurable sidebar-enabled Sphinx theme"
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
+    {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
+]
+
 [[package]]
 name = "attrs"
 version = "23.1.0"
@@ -18,6 +29,17 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
 tests = ["attrs[tests-no-zope]", "zope-interface"]
 tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
 
+[[package]]
+name = "babel"
+version = "2.12.1"
+description = "Internationalization utilities"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"},
+    {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"},
+]
+
 [[package]]
 name = "bcrypt"
 version = "4.0.1"
@@ -86,6 +108,17 @@ d = ["aiohttp (>=3.7.4)"]
 jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
 uvloop = ["uvloop (>=0.15.2)"]
 
+[[package]]
+name = "certifi"
+version = "2023.7.22"
+description = "Python package for providing Mozilla's CA Bundle."
+optional = false
+python-versions = ">=3.6"
+files = [
+    {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
+    {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
+]
+
 [[package]]
 name = "cffi"
 version = "1.15.1"
@@ -162,6 +195,90 @@ files = [
 [package.dependencies]
 pycparser = "*"
 
+[[package]]
+name = "charset-normalizer"
+version = "3.2.0"
+description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
+optional = false
+python-versions = ">=3.7.0"
+files = [
+    {file = "charset-normalizer-3.2.0.tar.gz", hash = "sha256:3bb3d25a8e6c0aedd251753a79ae98a093c7e7b471faa3aa9a93a81431987ace"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b87549028f680ca955556e3bd57013ab47474c3124dc069faa0b6545b6c9710"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7c70087bfee18a42b4040bb9ec1ca15a08242cf5867c58726530bdf3945672ed"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a103b3a7069b62f5d4890ae1b8f0597618f628b286b03d4bc9195230b154bfa9"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94aea8eff76ee6d1cdacb07dd2123a68283cb5569e0250feab1240058f53b623"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:db901e2ac34c931d73054d9797383d0f8009991e723dab15109740a63e7f902a"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b0dac0ff919ba34d4df1b6131f59ce95b08b9065233446be7e459f95554c0dc8"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:193cbc708ea3aca45e7221ae58f0fd63f933753a9bfb498a3b474878f12caaad"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09393e1b2a9461950b1c9a45d5fd251dc7c6f228acab64da1c9c0165d9c7765c"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:baacc6aee0b2ef6f3d308e197b5d7a81c0e70b06beae1f1fcacffdbd124fe0e3"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:bf420121d4c8dce6b889f0e8e4ec0ca34b7f40186203f06a946fa0276ba54029"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:c04a46716adde8d927adb9457bbe39cf473e1e2c2f5d0a16ceb837e5d841ad4f"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:aaf63899c94de41fe3cf934601b0f7ccb6b428c6e4eeb80da72c58eab077b19a"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d62e51710986674142526ab9f78663ca2b0726066ae26b78b22e0f5e571238dd"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-win32.whl", hash = "sha256:04e57ab9fbf9607b77f7d057974694b4f6b142da9ed4a199859d9d4d5c63fe96"},
+    {file = "charset_normalizer-3.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:48021783bdf96e3d6de03a6e39a1171ed5bd7e8bb93fc84cc649d11490f87cea"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4957669ef390f0e6719db3613ab3a7631e68424604a7b448f079bee145da6e09"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46fb8c61d794b78ec7134a715a3e564aafc8f6b5e338417cb19fe9f57a5a9bf2"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f779d3ad205f108d14e99bb3859aa7dd8e9c68874617c72354d7ecaec2a054ac"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f25c229a6ba38a35ae6e25ca1264621cc25d4d38dca2942a7fce0b67a4efe918"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2efb1bd13885392adfda4614c33d3b68dee4921fd0ac1d3988f8cbb7d589e72a"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f30b48dd7fa1474554b0b0f3fdfdd4c13b5c737a3c6284d3cdc424ec0ffff3a"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:246de67b99b6851627d945db38147d1b209a899311b1305dd84916f2b88526c6"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9bd9b3b31adcb054116447ea22caa61a285d92e94d710aa5ec97992ff5eb7cf3"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:8c2f5e83493748286002f9369f3e6607c565a6a90425a3a1fef5ae32a36d749d"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:3170c9399da12c9dc66366e9d14da8bf7147e1e9d9ea566067bbce7bb74bd9c2"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7a4826ad2bd6b07ca615c74ab91f32f6c96d08f6fcc3902ceeedaec8cdc3bcd6"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:3b1613dd5aee995ec6d4c69f00378bbd07614702a315a2cf6c1d21461fe17c23"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9e608aafdb55eb9f255034709e20d5a83b6d60c054df0802fa9c9883d0a937aa"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-win32.whl", hash = "sha256:f2a1d0fd4242bd8643ce6f98927cf9c04540af6efa92323e9d3124f57727bfc1"},
+    {file = "charset_normalizer-3.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:681eb3d7e02e3c3655d1b16059fbfb605ac464c834a0c629048a30fad2b27489"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c57921cda3a80d0f2b8aec7e25c8aa14479ea92b5b51b6876d975d925a2ea346"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41b25eaa7d15909cf3ac4c96088c1f266a9a93ec44f87f1d13d4a0e86c81b982"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f058f6963fd82eb143c692cecdc89e075fa0828db2e5b291070485390b2f1c9c"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a7647ebdfb9682b7bb97e2a5e7cb6ae735b1c25008a70b906aecca294ee96cf4"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eef9df1eefada2c09a5e7a40991b9fc6ac6ef20b1372abd48d2794a316dc0449"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e03b8895a6990c9ab2cdcd0f2fe44088ca1c65ae592b8f795c3294af00a461c3"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ee4006268ed33370957f55bf2e6f4d263eaf4dc3cfc473d1d90baff6ed36ce4a"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c4983bf937209c57240cff65906b18bb35e64ae872da6a0db937d7b4af845dd7"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:3bb7fda7260735efe66d5107fb7e6af6a7c04c7fce9b2514e04b7a74b06bf5dd"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:72814c01533f51d68702802d74f77ea026b5ec52793c791e2da806a3844a46c3"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:70c610f6cbe4b9fce272c407dd9d07e33e6bf7b4aa1b7ffb6f6ded8e634e3592"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-win32.whl", hash = "sha256:a401b4598e5d3f4a9a811f3daf42ee2291790c7f9d74b18d75d6e21dda98a1a1"},
+    {file = "charset_normalizer-3.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:c0b21078a4b56965e2b12f247467b234734491897e99c1d51cee628da9786959"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:95eb302ff792e12aba9a8b8f8474ab229a83c103d74a750ec0bd1c1eea32e669"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a100c6d595a7f316f1b6f01d20815d916e75ff98c27a01ae817439ea7726329"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6339d047dab2780cc6220f46306628e04d9750f02f983ddb37439ca47ced7149"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4b749b9cc6ee664a3300bb3a273c1ca8068c46be705b6c31cf5d276f8628a94"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a38856a971c602f98472050165cea2cdc97709240373041b69030be15047691f"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f87f746ee241d30d6ed93969de31e5ffd09a2961a051e60ae6bddde9ec3583aa"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:89f1b185a01fe560bc8ae5f619e924407efca2191b56ce749ec84982fc59a32a"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e1c8a2f4c69e08e89632defbfabec2feb8a8d99edc9f89ce33c4b9e36ab63037"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2f4ac36d8e2b4cc1aa71df3dd84ff8efbe3bfb97ac41242fbcfc053c67434f46"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a386ebe437176aab38c041de1260cd3ea459c6ce5263594399880bbc398225b2"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:ccd16eb18a849fd8dcb23e23380e2f0a354e8daa0c984b8a732d9cfaba3a776d"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:e6a5bf2cba5ae1bb80b154ed68a3cfa2fa00fde979a7f50d6598d3e17d9ac20c"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:45de3f87179c1823e6d9e32156fb14c1927fcc9aba21433f088fdfb555b77c10"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-win32.whl", hash = "sha256:1000fba1057b92a65daec275aec30586c3de2401ccdcd41f8a5c1e2c87078706"},
+    {file = "charset_normalizer-3.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:8b2c760cfc7042b27ebdb4a43a4453bd829a5742503599144d54a032c5dc7e9e"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:855eafa5d5a2034b4621c74925d89c5efef61418570e5ef9b37717d9c796419c"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:203f0c8871d5a7987be20c72442488a0b8cfd0f43b7973771640fc593f56321f"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e857a2232ba53ae940d3456f7533ce6ca98b81917d47adc3c7fd55dad8fab858"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e86d77b090dbddbe78867a0275cb4df08ea195e660f1f7f13435a4649e954e5"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c4fb39a81950ec280984b3a44f5bd12819953dc5fa3a7e6fa7a80db5ee853952"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2dee8e57f052ef5353cf608e0b4c871aee320dd1b87d351c28764fc0ca55f9f4"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8700f06d0ce6f128de3ccdbc1acaea1ee264d2caa9ca05daaf492fde7c2a7200"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1920d4ff15ce893210c1f0c0e9d19bfbecb7983c76b33f046c13a8ffbd570252"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c1c76a1743432b4b60ab3358c937a3fe1341c828ae6194108a94c69028247f22"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f7560358a6811e52e9c4d142d497f1a6e10103d3a6881f18d04dbce3729c0e2c"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:c8063cf17b19661471ecbdb3df1c84f24ad2e389e326ccaf89e3fb2484d8dd7e"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:cd6dbe0238f7743d0efe563ab46294f54f9bc8f4b9bcf57c3c666cc5bc9d1299"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:1249cbbf3d3b04902ff081ffbb33ce3377fa6e4c7356f759f3cd076cc138d020"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-win32.whl", hash = "sha256:6c409c0deba34f147f77efaa67b8e4bb83d2f11c8806405f76397ae5b8c0d1c9"},
+    {file = "charset_normalizer-3.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:7095f6fbfaa55defb6b733cfeb14efaae7a29f0b59d8cf213be4e7ca0b857b80"},
+    {file = "charset_normalizer-3.2.0-py3-none-any.whl", hash = "sha256:8e098148dd37b4ce3baca71fb394c81dc5d9c7728c95df695d2dca218edf40e6"},
+]
+
 [[package]]
 name = "click"
 version = "8.1.6"
@@ -232,6 +349,17 @@ ssh = ["bcrypt (>=3.1.5)"]
 test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
 test-randomorder = ["pytest-randomly"]
 
+[[package]]
+name = "docutils"
+version = "0.18.1"
+description = "Docutils -- Python Documentation Utilities"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
+    {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
+]
+
 [[package]]
 name = "fabric"
 version = "2.7.1"
@@ -252,6 +380,28 @@ pathlib2 = "*"
 pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
 testing = ["mock (>=2.0.0,<3.0)"]
 
+[[package]]
+name = "idna"
+version = "3.4"
+description = "Internationalized Domain Names in Applications (IDNA)"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
+    {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
+]
+
+[[package]]
+name = "imagesize"
+version = "1.4.1"
+description = "Getting image size from png/jpeg/jpeg2000/gif file"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
+files = [
+    {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
+    {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
+]
+
 [[package]]
 name = "invoke"
 version = "1.7.3"
@@ -280,6 +430,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
 plugins = ["setuptools"]
 requirements-deprecated-finder = ["pip-api", "pipreqs"]
 
+[[package]]
+name = "jinja2"
+version = "3.1.2"
+description = "A very fast and expressive template engine."
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
+    {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+]
+
+[package.dependencies]
+MarkupSafe = ">=2.0"
+
+[package.extras]
+i18n = ["Babel (>=2.7)"]
+
 [[package]]
 name = "jsonpatch"
 version = "1.33"
@@ -340,6 +507,65 @@ files = [
 [package.dependencies]
 referencing = ">=0.28.0"
 
+[[package]]
+name = "markupsafe"
+version = "2.1.3"
+description = "Safely add untrusted strings to HTML/XML markup."
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
+    {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
+    {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
+    {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
+    {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
+    {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
+    {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
+]
+
 [[package]]
 name = "mccabe"
 version = "0.7.0"
@@ -404,6 +630,17 @@ files = [
     {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
 ]
 
+[[package]]
+name = "packaging"
+version = "23.1"
+description = "Core utilities for Python packages"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "packaging-23.1-py3-none-any.whl", hash = "sha256:994793af429502c4ea2ebf6bf664629d07c1a9fe974af92966e4b8d2df7edc61"},
+    {file = "packaging-23.1.tar.gz", hash = "sha256:a392980d2b6cffa644431898be54b0045151319d1e7ec34f0cfed48767dd334f"},
+]
+
 [[package]]
 name = "paramiko"
 version = "3.2.0"
@@ -515,6 +752,20 @@ files = [
     {file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
 ]
 
+[[package]]
+name = "pygments"
+version = "2.15.1"
+description = "Pygments is a syntax highlighting package written in Python."
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "Pygments-2.15.1-py3-none-any.whl", hash = "sha256:db2db3deb4b4179f399a09054b023b6a586b76499d36965813c71aa8ed7b5fd1"},
+    {file = "Pygments-2.15.1.tar.gz", hash = "sha256:8ace4d3c1dd481894b2005f560ead0f9f19ee64fe983366be1a21e171d12775c"},
+]
+
+[package.extras]
+plugins = ["importlib-metadata"]
+
 [[package]]
 name = "pylama"
 version = "8.4.1"
@@ -632,6 +883,27 @@ files = [
 attrs = ">=22.2.0"
 rpds-py = ">=0.7.0"
 
+[[package]]
+name = "requests"
+version = "2.31.0"
+description = "Python HTTP for Humans."
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
+    {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
+]
+
+[package.dependencies]
+certifi = ">=2017.4.17"
+charset-normalizer = ">=2,<4"
+idna = ">=2.5,<4"
+urllib3 = ">=1.21.1,<3"
+
+[package.extras]
+socks = ["PySocks (>=1.5.6,!=1.5.7)"]
+use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
+
 [[package]]
 name = "rpds-py"
 version = "0.9.2"
@@ -775,6 +1047,162 @@ files = [
     {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
 ]
 
+[[package]]
+name = "sphinx"
+version = "6.2.1"
+description = "Python documentation generator"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
+    {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
+]
+
+[package.dependencies]
+alabaster = ">=0.7,<0.8"
+babel = ">=2.9"
+colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
+docutils = ">=0.18.1,<0.20"
+imagesize = ">=1.3"
+Jinja2 = ">=3.0"
+packaging = ">=21.0"
+Pygments = ">=2.13"
+requests = ">=2.25.0"
+snowballstemmer = ">=2.0"
+sphinxcontrib-applehelp = "*"
+sphinxcontrib-devhelp = "*"
+sphinxcontrib-htmlhelp = ">=2.0.0"
+sphinxcontrib-jsmath = "*"
+sphinxcontrib-qthelp = "*"
+sphinxcontrib-serializinghtml = ">=1.1.5"
+
+[package.extras]
+docs = ["sphinxcontrib-websupport"]
+lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
+test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
+
+[[package]]
+name = "sphinx-rtd-theme"
+version = "1.2.2"
+description = "Read the Docs theme for Sphinx"
+optional = false
+python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
+files = [
+    {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
+    {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
+]
+
+[package.dependencies]
+docutils = "<0.19"
+sphinx = ">=1.6,<7"
+sphinxcontrib-jquery = ">=4,<5"
+
+[package.extras]
+dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
+
+[[package]]
+name = "sphinxcontrib-applehelp"
+version = "1.0.4"
+description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-applehelp-1.0.4.tar.gz", hash = "sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e"},
+    {file = "sphinxcontrib_applehelp-1.0.4-py3-none-any.whl", hash = "sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-devhelp"
+version = "1.0.2"
+description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"},
+    {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-htmlhelp"
+version = "2.0.1"
+description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
+optional = false
+python-versions = ">=3.8"
+files = [
+    {file = "sphinxcontrib-htmlhelp-2.0.1.tar.gz", hash = "sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff"},
+    {file = "sphinxcontrib_htmlhelp-2.0.1-py3-none-any.whl", hash = "sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["html5lib", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-jquery"
+version = "4.1"
+description = "Extension to include jQuery on newer Sphinx releases"
+optional = false
+python-versions = ">=2.7"
+files = [
+    {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
+    {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
+]
+
+[package.dependencies]
+Sphinx = ">=1.8"
+
+[[package]]
+name = "sphinxcontrib-jsmath"
+version = "1.0.1"
+description = "A sphinx extension which renders display math in HTML via JavaScript"
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
+    {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
+]
+
+[package.extras]
+test = ["flake8", "mypy", "pytest"]
+
+[[package]]
+name = "sphinxcontrib-qthelp"
+version = "1.0.3"
+description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"},
+    {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
+[[package]]
+name = "sphinxcontrib-serializinghtml"
+version = "1.1.5"
+description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
+optional = false
+python-versions = ">=3.5"
+files = [
+    {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"},
+    {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
+]
+
+[package.extras]
+lint = ["docutils-stubs", "flake8", "mypy"]
+test = ["pytest"]
+
 [[package]]
 name = "toml"
 version = "0.10.2"
@@ -819,6 +1247,23 @@ files = [
     {file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
 ]
 
+[[package]]
+name = "urllib3"
+version = "2.0.4"
+description = "HTTP library with thread-safe connection pooling, file post, and more."
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "urllib3-2.0.4-py3-none-any.whl", hash = "sha256:de7df1803967d2c2a98e4b11bb7d6bd9210474c46e8a0401514e3a42a75ebde4"},
+    {file = "urllib3-2.0.4.tar.gz", hash = "sha256:8d22f86aae8ef5e410d4f539fde9ce6b2113a001bb4d189e0aed70642d602b11"},
+]
+
+[package.extras]
+brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
+secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
+socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
+zstd = ["zstandard (>=0.18.0)"]
+
 [[package]]
 name = "warlock"
 version = "2.0.1"
@@ -837,4 +1282,4 @@ jsonschema = ">=4,<5"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
+content-hash = "fea1a3eddd1286d2ccd3bdb61c6ce085403f31567dbe4f55b6775bcf1e325372"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6762edfa6b..159940ce02 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -34,6 +34,13 @@ pylama = "^8.4.1"
 pyflakes = "^2.5.0"
 toml = "^0.10.2"
 
+[tool.poetry.group.docs]
+optional = true
+
+[tool.poetry.group.docs.dependencies]
+sphinx = "<7"
+sphinx-rtd-theme = "^1.2.2"
+
 [build-system]
 requires = ["poetry-core>=1.0.0"]
 build-backend = "poetry.core.masonry.api"
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v4 3/4] dts: add doc generation
  2023-08-31 10:04     ` [RFC PATCH v4 " Juraj Linkeš
  2023-08-31 10:04       ` [RFC PATCH v4 1/4] dts: code adjustments for sphinx Juraj Linkeš
  2023-08-31 10:04       ` [RFC PATCH v4 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2023-08-31 10:04       ` Juraj Linkeš
  2023-09-20  7:08         ` Juraj Linkeš
  2023-10-26 16:43         ` Yoan Picchi
  2023-08-31 10:04       ` [RFC PATCH v4 4/4] dts: format docstrings to google format Juraj Linkeš
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
  4 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-08-31 10:04 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev, Juraj Linkeš

The tool used to generate developer docs is Sphinx, which is already
used in DPDK. The configuration is kept the same to preserve the style.

Sphinx generates the documentation from Python docstrings. The docstring
format most suitable for DTS seems to be the Google format [0], which
requires the sphinx.ext.napoleon extension.

There are two requirements for building DTS docs:
* The same Python version as DTS or higher, because Sphinx imports the
  code.
* The same Python packages as DTS, for the same reason.

[0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
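
The requirement above exists because Sphinx autodoc works by importing the
documented modules and reading docstrings off the live objects, rather than
parsing the source text. A minimal sketch of that mechanism (using a stdlib
module as a stand-in for a DTS module; `collect_docstring` is an illustrative
helper, not part of DTS or Sphinx):

```python
# Sketch of what Sphinx autodoc does under the hood: import the module
# and read the docstring from the imported object. If the import fails
# (wrong Python version, missing dependency), the doc build fails too --
# hence the need for the same Python environment as DTS itself.
import importlib


def collect_docstring(module_name: str) -> str:
    """Import a module the way autodoc would and return its docstring."""
    module = importlib.import_module(module_name)
    return module.__doc__ or ""


# Any importable module works; DTS modules would be imported the same way.
print(collect_docstring("json").splitlines()[0])
```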

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 buildtools/call-sphinx-build.py | 29 ++++++++++++-------
 doc/api/meson.build             |  1 +
 doc/guides/conf.py              | 32 +++++++++++++++++----
 doc/guides/meson.build          |  1 +
 doc/guides/tools/dts.rst        | 29 +++++++++++++++++++
 dts/doc/doc-index.rst           | 17 +++++++++++
 dts/doc/meson.build             | 50 +++++++++++++++++++++++++++++++++
 dts/meson.build                 | 16 +++++++++++
 meson.build                     |  1 +
 9 files changed, 161 insertions(+), 15 deletions(-)
 create mode 100644 dts/doc/doc-index.rst
 create mode 100644 dts/doc/meson.build
 create mode 100644 dts/meson.build

diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
index 39a60d09fa..c2f3acfb1d 100755
--- a/buildtools/call-sphinx-build.py
+++ b/buildtools/call-sphinx-build.py
@@ -3,37 +3,46 @@
 # Copyright(c) 2019 Intel Corporation
 #
 
+import argparse
 import sys
 import os
 from os.path import join
 from subprocess import run, PIPE, STDOUT
 from packaging.version import Version
 
-# assign parameters to variables
-(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
+parser = argparse.ArgumentParser()
+parser.add_argument('sphinx')
+parser.add_argument('version')
+parser.add_argument('src')
+parser.add_argument('dst')
+parser.add_argument('--dts-root', default='.')
+args, extra_args = parser.parse_known_args()
 
 # set the version in environment for sphinx to pick up
-os.environ['DPDK_VERSION'] = version
+os.environ['DPDK_VERSION'] = args.version
+os.environ['DTS_ROOT'] = args.dts_root
 
 # for sphinx version >= 1.7 add parallelism using "-j auto"
-ver = run([sphinx, '--version'], stdout=PIPE,
+ver = run([args.sphinx, '--version'], stdout=PIPE,
           stderr=STDOUT).stdout.decode().split()[-1]
-sphinx_cmd = [sphinx] + extra_args
+sphinx_cmd = [args.sphinx] + extra_args
 if Version(ver) >= Version('1.7'):
     sphinx_cmd += ['-j', 'auto']
 
 # find all the files sphinx will process so we can write them as dependencies
 srcfiles = []
-for root, dirs, files in os.walk(src):
+for root, dirs, files in os.walk(args.src):
     srcfiles.extend([join(root, f) for f in files])
 
 # run sphinx, putting the html output in a "html" directory
-with open(join(dst, 'sphinx_html.out'), 'w') as out:
-    process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
-                  stdout=out)
+with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
+    process = run(
+        sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
+        stdout=out
+    )
 
 # create a gcc format .d file giving all the dependencies of this doc build
-with open(join(dst, '.html.d'), 'w') as d:
+with open(join(args.dst, '.html.d'), 'w') as d:
     d.write('html: ' + ' '.join(srcfiles) + '\n')
 
 sys.exit(process.returncode)
diff --git a/doc/api/meson.build b/doc/api/meson.build
index 2876a78a7e..1f0c725a94 100644
--- a/doc/api/meson.build
+++ b/doc/api/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
 
+doc_api_build_dir = meson.current_build_dir()
 doxygen = find_program('doxygen', required: get_option('enable_docs'))
 
 if not doxygen.found()
diff --git a/doc/guides/conf.py b/doc/guides/conf.py
index 0f7ff5282d..737e5a5688 100644
--- a/doc/guides/conf.py
+++ b/doc/guides/conf.py
@@ -7,10 +7,9 @@
 from sphinx import __version__ as sphinx_version
 from os import listdir
 from os import environ
-from os.path import basename
-from os.path import dirname
+from os.path import basename, dirname
 from os.path import join as path_join
-from sys import argv, stderr
+from sys import argv, stderr, path
 
 import configparser
 
@@ -24,6 +23,29 @@
           file=stderr)
     pass
 
+extensions = ['sphinx.ext.napoleon']
+
+# Python docstring options
+autodoc_default_options = {
+    'members': True,
+    'member-order': 'bysource',
+    'show-inheritance': True,
+}
+autodoc_typehints = 'both'
+autodoc_typehints_format = 'short'
+napoleon_numpy_docstring = False
+napoleon_attr_annotations = True
+napoleon_use_ivar = True
+napoleon_use_rtype = False
+add_module_names = False
+toc_object_entries = False
+
+# Sidebar config
+html_theme_options = {
+    'collapse_navigation': False,
+    'navigation_depth': -1,
+}
+
 stop_on_error = ('-W' in argv)
 
 project = 'Data Plane Development Kit'
@@ -35,8 +57,8 @@
 html_show_copyright = False
 highlight_language = 'none'
 
-release = environ.setdefault('DPDK_VERSION', "None")
-version = release
+path.append(environ.get('DTS_ROOT'))
+version = environ.setdefault('DPDK_VERSION', "None")
 
 master_doc = 'index'
 
diff --git a/doc/guides/meson.build b/doc/guides/meson.build
index 51f81da2e3..8933d75f6b 100644
--- a/doc/guides/meson.build
+++ b/doc/guides/meson.build
@@ -1,6 +1,7 @@
 # SPDX-License-Identifier: BSD-3-Clause
 # Copyright(c) 2018 Intel Corporation
 
+doc_guides_source_dir = meson.current_source_dir()
 sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
 
 if not sphinx.found()
diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 32c18ee472..98923b1467 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -335,3 +335,32 @@ There are three tools used in DTS to help with code checking, style and formatti
 These three tools are all used in ``devtools/dts-check-format.sh``,
 the DTS code check and format script.
 Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
+
+
+Building DTS API docs
+---------------------
+
+To build DTS API docs, install the dependencies with Poetry, then enter its shell:
+
+   .. code-block:: console
+
+      poetry install --with docs
+      poetry shell
+
+
+Build commands
+~~~~~~~~~~~~~~
+
+The documentation is built using the standard DPDK build system.
+
+After entering Poetry's shell, build the documentation with:
+
+   .. code-block:: console
+
+      ninja -C build dts/doc
+
+The output is generated in ``build/doc/api/dts/html``.
+
+.. note::
+
+   Make sure to fix any Sphinx warnings when adding or updating docstrings.
diff --git a/dts/doc/doc-index.rst b/dts/doc/doc-index.rst
new file mode 100644
index 0000000000..f5dcd553f2
--- /dev/null
+++ b/dts/doc/doc-index.rst
@@ -0,0 +1,17 @@
+.. DPDK Test Suite documentation.
+
+Welcome to DPDK Test Suite's documentation!
+===========================================
+
+.. toctree::
+   :titlesonly:
+   :caption: Contents:
+
+   framework
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
diff --git a/dts/doc/meson.build b/dts/doc/meson.build
new file mode 100644
index 0000000000..8e70eabc51
--- /dev/null
+++ b/dts/doc/meson.build
@@ -0,0 +1,50 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+sphinx = find_program('sphinx-build')
+sphinx_apidoc = find_program('sphinx-apidoc')
+
+if not sphinx.found() or not sphinx_apidoc.found()
+    subdir_done()
+endif
+
+dts_api_framework_dir = join_paths(dts_dir, 'framework')
+dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
+dts_api_src = custom_target('dts_api_src',
+        output: 'modules.rst',
+        command: [sphinx_apidoc, '--append-syspath', '--force',
+            '--module-first', '--separate', '-V', meson.project_version(),
+            '-o', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
+            dts_api_framework_dir],
+        build_by_default: false)
+doc_targets += dts_api_src
+doc_target_names += 'DTS_API_sphinx_sources'
+
+cp = find_program('cp')
+cp_index = custom_target('cp_index',
+        input: 'doc-index.rst',
+        output: 'index.rst',
+        depends: dts_api_src,
+        command: [cp, '@INPUT@', join_paths(dts_api_build_dir, 'index.rst')],
+        build_by_default: false)
+doc_targets += cp_index
+doc_target_names += 'DTS_API_sphinx_index'
+
+extra_sphinx_args = ['-a', '-c', doc_guides_source_dir]
+if get_option('werror')
+    extra_sphinx_args += '-W'
+endif
+
+htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
+dts_api_html = custom_target('dts_api_html',
+        output: 'html',
+        depends: cp_index,
+        command: ['DTS_ROOT=@0@'.format(dts_dir),
+            sphinx_wrapper, sphinx, meson.project_version(),
+            dts_api_build_dir, dts_api_build_dir,
+            '--dts-root', dts_dir, extra_sphinx_args],
+        build_by_default: false,
+        install: false,
+        install_dir: htmldir)
+doc_targets += dts_api_html
+doc_target_names += 'DTS_API_HTML'
diff --git a/dts/meson.build b/dts/meson.build
new file mode 100644
index 0000000000..17bda07636
--- /dev/null
+++ b/dts/meson.build
@@ -0,0 +1,16 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+doc_targets = []
+doc_target_names = []
+dts_dir = meson.current_source_dir()
+
+subdir('doc')
+
+if doc_targets.length() == 0
+    message = 'No docs targets found'
+else
+    message = 'Built docs:'
+endif
+run_target('dts/doc', command: [echo, message, doc_target_names],
+    depends: doc_targets)
diff --git a/meson.build b/meson.build
index 39cb73846d..4d34dc531c 100644
--- a/meson.build
+++ b/meson.build
@@ -85,6 +85,7 @@ subdir('app')
 
 # build docs
 subdir('doc')
+subdir('dts')
 
 # build any examples explicitly requested - useful for developers - and
 # install any example code into the appropriate install path
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [RFC PATCH v4 4/4] dts: format docstrings to google format
  2023-08-31 10:04     ` [RFC PATCH v4 " Juraj Linkeš
                         ` (2 preceding siblings ...)
  2023-08-31 10:04       ` [RFC PATCH v4 3/4] dts: add doc generation Juraj Linkeš
@ 2023-08-31 10:04       ` Juraj Linkeš
  2023-09-01 17:02         ` Jeremy Spewock
  2023-10-31 12:10         ` Yoan Picchi
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
  4 siblings, 2 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-08-31 10:04 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev, Juraj Linkeš

WIP: only one module is reformatted to serve as a demonstration.

The Google format is documented here [0].

[0]: https://google.github.io/styleguide/pyguide.html
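
For reference, a Google-format docstring uses indented ``Args:``/``Returns:``
sections that sphinx.ext.napoleon translates into reST. A minimal hedged
sketch (``pick_lcores`` is a hypothetical function invented for illustration,
not part of the DTS framework):

```python
def pick_lcores(count: int, ascending: bool = True) -> list[int]:
    """Return logical core ids in the requested order.

    This docstring follows the Google format that napoleon parses:
    a one-line summary, a blank line, then named sections.

    Args:
        count: How many logical core ids to return.
        ascending: If True, return the lowest ids first;
            otherwise start with the highest id.

    Returns:
        A list of logical core ids.
    """
    cores = list(range(count))
    return cores if ascending else cores[::-1]


print(pick_lcores(3, ascending=False))
```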

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/testbed_model/node.py | 171 +++++++++++++++++++---------
 1 file changed, 118 insertions(+), 53 deletions(-)

diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index 23efa79c50..619743ebe7 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -3,8 +3,13 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
-"""
-A node is a generic host that DTS connects to and manages.
+"""Common functionality for node management.
+
+There's a base class, Node, that's supposed to be extended by other classes
+with functionality specific to that node type.
+The only part that can be used standalone is the Node.skip_setup static method,
+which is a decorator used to skip method execution if skip_setup is passed
+by the user on the command line or in an environment variable.
 """
 
 from abc import ABC
@@ -35,10 +40,26 @@
 
 
 class Node(ABC):
-    """
-    Basic class for node management. This class implements methods that
-    manage a node, such as information gathering (of CPU/PCI/NIC) and
-    environment setup.
+    """The base class for node management.
+
+    It shouldn't be instantiated, but rather extended.
+    It implements common methods to manage any node:
+
+       * connection to the node
+       * information gathering of CPU
+       * hugepages setup
+
+    Args:
+        node_config: The config from the input configuration file.
+
+    Attributes:
+        main_session: The primary OS-agnostic remote session used
+            to communicate with the node.
+        config: The configuration used to create the node.
+        name: The name of the node.
+        lcores: The list of logical cores that DTS can use on the node.
+            It's derived from logical cores present on the node and user configuration.
+        ports: The ports of this node specified in user configuration.
     """
 
     main_session: OSSession
@@ -77,9 +98,14 @@ def _init_ports(self) -> None:
             self.configure_port_state(port)
 
     def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        Perform the execution setup that will be done for each execution
-        this node is part of.
+        """Execution setup steps.
+
+        Configure hugepages and call self._set_up_execution where
+        the rest of the configuration steps (if any) are implemented.
+
+        Args:
+            execution_config: The execution configuration according to which
+                the setup steps will be taken.
         """
         self._setup_hugepages()
         self._set_up_execution(execution_config)
@@ -88,58 +114,78 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
             self.virtual_devices.append(VirtualDevice(vdev))
 
     def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution setup steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution setup steps.
         """
 
     def tear_down_execution(self) -> None:
-        """
-        Perform the execution teardown that will be done after each execution
-        this node is part of concludes.
+        """Execution teardown steps.
+
+        There are currently no execution teardown steps
+        common to all DTS node types.
         """
         self.virtual_devices = []
         self._tear_down_execution()
 
     def _tear_down_execution(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional execution teardown steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional execution teardown steps.
         """
 
     def set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        Perform the build target setup that will be done for each build target
-        tested on this node.
+        """Build target setup steps.
+
+        There are currently no build target setup steps
+        common to all DTS node types.
+
+        Args:
+            build_target_config: The build target configuration according to which
+                the setup steps will be taken.
         """
         self._set_up_build_target(build_target_config)
 
     def _set_up_build_target(
         self, build_target_config: BuildTargetConfiguration
     ) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target setup steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional build target setup steps.
         """
 
     def tear_down_build_target(self) -> None:
-        """
-        Perform the build target teardown that will be done after each build target
-        tested on this node.
+        """Build target teardown steps.
+
+        There are currently no build target teardown steps
+        common to all DTS node types.
         """
         self._tear_down_build_target()
 
     def _tear_down_build_target(self) -> None:
-        """
-        This method exists to be optionally overwritten by derived classes and
-        is not decorated so that the derived class doesn't have to use the decorator.
+        """Optional additional build target teardown steps for derived classes.
+
+        Derived classes should overwrite this
+        if they want to add additional build target teardown steps.
         """
 
     def create_session(self, name: str) -> OSSession:
-        """
-        Create and return a new OSSession tailored to the remote OS.
+        """Create and return a new OS-agnostic remote session.
+
+        The returned session won't be used by the node creating it.
+        The session must be used by the caller.
+        It will be cleaned up automatically.
+
+        Args:
+            name: The name of the session.
+
+        Returns:
+            A new OS-agnostic remote session.
         """
         session_name = f"{self.name} {name}"
         connection = create_session(
@@ -186,14 +232,24 @@ def filter_lcores(
         filter_specifier: LogicalCoreCount | LogicalCoreList,
         ascending: bool = True,
     ) -> list[LogicalCore]:
-        """
-        Filter the LogicalCores found on the Node according to
-        a LogicalCoreCount or a LogicalCoreList.
+        """Filter the node's logical cores that DTS can use.
 
-        If ascending is True, use cores with the lowest numerical id first
-        and continue in ascending order. If False, start with the highest
-        id and continue in descending order. This ordering affects which
-        sockets to consider first as well.
+        Logical cores that DTS can use are ones that are present on the node,
+        but filtered according to user config.
+        The filter_specifier will filter cores from those logical cores.
+
+        Args:
+            filter_specifier: Two different filters can be used, one that specifies
+                the number of logical cores per core, cores per socket and
+                the number of sockets,
+                the other that specifies a logical core list.
+            ascending: If True, use cores with the lowest numerical id first
+                and continue in ascending order. If False, start with the highest
+                id and continue in descending order. This ordering affects which
+                sockets to consider first as well.
+
+        Returns:
+            A list of logical cores.
         """
         self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
         return lcore_filter(
@@ -203,17 +259,14 @@ def filter_lcores(
         ).filter()
 
     def _get_remote_cpus(self) -> None:
-        """
-        Scan CPUs in the remote OS and store a list of LogicalCores.
-        """
+        """Scan CPUs in the remote OS and store a list of LogicalCores."""
         self._logger.info("Getting CPU information.")
         self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
 
     def _setup_hugepages(self):
-        """
-        Setup hugepages on the Node. Different architectures can supply different
-        amounts of memory for hugepages and numa-based hugepage allocation may need
-        to be considered.
+        """Setup hugepages on the Node.
+
+        Configure the hugepages only if they're specified in user configuration.
         """
         if self.config.hugepages:
             self.main_session.setup_hugepages(
@@ -221,8 +274,11 @@ def _setup_hugepages(self):
             )
 
     def configure_port_state(self, port: Port, enable: bool = True) -> None:
-        """
-        Enable/disable port.
+        """Enable/disable port.
+
+        Args:
+            port: The port to enable/disable.
+            enable: True to enable, False to disable.
         """
         self.main_session.configure_port_state(port, enable)
 
@@ -232,15 +288,19 @@ def configure_port_ip_address(
         port: Port,
         delete: bool = False,
     ) -> None:
-        """
-        Configure the IP address of a port on this node.
+        """Add an IP address to a port on this node.
+
+        Args:
+            address: The IP address with mask in CIDR format.
+                Can be either IPv4 or IPv6.
+            port: The port to which to add the address.
+            delete: If True, will delete the address from the port
+                instead of adding it.
         """
         self.main_session.configure_port_ip_address(address, port, delete)
 
     def close(self) -> None:
-        """
-        Close all connections and free other resources.
-        """
+        """Close all connections and free other resources."""
         if self.main_session:
             self.main_session.close()
         for session in self._other_sessions:
@@ -249,6 +309,11 @@ def close(self) -> None:
 
     @staticmethod
     def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
+        """A decorator that skips the decorated function.
+
+        When the user requests skipping setup, the decorator replaces
+        the decorated function with an empty lambda that does nothing.
+        """
         if SETTINGS.skip_setup:
             return lambda *args: None
         else:
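For illustration, the skip pattern above can be sketched standalone (a minimal
re-creation; the module-level `skip_setup` flag here is a stand-in for DTS's
`SETTINGS.skip_setup`, and the function names are hypothetical):

```python
from typing import Any, Callable

# Stand-in for SETTINGS.skip_setup, normally set from the --skip-setup
# CLI flag or the DTS_SKIP_SETUP environment variable.
skip_setup = True


def skip_if_requested(func: Callable[..., Any]) -> Callable[..., Any]:
    """Replace func with a no-op lambda when setup should be skipped."""
    if skip_setup:
        return lambda *args: None
    return func


@skip_if_requested
def set_up_build_target(target: str) -> str:
    return f"setting up {target}"


# The decision is made at decoration (import) time, mirroring the patch:
# with skip_setup=True the call below does nothing and returns None.
result = set_up_build_target("x86_64-native-linuxapp-gcc")
print(result)  # None
```

Note that because the check runs when the decorator is applied, toggling the
flag afterwards has no effect on functions that are already decorated.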
-- 
2.34.1



* Re: [RFC PATCH v4 4/4] dts: format docstrigs to google format
  2023-08-31 10:04       ` [RFC PATCH v4 4/4] dts: format docstrigs to google format Juraj Linkeš
@ 2023-09-01 17:02         ` Jeremy Spewock
  2023-10-31 12:10         ` Yoan Picchi
  1 sibling, 0 replies; 393+ messages in thread
From: Jeremy Spewock @ 2023-09-01 17:02 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson, probb, dev


On Thu, Aug 31, 2023 at 6:04 AM Juraj Linkeš <juraj.linkes@pantheon.tech>
wrote:

> WIP: only one module is reformatted to serve as a demonstration.
>
> The google format is documented here [0].
>
> [0]: https://google.github.io/styleguide/pyguide.html
>
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
>  dts/framework/testbed_model/node.py | 171 +++++++++++++++++++---------
>  1 file changed, 118 insertions(+), 53 deletions(-)
>
> diff --git a/dts/framework/testbed_model/node.py
> b/dts/framework/testbed_model/node.py
> index 23efa79c50..619743ebe7 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -3,8 +3,13 @@
>  # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>  # Copyright(c) 2022-2023 University of New Hampshire
>
> -"""
> -A node is a generic host that DTS connects to and manages.
> +"""Common functionality for node management.
> +
> +There's a base class, Node, that's supposed to be extended by other
> classes
> +with functionality specific to that node type.
> +The only part that can be used standalone is the Node.skip_setup static
> method,
> +which is a decorator used to skip method execution if skip_setup is passed
> +by the user on the cmdline or in an env variable.
>  """
>
>  from abc import ABC
> @@ -35,10 +40,26 @@
>
>
>  class Node(ABC):
> -    """
> -    Basic class for node management. This class implements methods that
> -    manage a node, such as information gathering (of CPU/PCI/NIC) and
> -    environment setup.
> +    """The base class for node management.
> +
> +    It shouldn't be instantiated, but rather extended.
> +    It implements common methods to manage any node:
> +
> +       * connection to the node
> +       * information gathering of CPU
> +       * hugepages setup
> +
> +    Arguments:
>

My only comment would be that we might want to make this "Args" instead of
"Arguments", just to line up with the rest of the comments, but like you
said, this is just a proof of concept really.

Acked-by: Jeremy Spweock <jspweock@iol.unh.edu>



* Re: [RFC PATCH v4 3/4] dts: add doc generation
  2023-08-31 10:04       ` [RFC PATCH v4 3/4] dts: add doc generation Juraj Linkeš
@ 2023-09-20  7:08         ` Juraj Linkeš
  2023-10-26 16:43         ` Yoan Picchi
  1 sibling, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-09-20  7:08 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev

<snip>

> diff --git a/doc/guides/conf.py b/doc/guides/conf.py
> index 0f7ff5282d..737e5a5688 100644
> --- a/doc/guides/conf.py
> +++ b/doc/guides/conf.py
> @@ -7,10 +7,9 @@
>  from sphinx import __version__ as sphinx_version
>  from os import listdir
>  from os import environ
> -from os.path import basename
> -from os.path import dirname
> +from os.path import basename, dirname
>  from os.path import join as path_join
> -from sys import argv, stderr
> +from sys import argv, stderr, path
>
>  import configparser
>
> @@ -24,6 +23,29 @@
>            file=stderr)
>      pass
>
> +extensions = ['sphinx.ext.napoleon']
> +
> +# Python docstring options
> +autodoc_default_options = {
> +    'members': True,
> +    'member-order': 'bysource',
> +    'show-inheritance': True,
> +}
> +autodoc_typehints = 'both'
> +autodoc_typehints_format = 'short'
> +napoleon_numpy_docstring = False
> +napoleon_attr_annotations = True
> +napoleon_use_ivar = True
> +napoleon_use_rtype = False
> +add_module_names = False
> +toc_object_entries = False
> +
> +# Sidebar config
> +html_theme_options = {
> +    'collapse_navigation': False,
> +    'navigation_depth': -1,
> +}
> +
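For context, the napoleon settings quoted above are meant to render
Google-style docstrings such as this one (a hypothetical function written for
illustration, not code from the patch):

```python
def filter_cores(count: int, ascending: bool = True) -> list[int]:
    """Filter the available logical cores.

    Args:
        count: How many core ids to return.
        ascending: If True, return the lowest core ids first;
            otherwise start from the highest.

    Returns:
        The filtered list of logical core ids.
    """
    # Assume 8 cores (ids 0-7) for the purposes of this example.
    cores = sorted(range(8), reverse=not ascending)
    return cores[:count]


print(filter_cores(2))                   # [0, 1]
print(filter_cores(2, ascending=False))  # [7, 6]
```

With the configuration above, napoleon converts the Args and Returns sections
into Sphinx fields, and types are taken from the signature annotations
(autodoc_typehints = 'both').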

Thomas, Bruce,

I've added this configuration, which modifies the sidebar a bit. This
affects the DPDK docs, so I'd like to know whether it's permissible.
I think the sidebar works better this way even with the DPDK docs, but
that may be a personal preference.

Let me know what you think.

>  stop_on_error = ('-W' in argv)
>
>  project = 'Data Plane Development Kit'


* Re: [RFC PATCH v4 1/4] dts: code adjustments for sphinx
  2023-08-31 10:04       ` [RFC PATCH v4 1/4] dts: code adjustments for sphinx Juraj Linkeš
@ 2023-10-22 14:30         ` Yoan Picchi
  2023-10-23  6:44           ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-10-22 14:30 UTC (permalink / raw)
  To: Juraj Linkeš,
	thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev

On 8/31/23 11:04, Juraj Linkeš wrote:
> sphinx-build only imports the Python modules when building the
> documentation; it doesn't run DTS. This requires changes that make the
> code importable without running it. This means:
> * properly guarding argument parsing in the if __name__ == '__main__'
>    block,
> * giving the logger used by the DTS runner the same treatment so that it
>    doesn't create unnecessary log files,
> * moving the defaults for the global variables out of argument parsing,
>    since DTS uses the parsed arguments to construct an object holding
>    global variables,
> * making imports more granular, since importing the remote_session
>    module from framework resulted in circular imports (one module trying
>    to import another).
> 
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
>   dts/framework/config/__init__.py              |  3 -
>   dts/framework/dts.py                          | 34 ++++++-
>   dts/framework/remote_session/__init__.py      | 41 ++++-----
>   .../interactive_remote_session.py             |  0
>   .../{remote => }/interactive_shell.py         |  0
>   .../{remote => }/python_shell.py              |  0
>   .../remote_session/remote/__init__.py         | 27 ------
>   .../{remote => }/remote_session.py            |  0
>   .../{remote => }/ssh_session.py               |  0
>   .../{remote => }/testpmd_shell.py             |  0
>   dts/framework/settings.py                     | 92 +++++++++++--------
>   dts/framework/test_suite.py                   |  3 +-
>   dts/framework/testbed_model/__init__.py       | 12 +--
>   dts/framework/testbed_model/common.py         | 29 ++++++
>   dts/framework/testbed_model/{hw => }/cpu.py   | 13 +++
>   dts/framework/testbed_model/hw/__init__.py    | 27 ------
>   .../linux_session.py                          |  4 +-
>   dts/framework/testbed_model/node.py           | 22 ++++-
>   .../os_session.py                             | 14 +--
>   dts/framework/testbed_model/{hw => }/port.py  |  0
>   .../posix_session.py                          |  2 +-
>   dts/framework/testbed_model/sut_node.py       |  8 +-
>   dts/framework/testbed_model/tg_node.py        | 30 +-----
>   .../traffic_generator/__init__.py             | 24 +++++
>   .../capturing_traffic_generator.py            |  2 +-
>   .../{ => traffic_generator}/scapy.py          | 17 +---
>   .../traffic_generator.py                      | 16 +++-
>   .../testbed_model/{hw => }/virtual_device.py  |  0
>   dts/framework/utils.py                        | 53 +----------
>   dts/main.py                                   |  3 +-
>   30 files changed, 229 insertions(+), 247 deletions(-)
>   rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
>   rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
>   rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
>   delete mode 100644 dts/framework/remote_session/remote/__init__.py
>   rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
>   rename dts/framework/remote_session/{remote => }/ssh_session.py (100%)
>   rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
>   create mode 100644 dts/framework/testbed_model/common.py
>   rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
>   delete mode 100644 dts/framework/testbed_model/hw/__init__.py
>   rename dts/framework/{remote_session => testbed_model}/linux_session.py (98%)
>   rename dts/framework/{remote_session => testbed_model}/os_session.py (97%)
>   rename dts/framework/testbed_model/{hw => }/port.py (100%)
>   rename dts/framework/{remote_session => testbed_model}/posix_session.py (99%)
>   create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
>   rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (99%)
>   rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (96%)
>   rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
>   rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)
> 
> diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
> index cb7e00ba34..5de8b54bcf 100644
> --- a/dts/framework/config/__init__.py
> +++ b/dts/framework/config/__init__.py
> @@ -324,6 +324,3 @@ def load_config() -> Configuration:
>       config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
>       config_obj: Configuration = Configuration.from_dict(dict(config))
>       return config_obj
> -
> -
> -CONFIGURATION = load_config()
> diff --git a/dts/framework/dts.py b/dts/framework/dts.py
> index f773f0c38d..925a212210 100644
> --- a/dts/framework/dts.py
> +++ b/dts/framework/dts.py
> @@ -3,22 +3,23 @@
>   # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>   # Copyright(c) 2022-2023 University of New Hampshire
>   
> +import logging
>   import sys
>   
>   from .config import (
> -    CONFIGURATION,
>       BuildTargetConfiguration,
>       ExecutionConfiguration,
>       TestSuiteConfig,
> +    load_config,
>   )
>   from .exception import BlockingTestSuiteError
>   from .logger import DTSLOG, getLogger
>   from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
>   from .test_suite import get_test_suites
>   from .testbed_model import SutNode, TGNode
> -from .utils import check_dts_python_version
>   
> -dts_logger: DTSLOG = getLogger("DTSRunner")
> +# dummy defaults to satisfy linters
> +dts_logger: DTSLOG | logging.Logger = logging.getLogger("DTSRunner")
>   result: DTSResult = DTSResult(dts_logger)
>   
>   
> @@ -30,14 +31,18 @@ def run_all() -> None:
>       global dts_logger
>       global result
>   
> +    # create a regular DTS logger and create a new result with it
> +    dts_logger = getLogger("DTSRunner")
> +    result = DTSResult(dts_logger)
> +
>       # check the python version of the server that run dts
> -    check_dts_python_version()
> +    _check_dts_python_version()
>   
>       sut_nodes: dict[str, SutNode] = {}
>       tg_nodes: dict[str, TGNode] = {}
>       try:
>           # for all Execution sections
> -        for execution in CONFIGURATION.executions:
> +        for execution in load_config().executions:
>               sut_node = sut_nodes.get(execution.system_under_test_node.name)
>               tg_node = tg_nodes.get(execution.traffic_generator_node.name)
>   
> @@ -82,6 +87,25 @@ def run_all() -> None:
>       _exit_dts()
>   
>   
> +def _check_dts_python_version() -> None:
> +    def RED(text: str) -> str:
> +        return f"\u001B[31;1m{str(text)}\u001B[0m"
> +
> +    if sys.version_info.major < 3 or (
> +        sys.version_info.major == 3 and sys.version_info.minor < 10
> +    ):
> +        print(
> +            RED(
> +                (
> +                    "WARNING: DTS execution node's python version is lower than"
> +                    "python 3.10, is deprecated and will not work in future releases."
> +                )
> +            ),
> +            file=sys.stderr,
> +        )
> +        print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> +
> +
>   def _run_execution(
>       sut_node: SutNode,
>       tg_node: TGNode,
> diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
> index 00b6d1f03a..5e7ddb2b05 100644
> --- a/dts/framework/remote_session/__init__.py
> +++ b/dts/framework/remote_session/__init__.py
> @@ -12,29 +12,24 @@
>   
>   # pylama:ignore=W0611
>   
> -from framework.config import OS, NodeConfiguration
> -from framework.exception import ConfigurationError
> +from framework.config import NodeConfiguration
>   from framework.logger import DTSLOG
>   
> -from .linux_session import LinuxSession
> -from .os_session import InteractiveShellType, OSSession
> -from .remote import (
> -    CommandResult,
> -    InteractiveRemoteSession,
> -    InteractiveShell,
> -    PythonShell,
> -    RemoteSession,
> -    SSHSession,
> -    TestPmdDevice,
> -    TestPmdShell,
> -)
> -
> -
> -def create_session(
> +from .interactive_remote_session import InteractiveRemoteSession
> +from .interactive_shell import InteractiveShell
> +from .python_shell import PythonShell
> +from .remote_session import CommandResult, RemoteSession
> +from .ssh_session import SSHSession
> +from .testpmd_shell import TestPmdShell
> +
> +
> +def create_remote_session(
>       node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> OSSession:
> -    match node_config.os:
> -        case OS.linux:
> -            return LinuxSession(node_config, name, logger)
> -        case _:
> -            raise ConfigurationError(f"Unsupported OS {node_config.os}")
> +) -> RemoteSession:
> +    return SSHSession(node_config, name, logger)
> +
> +
> +def create_interactive_session(
> +    node_config: NodeConfiguration, logger: DTSLOG
> +) -> InteractiveRemoteSession:
> +    return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/interactive_remote_session.py
> rename to dts/framework/remote_session/interactive_remote_session.py
> diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/interactive_shell.py
> rename to dts/framework/remote_session/interactive_shell.py
> diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/python_shell.py
> rename to dts/framework/remote_session/python_shell.py
> diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
> deleted file mode 100644
> index 06403691a5..0000000000
> --- a/dts/framework/remote_session/remote/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -# Copyright(c) 2023 University of New Hampshire
> -
> -# pylama:ignore=W0611
> -
> -from framework.config import NodeConfiguration
> -from framework.logger import DTSLOG
> -
> -from .interactive_remote_session import InteractiveRemoteSession
> -from .interactive_shell import InteractiveShell
> -from .python_shell import PythonShell
> -from .remote_session import CommandResult, RemoteSession
> -from .ssh_session import SSHSession
> -from .testpmd_shell import TestPmdDevice, TestPmdShell
> -
> -
> -def create_remote_session(
> -    node_config: NodeConfiguration, name: str, logger: DTSLOG
> -) -> RemoteSession:
> -    return SSHSession(node_config, name, logger)
> -
> -
> -def create_interactive_session(
> -    node_config: NodeConfiguration, logger: DTSLOG
> -) -> InteractiveRemoteSession:
> -    return InteractiveRemoteSession(node_config, logger)
> diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/remote_session.py
> rename to dts/framework/remote_session/remote_session.py
> diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/ssh_session.py
> rename to dts/framework/remote_session/ssh_session.py
> diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
> similarity index 100%
> rename from dts/framework/remote_session/remote/testpmd_shell.py
> rename to dts/framework/remote_session/testpmd_shell.py
> diff --git a/dts/framework/settings.py b/dts/framework/settings.py
> index cfa39d011b..bf86861efb 100644
> --- a/dts/framework/settings.py
> +++ b/dts/framework/settings.py
> @@ -6,7 +6,7 @@
>   import argparse
>   import os
>   from collections.abc import Callable, Iterable, Sequence
> -from dataclasses import dataclass
> +from dataclasses import dataclass, field
>   from pathlib import Path
>   from typing import Any, TypeVar
>   
> @@ -22,7 +22,7 @@ def __init__(
>               option_strings: Sequence[str],
>               dest: str,
>               nargs: str | int | None = None,
> -            const: str | None = None,
> +            const: bool | None = None,
>               default: str = None,
>               type: Callable[[str], _T | argparse.FileType | None] = None,
>               choices: Iterable[_T] | None = None,
> @@ -32,6 +32,12 @@ def __init__(
>           ) -> None:
>               env_var_value = os.environ.get(env_var)
>               default = env_var_value or default
> +            if const is not None:
> +                nargs = 0
> +                default = const if env_var_value else default
> +                type = None
> +                choices = None
> +                metavar = None
>               super(_EnvironmentArgument, self).__init__(
>                   option_strings,
>                   dest,
> @@ -52,22 +58,28 @@ def __call__(
>               values: Any,
>               option_string: str = None,
>           ) -> None:
> -            setattr(namespace, self.dest, values)
> +            if self.const is not None:
> +                setattr(namespace, self.dest, self.const)
> +            else:
> +                setattr(namespace, self.dest, values)
>   
>       return _EnvironmentArgument
>   
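The custom action above wires environment variables into argparse defaults. A
minimal standalone sketch of the same idea (simplified to the
default-from-environment part only; `env_arg`, `_EnvArg` and the
`DTS_TIMEOUT` usage below are illustrative, not the full patch behavior):

```python
import argparse
import os


def env_arg(env_var: str) -> type[argparse.Action]:
    """Create an argparse Action whose default may come from env_var."""

    class _EnvArg(argparse.Action):
        def __init__(self, option_strings, dest, default=None, **kwargs):
            # An environment variable, when set, overrides the hard-coded
            # default, but an explicit CLI argument still wins.
            default = os.environ.get(env_var, default)
            super().__init__(option_strings, dest, default=default, **kwargs)

        def __call__(self, parser, namespace, values, option_string=None):
            setattr(namespace, self.dest, values)

    return _EnvArg


os.environ["DTS_TIMEOUT"] = "30"
parser = argparse.ArgumentParser()
parser.add_argument("--timeout", action=env_arg("DTS_TIMEOUT"), default="15")

print(parser.parse_args([]).timeout)                  # 30
print(parser.parse_args(["--timeout", "5"]).timeout)  # 5
```

The environment is read when the argument is registered, so the parser must be
built after the variable is set.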
>   
> -@dataclass(slots=True, frozen=True)
> +@dataclass(slots=True)
>   class _Settings:
> -    config_file_path: str
> -    output_dir: str
> -    timeout: float
> -    verbose: bool
> -    skip_setup: bool
> -    dpdk_tarball_path: Path
> -    compile_timeout: float
> -    test_cases: list
> -    re_run: int
> +    config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
> +    output_dir: str = "output"
> +    timeout: float = 15
> +    verbose: bool = False
> +    skip_setup: bool = False
> +    dpdk_tarball_path: Path | str = "dpdk.tar.xz"
> +    compile_timeout: float = 1200
> +    test_cases: list[str] = field(default_factory=list)
> +    re_run: int = 0
> +
> +
> +SETTINGS: _Settings = _Settings()
>   
>   
>   def _get_parser() -> argparse.ArgumentParser:
> @@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
>       parser.add_argument(
>           "--config-file",
>           action=_env_arg("DTS_CFG_FILE"),
> -        default="conf.yaml",
> +        default=SETTINGS.config_file_path,
> +        type=Path,
>           help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
>           "and targets.",
>       )
> @@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
>           "--output-dir",
>           "--output",
>           action=_env_arg("DTS_OUTPUT_DIR"),
> -        default="output",
> +        default=SETTINGS.output_dir,
>           help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
>       )
>   
> @@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
>           "-t",
>           "--timeout",
>           action=_env_arg("DTS_TIMEOUT"),
> -        default=15,
> +        default=SETTINGS.timeout,
>           type=float,
>           help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
>           "compiling DPDK.",
> @@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
>           "-v",
>           "--verbose",
>           action=_env_arg("DTS_VERBOSE"),
> -        default="N",
> -        help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
> +        default=SETTINGS.verbose,
> +        const=True,
> +        help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
>           "to the console.",
>       )
>   
> @@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
>           "-s",
>           "--skip-setup",
>           action=_env_arg("DTS_SKIP_SETUP"),
> -        default="N",
> -        help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
> +        const=True,
> +        help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
>       )
>   
>       parser.add_argument(
> @@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
>           "--snapshot",
>           "--git-ref",
>           action=_env_arg("DTS_DPDK_TARBALL"),
> -        default="dpdk.tar.xz",
> +        default=SETTINGS.dpdk_tarball_path,
>           type=Path,
>           help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
>           "tag ID or tree ID to test. To test local changes, first commit them, "
> @@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
>       parser.add_argument(
>           "--compile-timeout",
>           action=_env_arg("DTS_COMPILE_TIMEOUT"),
> -        default=1200,
> +        default=SETTINGS.compile_timeout,
>           type=float,
>           help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
>       )
> @@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
>           "--re-run",
>           "--re_run",
>           action=_env_arg("DTS_RERUN"),
> -        default=0,
> +        default=SETTINGS.re_run,
>           type=int,
>           help="[DTS_RERUN] Re-run each test case the specified amount of times "
>           "if a test failure occurs",
> @@ -162,23 +176,21 @@ def _get_parser() -> argparse.ArgumentParser:
>       return parser
>   
>   
> -def _get_settings() -> _Settings:
> +def set_settings() -> None:
>       parsed_args = _get_parser().parse_args()
> -    return _Settings(
> -        config_file_path=parsed_args.config_file,
> -        output_dir=parsed_args.output_dir,
> -        timeout=parsed_args.timeout,
> -        verbose=(parsed_args.verbose == "Y"),
> -        skip_setup=(parsed_args.skip_setup == "Y"),
> -        dpdk_tarball_path=Path(
> -            DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
> -        )
> +    global SETTINGS
> +    SETTINGS.config_file_path = parsed_args.config_file
> +    SETTINGS.output_dir = parsed_args.output_dir
> +    SETTINGS.timeout = parsed_args.timeout
> +    SETTINGS.verbose = parsed_args.verbose
> +    SETTINGS.skip_setup = parsed_args.skip_setup
> +    SETTINGS.dpdk_tarball_path = (
> +        Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
>           if not os.path.exists(parsed_args.tarball)
> -        else Path(parsed_args.tarball),
> -        compile_timeout=parsed_args.compile_timeout,
> -        test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
> -        re_run=parsed_args.re_run,
> +        else Path(parsed_args.tarball)
>       )
> -
> -
> -SETTINGS: _Settings = _get_settings()
> +    SETTINGS.compile_timeout = parsed_args.compile_timeout
> +    SETTINGS.test_cases = (
> +        parsed_args.test_cases.split(",") if parsed_args.test_cases else []
> +    )
> +    SETTINGS.re_run = parsed_args.re_run
> diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
> index 3b890c0451..b381990d98 100644
> --- a/dts/framework/test_suite.py
> +++ b/dts/framework/test_suite.py
> @@ -26,8 +26,7 @@
>   from .logger import DTSLOG, getLogger
>   from .settings import SETTINGS
>   from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
> -from .testbed_model import SutNode, TGNode
> -from .testbed_model.hw.port import Port, PortLink
> +from .testbed_model import Port, PortLink, SutNode, TGNode
>   from .utils import get_packet_summaries
>   
>   
> diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
> index 5cbb859e47..8ced05653b 100644
> --- a/dts/framework/testbed_model/__init__.py
> +++ b/dts/framework/testbed_model/__init__.py
> @@ -9,15 +9,9 @@
>   
>   # pylama:ignore=W0611
>   
> -from .hw import (
> -    LogicalCore,
> -    LogicalCoreCount,
> -    LogicalCoreCountFilter,
> -    LogicalCoreList,
> -    LogicalCoreListFilter,
> -    VirtualDevice,
> -    lcore_filter,
> -)
> +from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
>   from .node import Node
> +from .port import Port, PortLink
>   from .sut_node import SutNode
>   from .tg_node import TGNode
> +from .virtual_device import VirtualDevice
> diff --git a/dts/framework/testbed_model/common.py b/dts/framework/testbed_model/common.py
> new file mode 100644
> index 0000000000..9222f57847
> --- /dev/null
> +++ b/dts/framework/testbed_model/common.py
> @@ -0,0 +1,29 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +
> +class MesonArgs(object):
> +    """
> +    Aggregate the arguments needed to build DPDK:
> +    default_library: Default library type, Meson allows "shared", "static" and "both".
> +               Defaults to None, in which case the argument won't be used.
> +    Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
> +               Do not use -D with them, for example:
> +               meson_args = MesonArgs(enable_kmods=True).
> +    """
> +
> +    _default_library: str
> +
> +    def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
> +        self._default_library = (
> +            f"--default-library={default_library}" if default_library else ""
> +        )
> +        self._dpdk_args = " ".join(
> +            (
> +                f"-D{dpdk_arg_name}={dpdk_arg_value}"
> +                for dpdk_arg_name, dpdk_arg_value in dpdk_args.items()
> +            )
> +        )
> +
> +    def __str__(self) -> str:
> +        return " ".join(f"{self._default_library} {self._dpdk_args}".split())
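Re-typed for illustration, the class above converts its keyword arguments into
a single meson argument string like so (a sketch of the behavior shown in the
patch, not the installed module):

```python
from __future__ import annotations


class MesonArgs:
    """Aggregate DPDK meson build arguments into a single CLI string."""

    def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
        self._default_library = (
            f"--default-library={default_library}" if default_library else ""
        )
        self._dpdk_args = " ".join(
            f"-D{name}={value}" for name, value in dpdk_args.items()
        )

    def __str__(self) -> str:
        # split()/join() collapses the extra space left behind when
        # default_library is omitted.
        return " ".join(f"{self._default_library} {self._dpdk_args}".split())


print(MesonArgs(default_library="static", enable_kmods=True))
# --default-library=static -Denable_kmods=True
print(MesonArgs(buildtype="debug"))
# -Dbuildtype=debug
```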
> diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
> similarity index 95%
> rename from dts/framework/testbed_model/hw/cpu.py
> rename to dts/framework/testbed_model/cpu.py
> index d1918a12dc..8fe785dfe4 100644
> --- a/dts/framework/testbed_model/hw/cpu.py
> +++ b/dts/framework/testbed_model/cpu.py
> @@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
>               )
>   
>           return filtered_lcores
> +
> +
> +def lcore_filter(
> +    core_list: list[LogicalCore],
> +    filter_specifier: LogicalCoreCount | LogicalCoreList,
> +    ascending: bool,
> +) -> LogicalCoreFilter:
> +    if isinstance(filter_specifier, LogicalCoreList):
> +        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> +    elif isinstance(filter_specifier, LogicalCoreCount):
> +        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> +    else:
> +        raise ValueError(f"Unsupported filter {filter_specifier!r}")
> diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
> deleted file mode 100644
> index 88ccac0b0e..0000000000
> --- a/dts/framework/testbed_model/hw/__init__.py
> +++ /dev/null
> @@ -1,27 +0,0 @@
> -# SPDX-License-Identifier: BSD-3-Clause
> -# Copyright(c) 2023 PANTHEON.tech s.r.o.
> -
> -# pylama:ignore=W0611
> -
> -from .cpu import (
> -    LogicalCore,
> -    LogicalCoreCount,
> -    LogicalCoreCountFilter,
> -    LogicalCoreFilter,
> -    LogicalCoreList,
> -    LogicalCoreListFilter,
> -)
> -from .virtual_device import VirtualDevice
> -
> -
> -def lcore_filter(
> -    core_list: list[LogicalCore],
> -    filter_specifier: LogicalCoreCount | LogicalCoreList,
> -    ascending: bool,
> -) -> LogicalCoreFilter:
> -    if isinstance(filter_specifier, LogicalCoreList):
> -        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
> -    elif isinstance(filter_specifier, LogicalCoreCount):
> -        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
> -    else:
> -        raise ValueError(f"Unsupported filter r{filter_specifier}")
> diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
> similarity index 98%
> rename from dts/framework/remote_session/linux_session.py
> rename to dts/framework/testbed_model/linux_session.py
> index a3f1a6bf3b..7b60b5353f 100644
> --- a/dts/framework/remote_session/linux_session.py
> +++ b/dts/framework/testbed_model/linux_session.py
> @@ -9,10 +9,10 @@
>   from typing_extensions import NotRequired
>   
>   from framework.exception import RemoteCommandExecutionError
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
>   from framework.utils import expand_range
>   
> +from .cpu import LogicalCore
> +from .port import Port
>   from .posix_session import PosixSession
>   
>   
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index fc01e0bf8e..23efa79c50 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -12,23 +12,26 @@
>   from typing import Any, Callable, Type, Union
>   
>   from framework.config import (
> +    OS,
>       BuildTargetConfiguration,
>       ExecutionConfiguration,
>       NodeConfiguration,
>   )
> +from framework.exception import ConfigurationError
>   from framework.logger import DTSLOG, getLogger
> -from framework.remote_session import InteractiveShellType, OSSession, create_session
>   from framework.settings import SETTINGS
>   
> -from .hw import (
> +from .cpu import (
>       LogicalCore,
>       LogicalCoreCount,
>       LogicalCoreList,
>       LogicalCoreListFilter,
> -    VirtualDevice,
>       lcore_filter,
>   )
> -from .hw.port import Port
> +from .linux_session import LinuxSession
> +from .os_session import InteractiveShellType, OSSession
> +from .port import Port
> +from .virtual_device import VirtualDevice
>   
>   
>   class Node(ABC):
> @@ -69,6 +72,7 @@ def __init__(self, node_config: NodeConfiguration):
>       def _init_ports(self) -> None:
>           self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
>           self.main_session.update_ports(self.ports)
> +
>           for port in self.ports:
>               self.configure_port_state(port)
>   
> @@ -249,3 +253,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
>               return lambda *args: None
>           else:
>               return func
> +
> +
> +def create_session(
> +    node_config: NodeConfiguration, name: str, logger: DTSLOG
> +) -> OSSession:
> +    match node_config.os:
> +        case OS.linux:
> +            return LinuxSession(node_config, name, logger)
> +        case _:
> +            raise ConfigurationError(f"Unsupported OS {node_config.os}")
> diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
> similarity index 97%
> rename from dts/framework/remote_session/os_session.py
> rename to dts/framework/testbed_model/os_session.py
> index 8a709eac1c..19ba9a69d5 100644
> --- a/dts/framework/remote_session/os_session.py
> +++ b/dts/framework/testbed_model/os_session.py
> @@ -10,19 +10,19 @@
>   
>   from framework.config import Architecture, NodeConfiguration, NodeInfo
>   from framework.logger import DTSLOG
> -from framework.remote_session.remote import InteractiveShell
> -from framework.settings import SETTINGS
> -from framework.testbed_model import LogicalCore
> -from framework.testbed_model.hw.port import Port
> -from framework.utils import MesonArgs
> -
> -from .remote import (
> +from framework.remote_session import (
>       CommandResult,
>       InteractiveRemoteSession,
> +    InteractiveShell,
>       RemoteSession,
>       create_interactive_session,
>       create_remote_session,
>   )
> +from framework.settings import SETTINGS
> +
> +from .common import MesonArgs
> +from .cpu import LogicalCore
> +from .port import Port
>   
>   InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
>   
> diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/port.py
> rename to dts/framework/testbed_model/port.py
> diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
> similarity index 99%
> rename from dts/framework/remote_session/posix_session.py
> rename to dts/framework/testbed_model/posix_session.py
> index 5da0516e05..9a95baa353 100644
> --- a/dts/framework/remote_session/posix_session.py
> +++ b/dts/framework/testbed_model/posix_session.py
> @@ -9,8 +9,8 @@
>   from framework.config import Architecture, NodeInfo
>   from framework.exception import DPDKBuildError, RemoteCommandExecutionError
>   from framework.settings import SETTINGS
> -from framework.utils import MesonArgs
>   
> +from .common import MesonArgs
>   from .os_session import OSSession
>   
>   
> diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
> index 202aebfd06..2b7e104dcd 100644
> --- a/dts/framework/testbed_model/sut_node.py
> +++ b/dts/framework/testbed_model/sut_node.py
> @@ -15,12 +15,14 @@
>       NodeInfo,
>       SutNodeConfiguration,
>   )
> -from framework.remote_session import CommandResult, InteractiveShellType, OSSession
> +from framework.remote_session import CommandResult
>   from framework.settings import SETTINGS
> -from framework.utils import MesonArgs
>   
> -from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
> +from .common import MesonArgs
> +from .cpu import LogicalCoreCount, LogicalCoreList
>   from .node import Node
> +from .os_session import InteractiveShellType, OSSession
> +from .virtual_device import VirtualDevice
>   
>   
>   class EalParameters(object):
> diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
> index 27025cfa31..166eb8430e 100644
> --- a/dts/framework/testbed_model/tg_node.py
> +++ b/dts/framework/testbed_model/tg_node.py
> @@ -16,16 +16,11 @@
>   
>   from scapy.packet import Packet  # type: ignore[import]
>   
> -from framework.config import (
> -    ScapyTrafficGeneratorConfig,
> -    TGNodeConfiguration,
> -    TrafficGeneratorType,
> -)
> -from framework.exception import ConfigurationError
> -
> -from .capturing_traffic_generator import CapturingTrafficGenerator
> -from .hw.port import Port
> +from framework.config import TGNodeConfiguration
> +
>   from .node import Node
> +from .port import Port
> +from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
>   
>   
>   class TGNode(Node):
> @@ -80,20 +75,3 @@ def close(self) -> None:
>           """Free all resources used by the node"""
>           self.traffic_generator.close()
>           super(TGNode, self).close()
> -
> -
> -def create_traffic_generator(
> -    tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
> -) -> CapturingTrafficGenerator:
> -    """A factory function for creating traffic generator object from user config."""
> -
> -    from .scapy import ScapyTrafficGenerator
> -
> -    match traffic_generator_config.traffic_generator_type:
> -        case TrafficGeneratorType.SCAPY:
> -            return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> -        case _:
> -            raise ConfigurationError(
> -                "Unknown traffic generator: "
> -                f"{traffic_generator_config.traffic_generator_type}"
> -            )
> diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
> new file mode 100644
> index 0000000000..11bfa1ee0f
> --- /dev/null
> +++ b/dts/framework/testbed_model/traffic_generator/__init__.py
> @@ -0,0 +1,24 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
> +from framework.exception import ConfigurationError
> +from framework.testbed_model.node import Node
> +
> +from .capturing_traffic_generator import CapturingTrafficGenerator
> +from .scapy import ScapyTrafficGenerator
> +
> +
> +def create_traffic_generator(
> +    tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
> +) -> CapturingTrafficGenerator:
> +    """A factory function for creating traffic generator object from user config."""
> +
> +    match traffic_generator_config.traffic_generator_type:
> +        case TrafficGeneratorType.SCAPY:
> +            return ScapyTrafficGenerator(tg_node, traffic_generator_config)
> +        case _:
> +            raise ConfigurationError(
> +                "Unknown traffic generator: "
> +                f"{traffic_generator_config.traffic_generator_type}"
> +            )
> diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> similarity index 99%
> rename from dts/framework/testbed_model/capturing_traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> index ab98987f8e..765378fb4a 100644
> --- a/dts/framework/testbed_model/capturing_traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
> @@ -16,9 +16,9 @@
>   from scapy.packet import Packet  # type: ignore[import]
>   
>   from framework.settings import SETTINGS
> +from framework.testbed_model.port import Port
>   from framework.utils import get_packet_summaries
>   
> -from .hw.port import Port
>   from .traffic_generator import TrafficGenerator
>   
>   
> diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
> similarity index 96%
> rename from dts/framework/testbed_model/scapy.py
> rename to dts/framework/testbed_model/traffic_generator/scapy.py
> index af0d4dbb25..395d7e9bc0 100644
> --- a/dts/framework/testbed_model/scapy.py
> +++ b/dts/framework/testbed_model/traffic_generator/scapy.py
> @@ -24,16 +24,15 @@
>   from scapy.packet import Packet  # type: ignore[import]
>   
>   from framework.config import OS, ScapyTrafficGeneratorConfig
> -from framework.logger import DTSLOG, getLogger
>   from framework.remote_session import PythonShell
>   from framework.settings import SETTINGS
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
>   
>   from .capturing_traffic_generator import (
>       CapturingTrafficGenerator,
>       _get_default_capture_name,
>   )
> -from .hw.port import Port
> -from .tg_node import TGNode
>   
>   """
>   ========= BEGIN RPC FUNCTIONS =========
> @@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
>       session: PythonShell
>       rpc_server_proxy: xmlrpc.client.ServerProxy
>       _config: ScapyTrafficGeneratorConfig
> -    _tg_node: TGNode
> -    _logger: DTSLOG
> -
> -    def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
> -        self._config = config
> -        self._tg_node = tg_node
> -        self._logger = getLogger(
> -            f"{self._tg_node.name} {self._config.traffic_generator_type}"
> -        )
> +
> +    def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
> +        super().__init__(tg_node, config)
>   
>           assert (
>               self._tg_node.config.os == OS.linux
> diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> similarity index 80%
> rename from dts/framework/testbed_model/traffic_generator.py
> rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
> index 28c35d3ce4..ea7c3963da 100644
> --- a/dts/framework/testbed_model/traffic_generator.py
> +++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
> @@ -12,11 +12,12 @@
>   
>   from scapy.packet import Packet  # type: ignore[import]
>   
> -from framework.logger import DTSLOG
> +from framework.config import TrafficGeneratorConfig
> +from framework.logger import DTSLOG, getLogger
> +from framework.testbed_model.node import Node
> +from framework.testbed_model.port import Port
>   from framework.utils import get_packet_summaries
>   
> -from .hw.port import Port
> -
>   
>   class TrafficGenerator(ABC):
>       """The base traffic generator.
> @@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
>       Defines the few basic methods that each traffic generator must implement.
>       """
>   
> +    _config: TrafficGeneratorConfig
> +    _tg_node: Node
>       _logger: DTSLOG
>   
> +    def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
> +        self._config = config
> +        self._tg_node = tg_node
> +        self._logger = getLogger(
> +            f"{self._tg_node.name} {self._config.traffic_generator_type}"
> +        )
> +
>       def send_packet(self, packet: Packet, port: Port) -> None:
>           """Send a packet and block until it is fully sent.
>   
> diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
> similarity index 100%
> rename from dts/framework/testbed_model/hw/virtual_device.py
> rename to dts/framework/testbed_model/virtual_device.py
> diff --git a/dts/framework/utils.py b/dts/framework/utils.py
> index d27c2c5b5f..07e2d1c076 100644
> --- a/dts/framework/utils.py
> +++ b/dts/framework/utils.py
> @@ -7,7 +7,6 @@
>   import json
>   import os
>   import subprocess
> -import sys
>   from enum import Enum
>   from pathlib import Path
>   from subprocess import SubprocessError
> @@ -16,6 +15,8 @@
>   
>   from .exception import ConfigurationError
>   
> +REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> +
>   
>   class StrEnum(Enum):
>       @staticmethod
> @@ -28,25 +29,6 @@ def __str__(self) -> str:
>           return self.name
>   
>   
> -REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
> -
> -
> -def check_dts_python_version() -> None:
> -    if sys.version_info.major < 3 or (
> -        sys.version_info.major == 3 and sys.version_info.minor < 10
> -    ):
> -        print(
> -            RED(
> -                (
> -                    "WARNING: DTS execution node's python version is lower than"
> -                    "python 3.10, is deprecated and will not work in future releases."
> -                )
> -            ),
> -            file=sys.stderr,
> -        )
> -        print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
> -
> -
>   def expand_range(range_str: str) -> list[int]:
>       """
>       Process range string into a list of integers. There are two possible formats:
> @@ -77,37 +59,6 @@ def get_packet_summaries(packets: list[Packet]):
>       return f"Packet contents: \n{packet_summaries}"
>   
>   
> -def RED(text: str) -> str:
> -    return f"\u001B[31;1m{str(text)}\u001B[0m"
> -
> -
> -class MesonArgs(object):
> -    """
> -    Aggregate the arguments needed to build DPDK:
> -    default_library: Default library type, Meson allows "shared", "static" and "both".
> -               Defaults to None, in which case the argument won't be used.
> -    Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
> -               Do not use -D with them, for example:
> -               meson_args = MesonArgs(enable_kmods=True).
> -    """
> -
> -    _default_library: str
> -
> -    def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
> -        self._default_library = (
> -            f"--default-library={default_library}" if default_library else ""
> -        )
> -        self._dpdk_args = " ".join(
> -            (
> -                f"-D{dpdk_arg_name}={dpdk_arg_value}"
> -                for dpdk_arg_name, dpdk_arg_value in dpdk_args.items()
> -            )
> -        )
> -
> -    def __str__(self) -> str:
> -        return " ".join(f"{self._default_library} {self._dpdk_args}".split())
> -
> -
>   class _TarCompressionFormat(StrEnum):
>       """Compression formats that tar can use.
>   
> diff --git a/dts/main.py b/dts/main.py
> index 43311fa847..060ff1b19a 100755
> --- a/dts/main.py
> +++ b/dts/main.py
> @@ -10,10 +10,11 @@
>   
>   import logging
>   
> -from framework import dts
> +from framework import dts, settings
>   
>   
>   def main() -> None:
> +    settings.set_settings()
>       dts.run_all()
>   
>   

My only nitpick comment would be on the name of the file common.py that 
only contains the MesonArgs class. Looks good otherwise
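
For reference, here is a minimal runnable sketch of the class under discussion, reproduced in simplified form from the removed utils.py hunk above (not the full DTS framework):

```python
# Simplified reproduction of the MesonArgs class from the patch above.
# It aggregates the arguments needed to build DPDK into a meson command
# line fragment.
class MesonArgs:
    def __init__(self, default_library=None, **dpdk_args):
        # "shared", "static" or "both"; omitted entirely when None
        self._default_library = (
            f"--default-library={default_library}" if default_library else ""
        )
        # keyword arguments map to -D options from meson_options.txt
        self._dpdk_args = " ".join(
            f"-D{name}={value}" for name, value in dpdk_args.items()
        )

    def __str__(self) -> str:
        # split/join collapses the double space left when one part is empty
        return " ".join(f"{self._default_library} {self._dpdk_args}".split())


print(MesonArgs(default_library="static", enable_kmods=True))
# -> --default-library=static -Denable_kmods=True
```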

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 1/4] dts: code adjustments for sphinx
  2023-10-22 14:30         ` Yoan Picchi
@ 2023-10-23  6:44           ` Juraj Linkeš
  2023-10-23 11:52             ` Yoan Picchi
  0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-10-23  6:44 UTC (permalink / raw)
  To: Yoan Picchi
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb, dev

<snip>

>
> My only nitpick comment would be on the name of the file common.py that
> only contains the MesonArgs class. Looks good otherwise

Could you elaborate a bit more, Yoan? The common.py module is supposed
to be extended with code common to all other modules in the
testbed_model package. Right now we only have MesonArgs which fits in
common.py, but we could also move something else into common.py. We
also could rename common.py to something else, but then the above
purpose would not be clear.

I'm finishing the docstrings soon so expect a new version where things
like these will be clearer. :-)

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 1/4] dts: code adjustments for sphinx
  2023-10-23  6:44           ` Juraj Linkeš
@ 2023-10-23 11:52             ` Yoan Picchi
  2023-10-24  6:39               ` Juraj Linkeš
  0 siblings, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-10-23 11:52 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb, dev

On 10/23/23 07:44, Juraj Linkeš wrote:
> <snip>
> 
>>
>> My only nitpick comment would be on the name of the file common.py that
>> only contains the MesonArgs class. Looks good otherwise
> 
> Could you elaborate a bit more, Yoan? The common.py module is supposed
> to be extended with code common to all other modules in the
> testbed_model package. Right now we only have MesonArgs which fits in
> common.py, but we could also move something else into common.py. We
> also could rename common.py to something else, but then the above
> purpose would not be clear.
> 
> I'm finishing the docstrings soon so expect a new version where things
> like these will be clearer. :-)

My issue with the name is that it isn't clear what is the purpose of 
this file. It only tells to some extent how it is used.
If we start adding more things in this file, then I see two options:
	- Either this is related to the current class, and thus the file could 
be named meson_arg or something along those lines.
	- Or it is unrelated to the current class, and we end up with a file 
coalescing random bits of code, which is usually a bit dirty in OOP.

Like I said, it's a bit of a nitpick, but given it's an RFC I hope 
you'll give it a thought in the next version.

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 1/4] dts: code adjustments for sphinx
  2023-10-23 11:52             ` Yoan Picchi
@ 2023-10-24  6:39               ` Juraj Linkeš
  2023-10-24 12:21                 ` Yoan Picchi
  0 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-10-24  6:39 UTC (permalink / raw)
  To: Yoan Picchi
  Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb, dev

On Mon, Oct 23, 2023 at 1:52 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 10/23/23 07:44, Juraj Linkeš wrote:
> > <snip>
> >
> >>
> >> My only nitpick comment would be on the name of the file common.py that
> >> only contains the MesonArgs class. Looks good otherwise
> >
> > Could you elaborate a bit more, Yoan? The common.py module is supposed
> > to be extended with code common to all other modules in the
> > testbed_model package. Right now we only have MesonArgs which fits in
> > common.py, but we could also move something else into common.py. We
> > also could rename common.py to something else, but then the above
> > purpose would not be clear.
> >
> > I'm finishing the docstrings soon so expect a new version where things
> > like these will be clearer. :-)
>
> My issue with the name is that it isn't clear what is the purpose of
> this file. It only tells to some extent how it is used.

Well, the name suggests it's code that's common to other modules, as
in code that other modules use, just like the MesonArgs class, which
is used in three different modules. I've chosen common.py as that's
what some of the DPDK libs (such as EAL) seem to be using for this
purpose. Maybe there's a better name though or we could move the class
elsewhere.

> If we start adding more things in this file, then I see two options:
>         - Either this is related to the current class, and thus the file could
> be named meson_arg or something along those lines.
>         - Or it is unrelated to the current class, and we end up with a file
> coalescing random bits of code, which is usually a bit dirty in OOP.
>

It's code that's reused in multiple places; I'm not sure whether that
qualifies as random bits of code. It could be in os_session.py (as
that would work import-wise), but that's not a good place to put it,
as the logic is actually utilized in sut_node.py. But putting it into
sut_node.py doesn't work because of imports. Maybe we could just put
it into utils.py in the framework dir, which is a very similar file,
if not the same. My original thoughts were to have a file with common
code in each package (if needed), depending on where the code is used
(package level-wise), but it looks like we can just keep this sort of
common utility at the top level.
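
The import constraint described here can be demonstrated with a self-contained sketch. The module and class names below are simplified stand-ins for the files being discussed, not the actual DTS layout:

```python
# Two tiny modules written to a temp dir to show why MesonArgs cannot
# live in sut_node: os_session would need to import it back from
# sut_node, closing an import cycle.
import os
import sys
import tempfile
import textwrap

tmp = tempfile.mkdtemp()

with open(os.path.join(tmp, "sut_node.py"), "w") as f:
    f.write(textwrap.dedent("""
        from os_session import OSSession  # sut_node builds on os_session

        class MesonArgs:  # hypothetical placement being argued against
            pass
    """))

with open(os.path.join(tmp, "os_session.py"), "w") as f:
    f.write(textwrap.dedent("""
        from sut_node import MesonArgs  # ...which closes the cycle

        class OSSession:
            pass
    """))

sys.path.insert(0, tmp)
try:
    import sut_node  # noqa: F401
except ImportError as exc:
    # Fails because sut_node is only partially initialized when
    # os_session tries to import MesonArgs back out of it.
    print("circular import:", exc)
```

Moving the shared class to a leaf module (utils.py at the top level) breaks the cycle, since both modules can then import it without importing each other.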

> Like I said, it's a bit of a nitpick, but given it's an RFC I hope
> you'll give it a thought in the next version.

I thought a lot about this before submitting this RFC, but I wanted
someone to have a look at this exact thing - whether the common.py
file makes sense and what is the better name, common.py or utils.py
(which is why I have both in this patch). I'll move the MesonArgs
class to the top level utils.py and remove the common.py file as that
makes the most sense to me now.

If you have any recommendations we may be able to make this even better.

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 1/4] dts: code adjustments for sphinx
  2023-10-24  6:39               ` Juraj Linkeš
@ 2023-10-24 12:21                 ` Yoan Picchi
  0 siblings, 0 replies; 393+ messages in thread
From: Yoan Picchi @ 2023-10-24 12:21 UTC (permalink / raw)
  To: Juraj Linkeš
  Cc: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb, dev

On 10/24/23 07:39, Juraj Linkeš wrote:
> On Mon, Oct 23, 2023 at 1:52 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>>
>> On 10/23/23 07:44, Juraj Linkeš wrote:
>>> <snip>
>>>
>>>>
>>>> My only nitpick comment would be on the name of the file common.py that
>>>> only contains the MesonArgs class. Looks good otherwise
>>>
>>> Could you elaborate a bit more, Yoan? The common.py module is supposed
>>> to be extended with code common to all other modules in the
>>> testbed_model package. Right now we only have MesonArgs which fits in
>>> common.py, but we could also move something else into common.py. We
>>> also could rename common.py to something else, but then the above
>>> purpose would not be clear.
>>>
>>> I'm finishing the docstrings soon so expect a new version where things
>>> like these will be clearer. :-)
>>
>> My issue with the name is that it isn't clear what is the purpose of
>> this file. It only tells to some extent how it is used.
> 
> Well, the name suggests it's code that's common to other modules, as
> in code that other modules use, just like the MesonArgs class, which
> is used in three different modules. I've chosen common.py as that's
> what some of the DPDK libs (such as EAL) seem to be using for this
> purpose. Maybe there's a better name though or we could move the class
> elsewhere.
> 
>> If we start adding more things in this file, then I see two options:
>>          - Either this is related to the current class, and thus the file could
>> be named meson_arg or something along those lines.
>>          - Or it is unrelated to the current class, and we end up with a file
>> coalescing random bits of code, which is usually a bit dirty in OOP.
>>
> 
> It's code that's reused in multiple places, I'm not sure whether that
> qualifies as random bits of code. It could be in os_session.py (as
> that would work import-wise), but that's not a good place to put it,
> as the logic is actually utilized in sut_node.py. But putting it into
> sut_node.py doesn't work because of imports. Maybe we could just put
> it into utils.py in the framework dir, which is a very similar file,
> if not the same. My original thoughts were to have a file with common
> code in each package (if needed), depending on where the code is used
> (package level-wise), but it looks like we can just have this sort of
> common utilities on the top level.
> 
>> Like I said, it's a bit of a nitpick, but given it's an RFC I hope
>> you'll give it a thought in the next version.
> 
> I thought a lot about this before submitting this RFC, but I wanted
> someone to have a look at this exact thing - whether the common.py
> file makes sense and what is the better name, common.py or utils.py
> (which is why I have both in this patch). I'll move the MesonArgs
> class to the top level utils.py and remove the common.py file as that
> makes the most sense to me now.
> 
> If you have any recommendations we may be able to make this even better.

I didn't mean to imply you did not think a lot about it, sorry if it
came across that way.
I prefer the idea of utils.py to a common.py, be it at package level or 
above. There might also be the option of __init__.py but I'm not sure 
about it.
That being said, I'm relatively new to DPDK and didn't know common.py
was a common thing in EAL, so I'll leave it up to you.

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 3/4] dts: add doc generation
  2023-08-31 10:04       ` [RFC PATCH v4 3/4] dts: add doc generation Juraj Linkeš
  2023-09-20  7:08         ` Juraj Linkeš
@ 2023-10-26 16:43         ` Yoan Picchi
  2023-10-27  9:52           ` Juraj Linkeš
  1 sibling, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-10-26 16:43 UTC (permalink / raw)
  To: Juraj Linkeš,
	thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev

On 8/31/23 11:04, Juraj Linkeš wrote:
> The tool used to generate developer docs is sphinx, which is already
> used in DPDK. The configuration is kept the same to preserve the style.
> 
> Sphinx generates the documentation from Python docstrings. The docstring
> format most suitable for DTS seems to be the Google format [0] which
> requires the sphinx.ext.napoleon extension.
> 
> There are two requirements for building DTS docs:
> * The same Python version as DTS or higher, because Sphinx imports the
>    code.
> * Also the same Python packages as DTS, for the same reason.
> 
> [0] https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings
> 
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
>   buildtools/call-sphinx-build.py | 29 ++++++++++++-------
>   doc/api/meson.build             |  1 +
>   doc/guides/conf.py              | 32 +++++++++++++++++----
>   doc/guides/meson.build          |  1 +
>   doc/guides/tools/dts.rst        | 29 +++++++++++++++++++
>   dts/doc/doc-index.rst           | 17 +++++++++++
>   dts/doc/meson.build             | 50 +++++++++++++++++++++++++++++++++
>   dts/meson.build                 | 16 +++++++++++
>   meson.build                     |  1 +
>   9 files changed, 161 insertions(+), 15 deletions(-)
>   create mode 100644 dts/doc/doc-index.rst
>   create mode 100644 dts/doc/meson.build
>   create mode 100644 dts/meson.build
> 
> diff --git a/buildtools/call-sphinx-build.py b/buildtools/call-sphinx-build.py
> index 39a60d09fa..c2f3acfb1d 100755
> --- a/buildtools/call-sphinx-build.py
> +++ b/buildtools/call-sphinx-build.py
> @@ -3,37 +3,46 @@
>   # Copyright(c) 2019 Intel Corporation
>   #
>   
> +import argparse
>   import sys
>   import os
>   from os.path import join
>   from subprocess import run, PIPE, STDOUT
>   from packaging.version import Version
>   
> -# assign parameters to variables
> -(sphinx, version, src, dst, *extra_args) = sys.argv[1:]
> +parser = argparse.ArgumentParser()
> +parser.add_argument('sphinx')
> +parser.add_argument('version')
> +parser.add_argument('src')
> +parser.add_argument('dst')
> +parser.add_argument('--dts-root', default='.')
> +args, extra_args = parser.parse_known_args()
>   
>   # set the version in environment for sphinx to pick up
> -os.environ['DPDK_VERSION'] = version
> +os.environ['DPDK_VERSION'] = args.version
> +os.environ['DTS_ROOT'] = args.dts_root
>   
>   # for sphinx version >= 1.7 add parallelism using "-j auto"
> -ver = run([sphinx, '--version'], stdout=PIPE,
> +ver = run([args.sphinx, '--version'], stdout=PIPE,
>             stderr=STDOUT).stdout.decode().split()[-1]
> -sphinx_cmd = [sphinx] + extra_args
> +sphinx_cmd = [args.sphinx] + extra_args
>   if Version(ver) >= Version('1.7'):
>       sphinx_cmd += ['-j', 'auto']
>   
>   # find all the files sphinx will process so we can write them as dependencies
>   srcfiles = []
> -for root, dirs, files in os.walk(src):
> +for root, dirs, files in os.walk(args.src):
>       srcfiles.extend([join(root, f) for f in files])
>   
>   # run sphinx, putting the html output in a "html" directory
> -with open(join(dst, 'sphinx_html.out'), 'w') as out:
> -    process = run(sphinx_cmd + ['-b', 'html', src, join(dst, 'html')],
> -                  stdout=out)
> +with open(join(args.dst, 'sphinx_html.out'), 'w') as out:
> +    process = run(
> +        sphinx_cmd + ['-b', 'html', args.src, join(args.dst, 'html')],
> +        stdout=out
> +    )
>   
>   # create a gcc format .d file giving all the dependencies of this doc build
> -with open(join(dst, '.html.d'), 'w') as d:
> +with open(join(args.dst, '.html.d'), 'w') as d:
>       d.write('html: ' + ' '.join(srcfiles) + '\n')
>   
>   sys.exit(process.returncode)
> diff --git a/doc/api/meson.build b/doc/api/meson.build
> index 2876a78a7e..1f0c725a94 100644
> --- a/doc/api/meson.build
> +++ b/doc/api/meson.build
> @@ -1,6 +1,7 @@
>   # SPDX-License-Identifier: BSD-3-Clause
>   # Copyright(c) 2018 Luca Boccassi <bluca@debian.org>
>   
> +doc_api_build_dir = meson.current_build_dir()
>   doxygen = find_program('doxygen', required: get_option('enable_docs'))
>   
>   if not doxygen.found()
> diff --git a/doc/guides/conf.py b/doc/guides/conf.py
> index 0f7ff5282d..737e5a5688 100644
> --- a/doc/guides/conf.py
> +++ b/doc/guides/conf.py
> @@ -7,10 +7,9 @@
>   from sphinx import __version__ as sphinx_version
>   from os import listdir
>   from os import environ
> -from os.path import basename
> -from os.path import dirname
> +from os.path import basename, dirname
>   from os.path import join as path_join
> -from sys import argv, stderr
> +from sys import argv, stderr, path
>   
>   import configparser
>   
> @@ -24,6 +23,29 @@
>             file=stderr)
>       pass
>   
> +extensions = ['sphinx.ext.napoleon']
> +
> +# Python docstring options
> +autodoc_default_options = {
> +    'members': True,
> +    'member-order': 'bysource',
> +    'show-inheritance': True,
> +}
> +autodoc_typehints = 'both'
> +autodoc_typehints_format = 'short'
> +napoleon_numpy_docstring = False
> +napoleon_attr_annotations = True
> +napoleon_use_ivar = True
> +napoleon_use_rtype = False
> +add_module_names = False
> +toc_object_entries = False
> +
> +# Sidebar config
> +html_theme_options = {
> +    'collapse_navigation': False,
> +    'navigation_depth': -1,
> +}
> +
>   stop_on_error = ('-W' in argv)
>   
>   project = 'Data Plane Development Kit'
> @@ -35,8 +57,8 @@
>   html_show_copyright = False
>   highlight_language = 'none'
>   
> -release = environ.setdefault('DPDK_VERSION', "None")
> -version = release
> +path.append(environ.get('DTS_ROOT'))
> +version = environ.setdefault('DPDK_VERSION', "None")
>   
>   master_doc = 'index'
>   
> diff --git a/doc/guides/meson.build b/doc/guides/meson.build
> index 51f81da2e3..8933d75f6b 100644
> --- a/doc/guides/meson.build
> +++ b/doc/guides/meson.build
> @@ -1,6 +1,7 @@
>   # SPDX-License-Identifier: BSD-3-Clause
>   # Copyright(c) 2018 Intel Corporation
>   
> +doc_guides_source_dir = meson.current_source_dir()
>   sphinx = find_program('sphinx-build', required: get_option('enable_docs'))
>   
>   if not sphinx.found()
> diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> index 32c18ee472..98923b1467 100644
> --- a/doc/guides/tools/dts.rst
> +++ b/doc/guides/tools/dts.rst
> @@ -335,3 +335,32 @@ There are three tools used in DTS to help with code checking, style and formatti
>   These three tools are all used in ``devtools/dts-check-format.sh``,
>   the DTS code check and format script.
>   Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
> +
> +
> +Building DTS API docs
> +---------------------
> +
> +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
> +
> +   .. code-block:: console

I believe the code-block line should not be indented. As it is, it did 
not generate properly when I built the doc. Same for the other code 
block below.
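
To illustrate, the un-indented form that generated correctly for me would look like this (a sketch against the snippet above):

```rst
To build DTS API docs, install the dependencies with Poetry, then enter its shell:

.. code-block:: console

   poetry install --with docs
   poetry shell
```

With the directive flush left, only its content keeps the three-space indent that reST expects.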

> +
> +   poetry install --with docs
> +   poetry shell
> +
> +
> +Build commands
> +~~~~~~~~~~~~~~
> +
> +The documentation is built using the standard DPDK build system.
> +
> +After entering Poetry's shell, build the documentation with:
> +
> +   .. code-block:: console
> +
> +   ninja -C build dts/doc
> +
> +The output is generated in ``build/doc/api/dts/html``.
> +
> +.. Note::
> +
> +   Make sure to fix any Sphinx warnings when adding or updating docstrings.
> diff --git a/dts/doc/doc-index.rst b/dts/doc/doc-index.rst
> new file mode 100644
> index 0000000000..f5dcd553f2
> --- /dev/null
> +++ b/dts/doc/doc-index.rst
> @@ -0,0 +1,17 @@
> +.. DPDK Test Suite documentation.
> +
> +Welcome to DPDK Test Suite's documentation!
> +===========================================
> +
> +.. toctree::
> +   :titlesonly:
> +   :caption: Contents:
> +
> +   framework
> +
> +Indices and tables
> +==================
> +
> +* :ref:`genindex`
> +* :ref:`modindex`
> +* :ref:`search`
> diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> new file mode 100644
> index 0000000000..8e70eabc51
> --- /dev/null
> +++ b/dts/doc/meson.build
> @@ -0,0 +1,50 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +sphinx = find_program('sphinx-build')
> +sphinx_apidoc = find_program('sphinx-apidoc')
> +
> +if not sphinx.found() or not sphinx_apidoc.found()
> +    subdir_done()
> +endif
> +
> +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> +dts_api_src = custom_target('dts_api_src',
> +        output: 'modules.rst',
> +        command: [sphinx_apidoc, '--append-syspath', '--force',
> +            '--module-first', '--separate', '-V', meson.project_version(),
> +            '-o', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
> +            dts_api_framework_dir],
> +        build_by_default: false)
> +doc_targets += dts_api_src
> +doc_target_names += 'DTS_API_sphinx_sources'
> +
> +cp = find_program('cp')
> +cp_index = custom_target('cp_index',
> +        input: 'doc-index.rst',
> +        output: 'index.rst',
> +        depends: dts_api_src,
> +        command: [cp, '@INPUT@', join_paths(dts_api_build_dir, 'index.rst')],
> +        build_by_default: false)
> +doc_targets += cp_index
> +doc_target_names += 'DTS_API_sphinx_index'
> +
> +extra_sphinx_args = ['-a', '-c', doc_guides_source_dir]
> +if get_option('werror')
> +    extra_sphinx_args += '-W'
> +endif
> +
> +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
> +dts_api_html = custom_target('dts_api_html',
> +        output: 'html',
> +        depends: cp_index,
> +        command: ['DTS_ROOT=@0@'.format(dts_dir),
> +            sphinx_wrapper, sphinx, meson.project_version(),
> +            dts_api_build_dir, dts_api_build_dir,
> +            '--dts-root', dts_dir, extra_sphinx_args],
> +        build_by_default: false,
> +        install: false,
> +        install_dir: htmldir)

I don't entirely understand what this command does. I suspect it's the 
one that creates and populates the html directory in build/doc/api/dts/.
The main thing that confuses me is this htmldir. Mine seemed to point to 
share/doc/dpdk. Such a directory doesn't seem to be built (or I could 
not find where). That might be intended, given install is set to false. 
But in that case, why bother setting it?
The thing that worries me more is that dpdk/doc/guides/meson.build does 
build some html and has the same htmldir. If we were to enable the 
install in both, wouldn't the html files overwrite each other (in 
particular index.html)?
If my understanding is correct, I believe htmldir should either be 
removed or set to a different value (<datadir>/doc/dpdk/dts ?).
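
If the install were ever enabled, the clash could be avoided with a dts-specific subdirectory, i.e. a sketch of the value suggested above (untested):

```meson
# Hypothetical: install DTS html under its own subdirectory so it
# cannot collide with the guides' output (notably index.html).
htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk', 'dts')
```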

> +doc_targets += dts_api_html
> +doc_target_names += 'DTS_API_HTML'
> diff --git a/dts/meson.build b/dts/meson.build
> new file mode 100644
> index 0000000000..17bda07636
> --- /dev/null
> +++ b/dts/meson.build
> @@ -0,0 +1,16 @@
> +# SPDX-License-Identifier: BSD-3-Clause
> +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> +
> +doc_targets = []
> +doc_target_names = []
> +dts_dir = meson.current_source_dir()
> +
> +subdir('doc')
> +
> +if doc_targets.length() == 0
> +    message = 'No docs targets found'
> +else
> +    message = 'Built docs:'
> +endif
> +run_target('dts/doc', command: [echo, message, doc_target_names],
> +    depends: doc_targets)
> diff --git a/meson.build b/meson.build
> index 39cb73846d..4d34dc531c 100644
> --- a/meson.build
> +++ b/meson.build
> @@ -85,6 +85,7 @@ subdir('app')
>   
>   # build docs
>   subdir('doc')
> +subdir('dts')
>   
>   # build any examples explicitly requested - useful for developers - and
>   # install any example code into the appropriate install path

When building from scratch, I had several warnings/errors.

$ meson build
[usual meson output block...]
WARNING: Target "dts/doc" has a path separator in its name.
This is not supported, it can cause unexpected failures and will become
a hard error in the future.
Configuring rte_build_config.h using configuration
Message:
=================
Applications Enabled
=================
[...]

If you change the target name, remember to change it in the doc too.



$ ninja -C build dts/doc
[create all the rst...]
[3/4] Generating dts_api_html with a custom command.
dpdk/dts/framework/remote_session/interactive_shell.py:docstring of 
framework.remote_session.interactive_shell.InteractiveShell:25: WARNING: 
Definition list ends without a blank line; unexpected unindent.

dpdk/dts/framework/remote_session/interactive_shell.py:docstring of 
framework.remote_session.interactive_shell.InteractiveShell:27: ERROR: 
Unexpected indentation.

dpdk/dts/framework/testbed_model/common.py:docstring of 
framework.testbed_model.common.MesonArgs:3: ERROR: Unexpected indentation.

dpdk/dts/framework/testbed_model/common.py:docstring of 
framework.testbed_model.common.MesonArgs:4: WARNING: Block quote ends 
without a blank line; unexpected unindent.

dpdk/dts/framework/testbed_model/linux_session.py:docstring of 
framework.testbed_model.linux_session.LshwOutput:10: ERROR: Unexpected 
indentation.

dpdk/dts/framework/testbed_model/linux_session.py:docstring of 
framework.testbed_model.linux_session.LshwOutput:13: WARNING: Block 
quote ends without a blank line; unexpected unindent.

dpdk/dts/framework/testbed_model/sut_node.py:docstring of 
framework.testbed_model.sut_node.SutNode.create_eal_parameters:3: ERROR: 
Unexpected indentation.

dpdk/dts/framework/testbed_model/sut_node.py:docstring of 
framework.testbed_model.sut_node.SutNode.create_eal_parameters:6: 
WARNING: Block quote ends without a blank line; unexpected unindent.

dpdk/dts/framework/testbed_model/sut_node.py:docstring of 
framework.testbed_model.sut_node.SutNode.create_eal_parameters:18: 
ERROR: Unexpected indentation.

dpdk/dts/framework/testbed_model/sut_node.py:docstring of 
framework.testbed_model.sut_node.SutNode.create_eal_parameters:20: 
WARNING: Block quote ends without a blank line; unexpected unindent.

dpdk/dts/framework/test_suite.py:docstring of 
framework.test_suite.TestSuite:1: ERROR: Unknown target name: "test".

dpdk/dts/framework/test_suite.py:docstring of 
framework.test_suite.TestSuite:1: ERROR: Unknown target name: "test_perf".

If I then try to rerun ninja, those errors don't appear, so it seems to 
happen mostly on fresh builds.


^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 3/4] dts: add doc generation
  2023-10-26 16:43         ` Yoan Picchi
@ 2023-10-27  9:52           ` Juraj Linkeš
  0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-10-27  9:52 UTC (permalink / raw)
  To: Yoan Picchi
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb, dev

Thanks for the comments, Yoan.

> > diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
> > index 32c18ee472..98923b1467 100644
> > --- a/doc/guides/tools/dts.rst
> > +++ b/doc/guides/tools/dts.rst
> > @@ -335,3 +335,32 @@ There are three tools used in DTS to help with code checking, style and formatti
> >   These three tools are all used in ``devtools/dts-check-format.sh``,
> >   the DTS code check and format script.
> >   Refer to the script for usage: ``devtools/dts-check-format.sh -h``.
> > +
> > +
> > +Building DTS API docs
> > +---------------------
> > +
> > +To build DTS API docs, install the dependencies with Poetry, then enter its shell:
> > +
> > +   .. code-block:: console
>
> I believe the code-block line should not be indented. As it is, it did
> not generate properly when I built the doc. Same for the other code
> block below.
>

Good catch, thanks.

> > diff --git a/dts/doc/meson.build b/dts/doc/meson.build
> > new file mode 100644
> > index 0000000000..8e70eabc51
> > --- /dev/null
> > +++ b/dts/doc/meson.build
> > @@ -0,0 +1,50 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +
> > +sphinx = find_program('sphinx-build')
> > +sphinx_apidoc = find_program('sphinx-apidoc')
> > +
> > +if not sphinx.found() or not sphinx_apidoc.found()
> > +    subdir_done()
> > +endif
> > +
> > +dts_api_framework_dir = join_paths(dts_dir, 'framework')
> > +dts_api_build_dir = join_paths(doc_api_build_dir, 'dts')
> > +dts_api_src = custom_target('dts_api_src',
> > +        output: 'modules.rst',
> > +        command: [sphinx_apidoc, '--append-syspath', '--force',
> > +            '--module-first', '--separate', '-V', meson.project_version(),
> > +            '-o', dts_api_build_dir, '--no-toc', '--implicit-namespaces',
> > +            dts_api_framework_dir],
> > +        build_by_default: false)
> > +doc_targets += dts_api_src
> > +doc_target_names += 'DTS_API_sphinx_sources'
> > +
> > +cp = find_program('cp')
> > +cp_index = custom_target('cp_index',
> > +        input: 'doc-index.rst',
> > +        output: 'index.rst',
> > +        depends: dts_api_src,
> > +        command: [cp, '@INPUT@', join_paths(dts_api_build_dir, 'index.rst')],
> > +        build_by_default: false)
> > +doc_targets += cp_index
> > +doc_target_names += 'DTS_API_sphinx_index'
> > +
> > +extra_sphinx_args = ['-a', '-c', doc_guides_source_dir]
> > +if get_option('werror')
> > +    extra_sphinx_args += '-W'
> > +endif
> > +
> > +htmldir = join_paths(get_option('datadir'), 'doc', 'dpdk')
> > +dts_api_html = custom_target('dts_api_html',
> > +        output: 'html',
> > +        depends: cp_index,
> > +        command: ['DTS_ROOT=@0@'.format(dts_dir),
> > +            sphinx_wrapper, sphinx, meson.project_version(),
> > +            dts_api_build_dir, dts_api_build_dir,
> > +            '--dts-root', dts_dir, extra_sphinx_args],
> > +        build_by_default: false,
> > +        install: false,
> > +        install_dir: htmldir)
>
> I don't entirely understand what this command does. I suspect it's the one
> that creates and populates the html directory in build/doc/api/dts/.

Yes, this one generates the sphinx html docs from the .rst files
generated in the first step.

> The main thing that confuses me is this htmldir. Mine seemed to point to
> share/doc/dpdk. Such a directory doesn't seem to be built (or I could
> not find where). That might be intended, given install is set to false.
> But in that case, why bother setting it?
> The thing that worries me more is that dpdk/doc/guides/meson.build does
> build some html and has the same htmldir. If we were to enable the
> install in both, wouldn't the html files overwrite each other (in particular
> index.html)?
> If my understanding is correct, I believe htmldir should either be
> removed or set to a different value (<datadir>/doc/dpdk/dts ?).

These install-related configs are basically a placeholder (install:
false) and a copy-paste from the doxygen API generation, because meson
required these fields to be filled (IIRC).
I was hoping Bruce would give me some guidance on this.

Bruce, how should we install the DTS API docs? I'm not really sure how
exactly the meson install procedure works, what's going to be copied
and so on. I don't really know what to do with this.

>
> > +doc_targets += dts_api_html
> > +doc_target_names += 'DTS_API_HTML'
> > diff --git a/dts/meson.build b/dts/meson.build
> > new file mode 100644
> > index 0000000000..17bda07636
> > --- /dev/null
> > +++ b/dts/meson.build
> > @@ -0,0 +1,16 @@
> > +# SPDX-License-Identifier: BSD-3-Clause
> > +# Copyright(c) 2023 PANTHEON.tech s.r.o.
> > +
> > +doc_targets = []
> > +doc_target_names = []
> > +dts_dir = meson.current_source_dir()
> > +
> > +subdir('doc')
> > +
> > +if doc_targets.length() == 0
> > +    message = 'No docs targets found'
> > +else
> > +    message = 'Built docs:'
> > +endif
> > +run_target('dts/doc', command: [echo, message, doc_target_names],
> > +    depends: doc_targets)
> > diff --git a/meson.build b/meson.build
> > index 39cb73846d..4d34dc531c 100644
> > --- a/meson.build
> > +++ b/meson.build
> > @@ -85,6 +85,7 @@ subdir('app')
> >
> >   # build docs
> >   subdir('doc')
> > +subdir('dts')
> >
> >   # build any examples explicitly requested - useful for developers - and
> >   # install any example code into the appropriate install path
>
> When building from scratch, I had several warnings/errors.
>
> $ meson build
> [usual meson output block...]
> WARNING: Target "dts/doc" has a path separator in its name.
> This is not supported, it can cause unexpected failures and will become
> a hard error in the future.

This is likely due to a newer meson version than the one I'm using,
which is 0.53.2 (the version used in CI). We can rename the target to
satisfy newer versions.
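
A rename without the path separator might look like this in dts/meson.build (hypothetical name, untested):

```meson
# 'dts/doc' contains a path separator, which newer meson rejects;
# a plain name such as 'dts-doc' avoids the warning.
run_target('dts-doc', command: [echo, message, doc_target_names],
    depends: doc_targets)
```

The corresponding `ninja -C build dts-doc` invocation in the docs would need updating to match.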

> Configuring rte_build_config.h using configuration
> Message:
> =================
> Applications Enabled
> =================
> [...]
>
> If you change the target name, remember to change it in the doc too.
>
>
>
> $ ninja -C build dts/doc
> [create all the rst...]
> [3/4] Generating dts_api_html with a custom command.
> dpdk/dts/framework/remote_session/interactive_shell.py:docstring of
> framework.remote_session.interactive_shell.InteractiveShell:25: WARNING:
> Definition list ends without a blank line; unexpected unindent.
>
> dpdk/dts/framework/remote_session/interactive_shell.py:docstring of
> framework.remote_session.interactive_shell.InteractiveShell:27: ERROR:
> Unexpected indentation.
>
> dpdk/dts/framework/testbed_model/common.py:docstring of
> framework.testbed_model.common.MesonArgs:3: ERROR: Unexpected indentation.
>
> dpdk/dts/framework/testbed_model/common.py:docstring of
> framework.testbed_model.common.MesonArgs:4: WARNING: Block quote ends
> without a blank line; unexpected unindent.
>
> dpdk/dts/framework/testbed_model/linux_session.py:docstring of
> framework.testbed_model.linux_session.LshwOutput:10: ERROR: Unexpected
> indentation.
>
> dpdk/dts/framework/testbed_model/linux_session.py:docstring of
> framework.testbed_model.linux_session.LshwOutput:13: WARNING: Block
> quote ends without a blank line; unexpected unindent.
>
> dpdk/dts/framework/testbed_model/sut_node.py:docstring of
> framework.testbed_model.sut_node.SutNode.create_eal_parameters:3: ERROR:
> Unexpected indentation.
>
> dpdk/dts/framework/testbed_model/sut_node.py:docstring of
> framework.testbed_model.sut_node.SutNode.create_eal_parameters:6:
> WARNING: Block quote ends without a blank line; unexpected unindent.
>
> dpdk/dts/framework/testbed_model/sut_node.py:docstring of
> framework.testbed_model.sut_node.SutNode.create_eal_parameters:18:
> ERROR: Unexpected indentation.
>
> dpdk/dts/framework/testbed_model/sut_node.py:docstring of
> framework.testbed_model.sut_node.SutNode.create_eal_parameters:20:
> WARNING: Block quote ends without a blank line; unexpected unindent.
>
> dpdk/dts/framework/test_suite.py:docstring of
> framework.test_suite.TestSuite:1: ERROR: Unknown target name: "test".
>
> dpdk/dts/framework/test_suite.py:docstring of
> framework.test_suite.TestSuite:1: ERROR: Unknown target name: "test_perf".
>

These errors are here because the docstrings are either incomplete or
not yet reformatted. These will be addressed in a new version that's
coming soon.

> If I then try to rerun ninja, those errors don't appear, so it seems to
> happen mostly on fresh builds.
>

Yes, that is expected.

This made me look into how we're running sphinx-build. We're using:
  -a                write all files (default: only write new and changed files)

in the hope of forcing a full rebuild, but that doesn't seem to be
working properly, if at all. What I actually want sphinx to do is update
the .html files after a rebuild, and -a doesn't help with that.

I've tried the -E option instead and that seems to work - it updates the
modified .html files, so I'll replace -a with -E in the new version.
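
In dts/doc/meson.build that would amount to a one-line change (a sketch, untested):

```meson
# -E forces sphinx to re-read all source files instead of reusing its
# cached environment; -a only rewrote output files without doing that.
extra_sphinx_args = ['-E', '-c', doc_guides_source_dir]
```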

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 2/4] dts: add doc generation dependencies
  2023-08-31 10:04       ` [RFC PATCH v4 2/4] dts: add doc generation dependencies Juraj Linkeš
@ 2023-10-27 15:27         ` Yoan Picchi
  0 siblings, 0 replies; 393+ messages in thread
From: Yoan Picchi @ 2023-10-27 15:27 UTC (permalink / raw)
  To: Juraj Linkeš,
	thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev

On 8/31/23 11:04, Juraj Linkeš wrote:
> Sphinx imports every Python module when generating documentation from
> docstrings, meaning all dts dependencies, including Python version,
> must be satisfied.
> By adding Sphinx to dts dependencies we make sure that the proper
> Python version and dependencies are used when Sphinx is executed.
> 
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> ---
>   dts/poetry.lock    | 447 ++++++++++++++++++++++++++++++++++++++++++++-
>   dts/pyproject.toml |   7 +
>   2 files changed, 453 insertions(+), 1 deletion(-)
> 
> diff --git a/dts/poetry.lock b/dts/poetry.lock
> index f7b3b6d602..91afe5231a 100644
> --- a/dts/poetry.lock
> +++ b/dts/poetry.lock
> @@ -1,5 +1,16 @@
>   # This file is automatically @generated by Poetry 1.5.1 and should not be changed by hand.
>   
> +[[package]]
> +name = "alabaster"
> +version = "0.7.13"
> +description = "A configurable sidebar-enabled Sphinx theme"
> +optional = false
> +python-versions = ">=3.6"
> +files = [
> +    {file = "alabaster-0.7.13-py3-none-any.whl", hash = "sha256:1ee19aca801bbabb5ba3f5f258e4422dfa86f82f3e9cefb0859b283cdd7f62a3"},
> +    {file = "alabaster-0.7.13.tar.gz", hash = "sha256:a27a4a084d5e690e16e01e03ad2b2e552c61a65469419b907243193de1a84ae2"},
> +]
> +
>   [[package]]
>   name = "attrs"
>   version = "23.1.0"
> @@ -18,6 +29,17 @@ docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-
>   tests = ["attrs[tests-no-zope]", "zope-interface"]
>   tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
>   
> +[[package]]
> +name = "babel"
> +version = "2.12.1"
> +description = "Internationalization utilities"
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> +    {file = "Babel-2.12.1-py3-none-any.whl", hash = "sha256:b4246fb7677d3b98f501a39d43396d3cafdc8eadb045f4a31be01863f655c610"},
> +    {file = "Babel-2.12.1.tar.gz", hash = "sha256:cc2d99999cd01d44420ae725a21c9e3711b3aadc7976d6147f622d8581963455"},
> +]
> +
>   [[package]]
>   name = "bcrypt"
>   version = "4.0.1"
> @@ -86,6 +108,17 @@ d = ["aiohttp (>=3.7.4)"]
>   jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
>   uvloop = ["uvloop (>=0.15.2)"]
>   
> +[[package]]
> +name = "certifi"
> +version = "2023.7.22"
> +description = "Python package for providing Mozilla's CA Bundle."
> +optional = false
> +python-versions = ">=3.6"
> +files = [
> +    {file = "certifi-2023.7.22-py3-none-any.whl", hash = "sha256:92d6037539857d8206b8f6ae472e8b77db8058fec5937a1ef3f54304089edbb9"},
> +    {file = "certifi-2023.7.22.tar.gz", hash = "sha256:539cc1d13202e33ca466e88b2807e29f4c13049d6d87031a3c110744495cb082"},
> +]
> +
>   [[package]]
>   name = "cffi"
>   version = "1.15.1"
> @@ -162,6 +195,90 @@ files = [
>   [package.dependencies]
>   pycparser = "*"
>   
> +[[package]]
> +name = "charset-normalizer"
> +version = "3.2.0"
> +description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet."
> +optional = false
> +python-versions = ">=3.7.0"
> +files = [
> +    {file = "charset-normalizer-3.2.0.tar.gz", hash = "sha256:3bb3d25a8e6c0aedd251753a79ae98a093c7e7b471faa3aa9a93a81431987ace"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:0b87549028f680ca955556e3bd57013ab47474c3124dc069faa0b6545b6c9710"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7c70087bfee18a42b4040bb9ec1ca15a08242cf5867c58726530bdf3945672ed"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a103b3a7069b62f5d4890ae1b8f0597618f628b286b03d4bc9195230b154bfa9"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:94aea8eff76ee6d1cdacb07dd2123a68283cb5569e0250feab1240058f53b623"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:db901e2ac34c931d73054d9797383d0f8009991e723dab15109740a63e7f902a"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b0dac0ff919ba34d4df1b6131f59ce95b08b9065233446be7e459f95554c0dc8"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:193cbc708ea3aca45e7221ae58f0fd63f933753a9bfb498a3b474878f12caaad"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:09393e1b2a9461950b1c9a45d5fd251dc7c6f228acab64da1c9c0165d9c7765c"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:baacc6aee0b2ef6f3d308e197b5d7a81c0e70b06beae1f1fcacffdbd124fe0e3"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:bf420121d4c8dce6b889f0e8e4ec0ca34b7f40186203f06a946fa0276ba54029"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_ppc64le.whl", hash = "sha256:c04a46716adde8d927adb9457bbe39cf473e1e2c2f5d0a16ceb837e5d841ad4f"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_s390x.whl", hash = "sha256:aaf63899c94de41fe3cf934601b0f7ccb6b428c6e4eeb80da72c58eab077b19a"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:d62e51710986674142526ab9f78663ca2b0726066ae26b78b22e0f5e571238dd"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-win32.whl", hash = "sha256:04e57ab9fbf9607b77f7d057974694b4f6b142da9ed4a199859d9d4d5c63fe96"},
> +    {file = "charset_normalizer-3.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:48021783bdf96e3d6de03a6e39a1171ed5bd7e8bb93fc84cc649d11490f87cea"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:4957669ef390f0e6719db3613ab3a7631e68424604a7b448f079bee145da6e09"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46fb8c61d794b78ec7134a715a3e564aafc8f6b5e338417cb19fe9f57a5a9bf2"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f779d3ad205f108d14e99bb3859aa7dd8e9c68874617c72354d7ecaec2a054ac"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f25c229a6ba38a35ae6e25ca1264621cc25d4d38dca2942a7fce0b67a4efe918"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2efb1bd13885392adfda4614c33d3b68dee4921fd0ac1d3988f8cbb7d589e72a"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:1f30b48dd7fa1474554b0b0f3fdfdd4c13b5c737a3c6284d3cdc424ec0ffff3a"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:246de67b99b6851627d945db38147d1b209a899311b1305dd84916f2b88526c6"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9bd9b3b31adcb054116447ea22caa61a285d92e94d710aa5ec97992ff5eb7cf3"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:8c2f5e83493748286002f9369f3e6607c565a6a90425a3a1fef5ae32a36d749d"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:3170c9399da12c9dc66366e9d14da8bf7147e1e9d9ea566067bbce7bb74bd9c2"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_ppc64le.whl", hash = "sha256:7a4826ad2bd6b07ca615c74ab91f32f6c96d08f6fcc3902ceeedaec8cdc3bcd6"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_s390x.whl", hash = "sha256:3b1613dd5aee995ec6d4c69f00378bbd07614702a315a2cf6c1d21461fe17c23"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9e608aafdb55eb9f255034709e20d5a83b6d60c054df0802fa9c9883d0a937aa"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-win32.whl", hash = "sha256:f2a1d0fd4242bd8643ce6f98927cf9c04540af6efa92323e9d3124f57727bfc1"},
> +    {file = "charset_normalizer-3.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:681eb3d7e02e3c3655d1b16059fbfb605ac464c834a0c629048a30fad2b27489"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c57921cda3a80d0f2b8aec7e25c8aa14479ea92b5b51b6876d975d925a2ea346"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:41b25eaa7d15909cf3ac4c96088c1f266a9a93ec44f87f1d13d4a0e86c81b982"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f058f6963fd82eb143c692cecdc89e075fa0828db2e5b291070485390b2f1c9c"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a7647ebdfb9682b7bb97e2a5e7cb6ae735b1c25008a70b906aecca294ee96cf4"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eef9df1eefada2c09a5e7a40991b9fc6ac6ef20b1372abd48d2794a316dc0449"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e03b8895a6990c9ab2cdcd0f2fe44088ca1c65ae592b8f795c3294af00a461c3"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:ee4006268ed33370957f55bf2e6f4d263eaf4dc3cfc473d1d90baff6ed36ce4a"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c4983bf937209c57240cff65906b18bb35e64ae872da6a0db937d7b4af845dd7"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_ppc64le.whl", hash = "sha256:3bb7fda7260735efe66d5107fb7e6af6a7c04c7fce9b2514e04b7a74b06bf5dd"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_s390x.whl", hash = "sha256:72814c01533f51d68702802d74f77ea026b5ec52793c791e2da806a3844a46c3"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:70c610f6cbe4b9fce272c407dd9d07e33e6bf7b4aa1b7ffb6f6ded8e634e3592"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-win32.whl", hash = "sha256:a401b4598e5d3f4a9a811f3daf42ee2291790c7f9d74b18d75d6e21dda98a1a1"},
> +    {file = "charset_normalizer-3.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:c0b21078a4b56965e2b12f247467b234734491897e99c1d51cee628da9786959"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:95eb302ff792e12aba9a8b8f8474ab229a83c103d74a750ec0bd1c1eea32e669"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a100c6d595a7f316f1b6f01d20815d916e75ff98c27a01ae817439ea7726329"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6339d047dab2780cc6220f46306628e04d9750f02f983ddb37439ca47ced7149"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4b749b9cc6ee664a3300bb3a273c1ca8068c46be705b6c31cf5d276f8628a94"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a38856a971c602f98472050165cea2cdc97709240373041b69030be15047691f"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f87f746ee241d30d6ed93969de31e5ffd09a2961a051e60ae6bddde9ec3583aa"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:89f1b185a01fe560bc8ae5f619e924407efca2191b56ce749ec84982fc59a32a"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e1c8a2f4c69e08e89632defbfabec2feb8a8d99edc9f89ce33c4b9e36ab63037"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:2f4ac36d8e2b4cc1aa71df3dd84ff8efbe3bfb97ac41242fbcfc053c67434f46"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:a386ebe437176aab38c041de1260cd3ea459c6ce5263594399880bbc398225b2"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_ppc64le.whl", hash = "sha256:ccd16eb18a849fd8dcb23e23380e2f0a354e8daa0c984b8a732d9cfaba3a776d"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_s390x.whl", hash = "sha256:e6a5bf2cba5ae1bb80b154ed68a3cfa2fa00fde979a7f50d6598d3e17d9ac20c"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:45de3f87179c1823e6d9e32156fb14c1927fcc9aba21433f088fdfb555b77c10"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-win32.whl", hash = "sha256:1000fba1057b92a65daec275aec30586c3de2401ccdcd41f8a5c1e2c87078706"},
> +    {file = "charset_normalizer-3.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:8b2c760cfc7042b27ebdb4a43a4453bd829a5742503599144d54a032c5dc7e9e"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:855eafa5d5a2034b4621c74925d89c5efef61418570e5ef9b37717d9c796419c"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:203f0c8871d5a7987be20c72442488a0b8cfd0f43b7973771640fc593f56321f"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e857a2232ba53ae940d3456f7533ce6ca98b81917d47adc3c7fd55dad8fab858"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5e86d77b090dbddbe78867a0275cb4df08ea195e660f1f7f13435a4649e954e5"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c4fb39a81950ec280984b3a44f5bd12819953dc5fa3a7e6fa7a80db5ee853952"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2dee8e57f052ef5353cf608e0b4c871aee320dd1b87d351c28764fc0ca55f9f4"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8700f06d0ce6f128de3ccdbc1acaea1ee264d2caa9ca05daaf492fde7c2a7200"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1920d4ff15ce893210c1f0c0e9d19bfbecb7983c76b33f046c13a8ffbd570252"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:c1c76a1743432b4b60ab3358c937a3fe1341c828ae6194108a94c69028247f22"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f7560358a6811e52e9c4d142d497f1a6e10103d3a6881f18d04dbce3729c0e2c"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_ppc64le.whl", hash = "sha256:c8063cf17b19661471ecbdb3df1c84f24ad2e389e326ccaf89e3fb2484d8dd7e"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_s390x.whl", hash = "sha256:cd6dbe0238f7743d0efe563ab46294f54f9bc8f4b9bcf57c3c666cc5bc9d1299"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:1249cbbf3d3b04902ff081ffbb33ce3377fa6e4c7356f759f3cd076cc138d020"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-win32.whl", hash = "sha256:6c409c0deba34f147f77efaa67b8e4bb83d2f11c8806405f76397ae5b8c0d1c9"},
> +    {file = "charset_normalizer-3.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:7095f6fbfaa55defb6b733cfeb14efaae7a29f0b59d8cf213be4e7ca0b857b80"},
> +    {file = "charset_normalizer-3.2.0-py3-none-any.whl", hash = "sha256:8e098148dd37b4ce3baca71fb394c81dc5d9c7728c95df695d2dca218edf40e6"},
> +]
> +
>   [[package]]
>   name = "click"
>   version = "8.1.6"
> @@ -232,6 +349,17 @@ ssh = ["bcrypt (>=3.1.5)"]
>   test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
>   test-randomorder = ["pytest-randomly"]
>   
> +[[package]]
> +name = "docutils"
> +version = "0.18.1"
> +description = "Docutils -- Python Documentation Utilities"
> +optional = false
> +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
> +files = [
> +    {file = "docutils-0.18.1-py2.py3-none-any.whl", hash = "sha256:23010f129180089fbcd3bc08cfefccb3b890b0050e1ca00c867036e9d161b98c"},
> +    {file = "docutils-0.18.1.tar.gz", hash = "sha256:679987caf361a7539d76e584cbeddc311e3aee937877c87346f31debc63e9d06"},
> +]
> +
>   [[package]]
>   name = "fabric"
>   version = "2.7.1"
> @@ -252,6 +380,28 @@ pathlib2 = "*"
>   pytest = ["mock (>=2.0.0,<3.0)", "pytest (>=3.2.5,<4.0)"]
>   testing = ["mock (>=2.0.0,<3.0)"]
>   
> +[[package]]
> +name = "idna"
> +version = "3.4"
> +description = "Internationalized Domain Names in Applications (IDNA)"
> +optional = false
> +python-versions = ">=3.5"
> +files = [
> +    {file = "idna-3.4-py3-none-any.whl", hash = "sha256:90b77e79eaa3eba6de819a0c442c0b4ceefc341a7a2ab77d7562bf49f425c5c2"},
> +    {file = "idna-3.4.tar.gz", hash = "sha256:814f528e8dead7d329833b91c5faa87d60bf71824cd12a7530b5526063d02cb4"},
> +]
> +
> +[[package]]
> +name = "imagesize"
> +version = "1.4.1"
> +description = "Getting image size from png/jpeg/jpeg2000/gif file"
> +optional = false
> +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
> +files = [
> +    {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"},
> +    {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
> +]
> +
>   [[package]]
>   name = "invoke"
>   version = "1.7.3"
> @@ -280,6 +430,23 @@ pipfile-deprecated-finder = ["pip-shims (>=0.5.2)", "pipreqs", "requirementslib"
>   plugins = ["setuptools"]
>   requirements-deprecated-finder = ["pip-api", "pipreqs"]
>   
> +[[package]]
> +name = "jinja2"
> +version = "3.1.2"
> +description = "A very fast and expressive template engine."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> +    {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
> +    {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
> +]
> +
> +[package.dependencies]
> +MarkupSafe = ">=2.0"
> +
> +[package.extras]
> +i18n = ["Babel (>=2.7)"]
> +
>   [[package]]
>   name = "jsonpatch"
>   version = "1.33"
> @@ -340,6 +507,65 @@ files = [
>   [package.dependencies]
>   referencing = ">=0.28.0"
>   
> +[[package]]
> +name = "markupsafe"
> +version = "2.1.3"
> +description = "Safely add untrusted strings to HTML/XML markup."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cd0f502fe016460680cd20aaa5a76d241d6f35a1c3350c474bac1273803893fa"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e09031c87a1e51556fdcb46e5bd4f59dfb743061cf93c4d6831bf894f125eb57"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e78619a61ecf91e76aa3e6e8e33fc4894a2bebe93410754bd28fce0a8a4f9f"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:65c1a9bcdadc6c28eecee2c119465aebff8f7a584dd719facdd9e825ec61ab52"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:525808b8019e36eb524b8c68acdd63a37e75714eac50e988180b169d64480a00"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:962f82a3086483f5e5f64dbad880d31038b698494799b097bc59c2edf392fce6"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:aa7bd130efab1c280bed0f45501b7c8795f9fdbeb02e965371bbef3523627779"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:c9c804664ebe8f83a211cace637506669e7890fec1b4195b505c214e50dd4eb7"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-win32.whl", hash = "sha256:10bbfe99883db80bdbaff2dcf681dfc6533a614f700da1287707e8a5d78a8431"},
> +    {file = "MarkupSafe-2.1.3-cp310-cp310-win_amd64.whl", hash = "sha256:1577735524cdad32f9f694208aa75e422adba74f1baee7551620e43a3141f559"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ad9e82fb8f09ade1c3e1b996a6337afac2b8b9e365f926f5a61aacc71adc5b3c"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3c0fae6c3be832a0a0473ac912810b2877c8cb9d76ca48de1ed31e1c68386575"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b076b6226fb84157e3f7c971a47ff3a679d837cf338547532ab866c57930dbee"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bfce63a9e7834b12b87c64d6b155fdd9b3b96191b6bd334bf37db7ff1fe457f2"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:338ae27d6b8745585f87218a3f23f1512dbf52c26c28e322dbe54bcede54ccb9"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e4dd52d80b8c83fdce44e12478ad2e85c64ea965e75d66dbeafb0a3e77308fcc"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:df0be2b576a7abbf737b1575f048c23fb1d769f267ec4358296f31c2479db8f9"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5bbe06f8eeafd38e5d0a4894ffec89378b6c6a625ff57e3028921f8ff59318ac"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-win32.whl", hash = "sha256:dd15ff04ffd7e05ffcb7fe79f1b98041b8ea30ae9234aed2a9168b5797c3effb"},
> +    {file = "MarkupSafe-2.1.3-cp311-cp311-win_amd64.whl", hash = "sha256:134da1eca9ec0ae528110ccc9e48041e0828d79f24121a1a146161103c76e686"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e254ae696c88d98da6555f5ace2279cf7cd5b3f52be2b5cf97feafe883b58d2"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cb0932dc158471523c9637e807d9bfb93e06a95cbf010f1a38b98623b929ef2b"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9402b03f1a1b4dc4c19845e5c749e3ab82d5078d16a2a4c2cd2df62d57bb0707"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ca379055a47383d02a5400cb0d110cef0a776fc644cda797db0c5696cfd7e18e"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:b7ff0f54cb4ff66dd38bebd335a38e2c22c41a8ee45aa608efc890ac3e3931bc"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:c011a4149cfbcf9f03994ec2edffcb8b1dc2d2aede7ca243746df97a5d41ce48"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:56d9f2ecac662ca1611d183feb03a3fa4406469dafe241673d521dd5ae92a155"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-win32.whl", hash = "sha256:8758846a7e80910096950b67071243da3e5a20ed2546e6392603c096778d48e0"},
> +    {file = "MarkupSafe-2.1.3-cp37-cp37m-win_amd64.whl", hash = "sha256:787003c0ddb00500e49a10f2844fac87aa6ce977b90b0feaaf9de23c22508b24"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:2ef12179d3a291be237280175b542c07a36e7f60718296278d8593d21ca937d4"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2c1b19b3aaacc6e57b7e25710ff571c24d6c3613a45e905b1fde04d691b98ee0"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8afafd99945ead6e075b973fefa56379c5b5c53fd8937dad92c662da5d8fd5ee"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8c41976a29d078bb235fea9b2ecd3da465df42a562910f9022f1a03107bd02be"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d080e0a5eb2529460b30190fcfcc4199bd7f827663f858a226a81bc27beaa97e"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:69c0f17e9f5a7afdf2cc9fb2d1ce6aabdb3bafb7f38017c0b77862bcec2bbad8"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:504b320cd4b7eff6f968eddf81127112db685e81f7e36e75f9f84f0df46041c3"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42de32b22b6b804f42c5d98be4f7e5e977ecdd9ee9b660fda1a3edf03b11792d"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-win32.whl", hash = "sha256:ceb01949af7121f9fc39f7d27f91be8546f3fb112c608bc4029aef0bab86a2a5"},
> +    {file = "MarkupSafe-2.1.3-cp38-cp38-win_amd64.whl", hash = "sha256:1b40069d487e7edb2676d3fbdb2b0829ffa2cd63a2ec26c4938b2d34391b4ecc"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8023faf4e01efadfa183e863fefde0046de576c6f14659e8782065bcece22198"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:6b2b56950d93e41f33b4223ead100ea0fe11f8e6ee5f641eb753ce4b77a7042b"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9dcdfd0eaf283af041973bff14a2e143b8bd64e069f4c383416ecd79a81aab58"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05fb21170423db021895e1ea1e1f3ab3adb85d1c2333cbc2310f2a26bc77272e"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:282c2cb35b5b673bbcadb33a585408104df04f14b2d9b01d4c345a3b92861c2c"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:ab4a0df41e7c16a1392727727e7998a467472d0ad65f3ad5e6e765015df08636"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:7ef3cb2ebbf91e330e3bb937efada0edd9003683db6b57bb108c4001f37a02ea"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0a4e4a1aff6c7ac4cd55792abf96c915634c2b97e3cc1c7129578aa68ebd754e"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-win32.whl", hash = "sha256:fec21693218efe39aa7f8599346e90c705afa52c5b31ae019b2e57e8f6542bb2"},
> +    {file = "MarkupSafe-2.1.3-cp39-cp39-win_amd64.whl", hash = "sha256:3fd4abcb888d15a94f32b75d8fd18ee162ca0c064f35b11134be77050296d6ba"},
> +    {file = "MarkupSafe-2.1.3.tar.gz", hash = "sha256:af598ed32d6ae86f1b747b82783958b1a4ab8f617b06fe68795c7f026abbdcad"},
> +]
> +
>   [[package]]
>   name = "mccabe"
>   version = "0.7.0"
> @@ -404,6 +630,17 @@ files = [
>       {file = "mypy_extensions-1.0.0.tar.gz", hash = "sha256:75dbf8955dc00442a438fc4d0666508a9a97b6bd41aa2f0ffe9d2f2725af0782"},
>   ]
>   
> +[[package]]
> +name = "packaging"
> +version = "23.1"
> +description = "Core utilities for Python packages"
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> +    {file = "packaging-23.1-py3-none-any.whl", hash = "sha256:994793af429502c4ea2ebf6bf664629d07c1a9fe974af92966e4b8d2df7edc61"},
> +    {file = "packaging-23.1.tar.gz", hash = "sha256:a392980d2b6cffa644431898be54b0045151319d1e7ec34f0cfed48767dd334f"},
> +]
> +
>   [[package]]
>   name = "paramiko"
>   version = "3.2.0"
> @@ -515,6 +752,20 @@ files = [
>       {file = "pyflakes-2.5.0.tar.gz", hash = "sha256:491feb020dca48ccc562a8c0cbe8df07ee13078df59813b83959cbdada312ea3"},
>   ]
>   
> +[[package]]
> +name = "pygments"
> +version = "2.15.1"
> +description = "Pygments is a syntax highlighting package written in Python."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> +    {file = "Pygments-2.15.1-py3-none-any.whl", hash = "sha256:db2db3deb4b4179f399a09054b023b6a586b76499d36965813c71aa8ed7b5fd1"},
> +    {file = "Pygments-2.15.1.tar.gz", hash = "sha256:8ace4d3c1dd481894b2005f560ead0f9f19ee64fe983366be1a21e171d12775c"},
> +]
> +
> +[package.extras]
> +plugins = ["importlib-metadata"]
> +
>   [[package]]
>   name = "pylama"
>   version = "8.4.1"
> @@ -632,6 +883,27 @@ files = [
>   attrs = ">=22.2.0"
>   rpds-py = ">=0.7.0"
>   
> +[[package]]
> +name = "requests"
> +version = "2.31.0"
> +description = "Python HTTP for Humans."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> +    {file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
> +    {file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
> +]
> +
> +[package.dependencies]
> +certifi = ">=2017.4.17"
> +charset-normalizer = ">=2,<4"
> +idna = ">=2.5,<4"
> +urllib3 = ">=1.21.1,<3"
> +
> +[package.extras]
> +socks = ["PySocks (>=1.5.6,!=1.5.7)"]
> +use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"]
> +
>   [[package]]
>   name = "rpds-py"
>   version = "0.9.2"
> @@ -775,6 +1047,162 @@ files = [
>       {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"},
>   ]
>   
> +[[package]]
> +name = "sphinx"
> +version = "6.2.1"
> +description = "Python documentation generator"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
> +    {file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
> +]
> +
> +[package.dependencies]
> +alabaster = ">=0.7,<0.8"
> +babel = ">=2.9"
> +colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
> +docutils = ">=0.18.1,<0.20"
> +imagesize = ">=1.3"
> +Jinja2 = ">=3.0"
> +packaging = ">=21.0"
> +Pygments = ">=2.13"
> +requests = ">=2.25.0"
> +snowballstemmer = ">=2.0"
> +sphinxcontrib-applehelp = "*"
> +sphinxcontrib-devhelp = "*"
> +sphinxcontrib-htmlhelp = ">=2.0.0"
> +sphinxcontrib-jsmath = "*"
> +sphinxcontrib-qthelp = "*"
> +sphinxcontrib-serializinghtml = ">=1.1.5"
> +
> +[package.extras]
> +docs = ["sphinxcontrib-websupport"]
> +lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
> +test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
> +
> +[[package]]
> +name = "sphinx-rtd-theme"
> +version = "1.2.2"
> +description = "Read the Docs theme for Sphinx"
> +optional = false
> +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7"
> +files = [
> +    {file = "sphinx_rtd_theme-1.2.2-py2.py3-none-any.whl", hash = "sha256:6a7e7d8af34eb8fc57d52a09c6b6b9c46ff44aea5951bc831eeb9245378f3689"},
> +    {file = "sphinx_rtd_theme-1.2.2.tar.gz", hash = "sha256:01c5c5a72e2d025bd23d1f06c59a4831b06e6ce6c01fdd5ebfe9986c0a880fc7"},
> +]
> +
> +[package.dependencies]
> +docutils = "<0.19"
> +sphinx = ">=1.6,<7"
> +sphinxcontrib-jquery = ">=4,<5"
> +
> +[package.extras]
> +dev = ["bump2version", "sphinxcontrib-httpdomain", "transifex-client", "wheel"]
> +
> +[[package]]
> +name = "sphinxcontrib-applehelp"
> +version = "1.0.4"
> +description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "sphinxcontrib-applehelp-1.0.4.tar.gz", hash = "sha256:828f867945bbe39817c210a1abfd1bc4895c8b73fcaade56d45357a348a07d7e"},
> +    {file = "sphinxcontrib_applehelp-1.0.4-py3-none-any.whl", hash = "sha256:29d341f67fb0f6f586b23ad80e072c8e6ad0b48417db2bde114a4c9746feb228"},
> +]
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-devhelp"
> +version = "1.0.2"
> +description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp document."
> +optional = false
> +python-versions = ">=3.5"
> +files = [
> +    {file = "sphinxcontrib-devhelp-1.0.2.tar.gz", hash = "sha256:ff7f1afa7b9642e7060379360a67e9c41e8f3121f2ce9164266f61b9f4b338e4"},
> +    {file = "sphinxcontrib_devhelp-1.0.2-py2.py3-none-any.whl", hash = "sha256:8165223f9a335cc1af7ffe1ed31d2871f325254c0423bc0c4c7cd1c1e4734a2e"},
> +]
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-htmlhelp"
> +version = "2.0.1"
> +description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files"
> +optional = false
> +python-versions = ">=3.8"
> +files = [
> +    {file = "sphinxcontrib-htmlhelp-2.0.1.tar.gz", hash = "sha256:0cbdd302815330058422b98a113195c9249825d681e18f11e8b1f78a2f11efff"},
> +    {file = "sphinxcontrib_htmlhelp-2.0.1-py3-none-any.whl", hash = "sha256:c38cb46dccf316c79de6e5515e1770414b797162b23cd3d06e67020e1d2a6903"},
> +]
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["html5lib", "pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-jquery"
> +version = "4.1"
> +description = "Extension to include jQuery on newer Sphinx releases"
> +optional = false
> +python-versions = ">=2.7"
> +files = [
> +    {file = "sphinxcontrib-jquery-4.1.tar.gz", hash = "sha256:1620739f04e36a2c779f1a131a2dfd49b2fd07351bf1968ced074365933abc7a"},
> +    {file = "sphinxcontrib_jquery-4.1-py2.py3-none-any.whl", hash = "sha256:f936030d7d0147dd026a4f2b5a57343d233f1fc7b363f68b3d4f1cb0993878ae"},
> +]
> +
> +[package.dependencies]
> +Sphinx = ">=1.8"
> +
> +[[package]]
> +name = "sphinxcontrib-jsmath"
> +version = "1.0.1"
> +description = "A sphinx extension which renders display math in HTML via JavaScript"
> +optional = false
> +python-versions = ">=3.5"
> +files = [
> +    {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"},
> +    {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"},
> +]
> +
> +[package.extras]
> +test = ["flake8", "mypy", "pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-qthelp"
> +version = "1.0.3"
> +description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp document."
> +optional = false
> +python-versions = ">=3.5"
> +files = [
> +    {file = "sphinxcontrib-qthelp-1.0.3.tar.gz", hash = "sha256:4c33767ee058b70dba89a6fc5c1892c0d57a54be67ddd3e7875a18d14cba5a72"},
> +    {file = "sphinxcontrib_qthelp-1.0.3-py2.py3-none-any.whl", hash = "sha256:bd9fc24bcb748a8d51fd4ecaade681350aa63009a347a8c14e637895444dfab6"},
> +]
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
> +[[package]]
> +name = "sphinxcontrib-serializinghtml"
> +version = "1.1.5"
> +description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)."
> +optional = false
> +python-versions = ">=3.5"
> +files = [
> +    {file = "sphinxcontrib-serializinghtml-1.1.5.tar.gz", hash = "sha256:aa5f6de5dfdf809ef505c4895e51ef5c9eac17d0f287933eb49ec495280b6952"},
> +    {file = "sphinxcontrib_serializinghtml-1.1.5-py2.py3-none-any.whl", hash = "sha256:352a9a00ae864471d3a7ead8d7d79f5fc0b57e8b3f95e9867eb9eb28999b92fd"},
> +]
> +
> +[package.extras]
> +lint = ["docutils-stubs", "flake8", "mypy"]
> +test = ["pytest"]
> +
>   [[package]]
>   name = "toml"
>   version = "0.10.2"
> @@ -819,6 +1247,23 @@ files = [
>       {file = "typing_extensions-4.7.1.tar.gz", hash = "sha256:b75ddc264f0ba5615db7ba217daeb99701ad295353c45f9e95963337ceeeffb2"},
>   ]
>   
> +[[package]]
> +name = "urllib3"
> +version = "2.0.4"
> +description = "HTTP library with thread-safe connection pooling, file post, and more."
> +optional = false
> +python-versions = ">=3.7"
> +files = [
> +    {file = "urllib3-2.0.4-py3-none-any.whl", hash = "sha256:de7df1803967d2c2a98e4b11bb7d6bd9210474c46e8a0401514e3a42a75ebde4"},
> +    {file = "urllib3-2.0.4.tar.gz", hash = "sha256:8d22f86aae8ef5e410d4f539fde9ce6b2113a001bb4d189e0aed70642d602b11"},
> +]
> +
> +[package.extras]
> +brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"]
> +secure = ["certifi", "cryptography (>=1.9)", "idna (>=2.0.0)", "pyopenssl (>=17.1.0)", "urllib3-secure-extra"]
> +socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
> +zstd = ["zstandard (>=0.18.0)"]
> +
>   [[package]]
>   name = "warlock"
>   version = "2.0.1"
> @@ -837,4 +1282,4 @@ jsonschema = ">=4,<5"
>   [metadata]
>   lock-version = "2.0"
>   python-versions = "^3.10"
> -content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
> +content-hash = "fea1a3eddd1286d2ccd3bdb61c6ce085403f31567dbe4f55b6775bcf1e325372"
> diff --git a/dts/pyproject.toml b/dts/pyproject.toml
> index 6762edfa6b..159940ce02 100644
> --- a/dts/pyproject.toml
> +++ b/dts/pyproject.toml
> @@ -34,6 +34,13 @@ pylama = "^8.4.1"
>   pyflakes = "^2.5.0"
>   toml = "^0.10.2"
>   
> +[tool.poetry.group.docs]
> +optional = true
> +
> +[tool.poetry.group.docs.dependencies]
> +sphinx = "<7"
> +sphinx-rtd-theme = "^1.2.2"
> +
>   [build-system]
>   requires = ["poetry-core>=1.0.0"]
>   build-backend = "poetry.core.masonry.api"
Reviewed-by: Yoan Picchi <yoan.picchi@arm.com>

^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 4/4] dts: format docstrigs to google format
  2023-08-31 10:04       ` [RFC PATCH v4 4/4] dts: format docstrigs to google format Juraj Linkeš
  2023-09-01 17:02         ` Jeremy Spewock
@ 2023-10-31 12:10         ` Yoan Picchi
  2023-11-02 10:17           ` Juraj Linkeš
  1 sibling, 1 reply; 393+ messages in thread
From: Yoan Picchi @ 2023-10-31 12:10 UTC (permalink / raw)
  To: Juraj Linkeš,
	thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb
  Cc: dev

On 8/31/23 11:04, Juraj Linkeš wrote:
> WIP: only one module is reformatted to serve as a demonstration.
> 
> The google format is documented here [0].
> 
> [0]: https://google.github.io/styleguide/pyguide.html
> 
> Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> Acked-by: Jeremy Spweock <jspweock@iol.unh.edu>
> ---
>   dts/framework/testbed_model/node.py | 171 +++++++++++++++++++---------
>   1 file changed, 118 insertions(+), 53 deletions(-)
> 
> diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> index 23efa79c50..619743ebe7 100644
> --- a/dts/framework/testbed_model/node.py
> +++ b/dts/framework/testbed_model/node.py
> @@ -3,8 +3,13 @@
>   # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
>   # Copyright(c) 2022-2023 University of New Hampshire
>   
> -"""
> -A node is a generic host that DTS connects to and manages.
> +"""Common functionality for node management.
> +
> +There's a base class, Node, that's supposed to be extended by other classes

This comment and the following ones are all something of a nitpick.
This sounds too passive. Why not have something simpler like:
The virtual class Node is meant to be extended by other classes
with functionality specific to that node type.

> +with functionality specific to that node type.
> +The only part that can be used standalone is the Node.skip_setup static method,
> +which is a decorator used to skip method execution if skip_setup is passed
> +by the user on the cmdline or in an env variable.

I'd spell out env in full, as this is meant to go in the documentation.

>   """
>   
>   from abc import ABC
> @@ -35,10 +40,26 @@
>   
>   
>   class Node(ABC):
> -    """
> -    Basic class for node management. This class implements methods that
> -    manage a node, such as information gathering (of CPU/PCI/NIC) and
> -    environment setup.
> +    """The base class for node management.
> +
> +    It shouldn't be instantiated, but rather extended.
> +    It implements common methods to manage any node:
> +
> +       * connection to the node
> +       * information gathering of CPU
> +       * hugepages setup
> +
> +    Arguments:
> +        node_config: The config from the input configuration file.
> +
> +    Attributes:
> +        main_session: The primary OS-agnostic remote session used
> +            to communicate with the node.
> +        config: The configuration used to create the node.
> +        name: The name of the node.
> +        lcores: The list of logical cores that DTS can use on the node.
> +            It's derived from logical cores present on the node and user configuration.
> +        ports: The ports of this node specified in user configuration.
>       """
>   
>       main_session: OSSession
> @@ -77,9 +98,14 @@ def _init_ports(self) -> None:
>               self.configure_port_state(port)
>   
>       def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
> -        """
> -        Perform the execution setup that will be done for each execution
> -        this node is part of.
> +        """Execution setup steps.
> +
> +        Configure hugepages and call self._set_up_execution where
> +        the rest of the configuration steps (if any) are implemented.
> +
> +        Args:
> +            execution_config: The execution configuration according to which
> +                the setup steps will be taken.
>           """
>           self._setup_hugepages()
>           self._set_up_execution(execution_config)
> @@ -88,58 +114,78 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
>               self.virtual_devices.append(VirtualDevice(vdev))
>   
>       def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
> -        """
> -        This method exists to be optionally overwritten by derived classes and
> -        is not decorated so that the derived class doesn't have to use the decorator.
> +        """Optional additional execution setup steps for derived classes.
> +
> +        Derived classes should overwrite this
> +        if they want to add additional execution setup steps.

I'd probably use "need" or "require" instead of "want" (it's the dev
who wants, not the class).

>           """
>   
>       def tear_down_execution(self) -> None:
> -        """
> -        Perform the execution teardown that will be done after each execution
> -        this node is part of concludes.
> +        """Execution teardown steps.
> +
> +        There are currently no common execution teardown steps
> +        common to all DTS node types.
>           """
>           self.virtual_devices = []
>           self._tear_down_execution()
>   
>       def _tear_down_execution(self) -> None:
> -        """
> -        This method exists to be optionally overwritten by derived classes and
> -        is not decorated so that the derived class doesn't have to use the decorator.
> +        """Optional additional execution teardown steps for derived classes.
> +
> +        Derived classes should overwrite this
> +        if they want to add additional execution teardown steps.
>           """
>   
>       def set_up_build_target(
>           self, build_target_config: BuildTargetConfiguration
>       ) -> None:
> -        """
> -        Perform the build target setup that will be done for each build target
> -        tested on this node.
> +        """Build target setup steps.
> +
> +        There are currently no common build target setup steps
> +        common to all DTS node types.
> +
> +        Args:
> +            build_target_config: The build target configuration according to which
> +                the setup steps will be taken.
>           """
>           self._set_up_build_target(build_target_config)
>   
>       def _set_up_build_target(
>           self, build_target_config: BuildTargetConfiguration
>       ) -> None:
> -        """
> -        This method exists to be optionally overwritten by derived classes and
> -        is not decorated so that the derived class doesn't have to use the decorator.
> +        """Optional additional build target setup steps for derived classes.
> +
> +        Derived classes should optionally overwrite this

Is there a reason for the "optionally" here when it's absent in all the 
other functions?

> +        if they want to add additional build target setup steps.
>           """
>   
>       def tear_down_build_target(self) -> None:
> -        """
> -        Perform the build target teardown that will be done after each build target
> -        tested on this node.
> +        """Build target teardown steps.
> +
> +        There are currently no common build target teardown steps
> +        common to all DTS node types.
>           """
>           self._tear_down_build_target()
>   
>       def _tear_down_build_target(self) -> None:
> -        """
> -        This method exists to be optionally overwritten by derived classes and
> -        is not decorated so that the derived class doesn't have to use the decorator.
> +        """Optional additional build target teardown steps for derived classes.
> +
> +        Derived classes should overwrite this
> +        if they want to add additional build target teardown steps.
>           """
>   
>       def create_session(self, name: str) -> OSSession:
> -        """
> -        Create and return a new OSSession tailored to the remote OS.
> +        """Create and return a new OS-agnostic remote session.
> +
> +        The returned session won't be used by the node creating it.
> +        The session must be used by the caller.
> +        Will be cleaned up automatically.

I had a double take reading this before I understood that the subject 
was the previously mentioned session. I'd recommend to either add "it" 
or "the session". Also, it will be cleaned automatically... when? When I 
create a new session? when the node is deleted?

> +
> +        Args:
> +            name: The name of the session.
> +
> +        Returns:
> +            A new OS-agnostic remote session.
>           """
>           session_name = f"{self.name} {name}"
>           connection = create_session(
> @@ -186,14 +232,24 @@ def filter_lcores(
>           filter_specifier: LogicalCoreCount | LogicalCoreList,
>           ascending: bool = True,
>       ) -> list[LogicalCore]:
> -        """
> -        Filter the LogicalCores found on the Node according to
> -        a LogicalCoreCount or a LogicalCoreList.
> +        """Filter the node's logical cores that DTS can use.
>   
> -        If ascending is True, use cores with the lowest numerical id first
> -        and continue in ascending order. If False, start with the highest
> -        id and continue in descending order. This ordering affects which
> -        sockets to consider first as well.
> +        Logical cores that DTS can use are ones that are present on the node,
> +        but filtered according to user config.
> +        The filter_specifier will filter cores from those logical cores.
> +
> +        Args:
> +            filter_specifier: Two different filters can be used, one that specifies
> +                the number of logical cores per core, cores per socket and
> +                the number of sockets,
> +                the other that specifies a logical core list.
> +            ascending: If True, use cores with the lowest numerical id first
> +                and continue in ascending order. If False, start with the highest
> +                id and continue in descending order. This ordering affects which
> +                sockets to consider first as well.
> +
> +        Returns:
> +            A list of logical cores.
>           """
>           self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
>           return lcore_filter(
> @@ -203,17 +259,14 @@ def filter_lcores(
>           ).filter()
>   
>       def _get_remote_cpus(self) -> None:
> -        """
> -        Scan CPUs in the remote OS and store a list of LogicalCores.
> -        """
> +        """Scan CPUs in the remote OS and store a list of LogicalCores."""
>           self._logger.info("Getting CPU information.")
>           self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
>   
>       def _setup_hugepages(self):
> -        """
> -        Setup hugepages on the Node. Different architectures can supply different
> -        amounts of memory for hugepages and numa-based hugepage allocation may need
> -        to be considered.
> +        """Setup hugepages on the Node.
> +
> +        Configure the hugepages only if they're specified in user configuration.
>           """
>           if self.config.hugepages:
>               self.main_session.setup_hugepages(
> @@ -221,8 +274,11 @@ def _setup_hugepages(self):
>               )
>   
>       def configure_port_state(self, port: Port, enable: bool = True) -> None:
> -        """
> -        Enable/disable port.
> +        """Enable/disable port.
> +
> +        Args:
> +            port: The port to enable/disable.
> +            enable: True to enable, false to disable.
>           """
>           self.main_session.configure_port_state(port, enable)
>   
> @@ -232,15 +288,19 @@ def configure_port_ip_address(
>           port: Port,
>           delete: bool = False,
>       ) -> None:
> -        """
> -        Configure the IP address of a port on this node.
> +        """Add an IP address to a port on this node.
> +
> +        Args:
> +            address: The IP address with mask in CIDR format.
> +                Can be either IPv4 or IPv6.
> +            port: The port to which to add the address.
> +            delete: If True, will delete the address from the port
> +                instead of adding it.
>           """
>           self.main_session.configure_port_ip_address(address, port, delete)
>   
>       def close(self) -> None:
> -        """
> -        Close all connections and free other resources.
> -        """
> +        """Close all connections and free other resources."""
>           if self.main_session:
>               self.main_session.close()
>           for session in self._other_sessions:
> @@ -249,6 +309,11 @@ def close(self) -> None:
>   
>       @staticmethod
>       def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
> +        """A decorator that skips the decorated function.
> +
> +        When used, the decorator executes an empty lambda function
> +        instead of the decorated function.
> +        """
>           if SETTINGS.skip_setup:
>               return lambda *args: None
>           else:


^ permalink raw reply	[flat|nested] 393+ messages in thread

* Re: [RFC PATCH v4 4/4] dts: format docstrigs to google format
  2023-10-31 12:10         ` Yoan Picchi
@ 2023-11-02 10:17           ` Juraj Linkeš
  0 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-02 10:17 UTC (permalink / raw)
  To: Yoan Picchi
  Cc: thomas, Honnappa.Nagarahalli, lijuan.tu, bruce.richardson,
	jspewock, probb, dev

On Tue, Oct 31, 2023 at 1:10 PM Yoan Picchi <yoan.picchi@foss.arm.com> wrote:
>
> On 8/31/23 11:04, Juraj Linkeš wrote:
> > WIP: only one module is reformatted to serve as a demonstration.
> >
> > The google format is documented here [0].
> >
> > [0]: https://google.github.io/styleguide/pyguide.html
> >
> > Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
> > Acked-by: Jeremy Spweock <jspweock@iol.unh.edu>
> > ---
> >   dts/framework/testbed_model/node.py | 171 +++++++++++++++++++---------
> >   1 file changed, 118 insertions(+), 53 deletions(-)
> >
> > diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
> > index 23efa79c50..619743ebe7 100644
> > --- a/dts/framework/testbed_model/node.py
> > +++ b/dts/framework/testbed_model/node.py
> > @@ -3,8 +3,13 @@
> >   # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
> >   # Copyright(c) 2022-2023 University of New Hampshire
> >
> > -"""
> > -A node is a generic host that DTS connects to and manages.
> > +"""Common functionality for node management.
> > +
> > +There's a base class, Node, that's supposed to be extended by other classes
>
> This comment and all the following ones are all something of a nitpick.

I think they're pretty helpful.

> This sounds too passive. Why not having something simpler like:
> The virtual class Node is meant to be extended by other classes
> with functionality specific to that node type.
>

Okay, makes sense.

> > +with functionality specific to that node type.
> > +The only part that can be used standalone is the Node.skip_setup static method,
> > +which is a decorator used to skip method execution if skip_setup is passed
> > +by the user on the cmdline or in an env variable.
>
> I'd extend env to the full word as this is meant to go in the documentation.
>

I'll try to do this everywhere in docs.

> >   """
> >
> >   from abc import ABC
> > @@ -35,10 +40,26 @@
> >
> >
> >   class Node(ABC):
> > -    """
> > -    Basic class for node management. This class implements methods that
> > -    manage a node, such as information gathering (of CPU/PCI/NIC) and
> > -    environment setup.
> > +    """The base class for node management.
> > +
> > +    It shouldn't be instantiated, but rather extended.
> > +    It implements common methods to manage any node:
> > +
> > +       * connection to the node
> > +       * information gathering of CPU
> > +       * hugepages setup
> > +
> > +    Arguments:
> > +        node_config: The config from the input configuration file.
> > +
> > +    Attributes:
> > +        main_session: The primary OS-agnostic remote session used
> > +            to communicate with the node.
> > +        config: The configuration used to create the node.
> > +        name: The name of the node.
> > +        lcores: The list of logical cores that DTS can use on the node.
> > +            It's derived from logical cores present on the node and user configuration.
> > +        ports: The ports of this node specified in user configuration.
> >       """
> >
> >       main_session: OSSession
> > @@ -77,9 +98,14 @@ def _init_ports(self) -> None:
> >               self.configure_port_state(port)
> >
> >       def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
> > -        """
> > -        Perform the execution setup that will be done for each execution
> > -        this node is part of.
> > +        """Execution setup steps.
> > +
> > +        Configure hugepages and call self._set_up_execution where
> > +        the rest of the configuration steps (if any) are implemented.
> > +
> > +        Args:
> > +            execution_config: The execution configuration according to which
> > +                the setup steps will be taken.
> >           """
> >           self._setup_hugepages()
> >           self._set_up_execution(execution_config)
> > @@ -88,58 +114,78 @@ def set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
> >               self.virtual_devices.append(VirtualDevice(vdev))
> >
> >       def _set_up_execution(self, execution_config: ExecutionConfiguration) -> None:
> > -        """
> > -        This method exists to be optionally overwritten by derived classes and
> > -        is not decorated so that the derived class doesn't have to use the decorator.
> > +        """Optional additional execution setup steps for derived classes.
> > +
> > +        Derived classes should overwrite this
> > +        if they want to add additional execution setup steps.
>
> I'd probably use need or require instead of want (it's the dev that
> wants, not the class)
>

Good point, that's a better wording.

> >           """
> >
> >       def tear_down_execution(self) -> None:
> > -        """
> > -        Perform the execution teardown that will be done after each execution
> > -        this node is part of concludes.
> > +        """Execution teardown steps.
> > +
> > +        There are currently no common execution teardown steps
> > +        common to all DTS node types.
> >           """
> >           self.virtual_devices = []
> >           self._tear_down_execution()
> >
> >       def _tear_down_execution(self) -> None:
> > -        """
> > -        This method exists to be optionally overwritten by derived classes and
> > -        is not decorated so that the derived class doesn't have to use the decorator.
> > +        """Optional additional execution teardown steps for derived classes.
> > +
> > +        Derived classes should overwrite this
> > +        if they want to add additional execution teardown steps.
> >           """
> >
> >       def set_up_build_target(
> >           self, build_target_config: BuildTargetConfiguration
> >       ) -> None:
> > -        """
> > -        Perform the build target setup that will be done for each build target
> > -        tested on this node.
> > +        """Build target setup steps.
> > +
> > +        There are currently no common build target setup steps
> > +        common to all DTS node types.
> > +
> > +        Args:
> > +            build_target_config: The build target configuration according to which
> > +                the setup steps will be taken.
> >           """
> >           self._set_up_build_target(build_target_config)
> >
> >       def _set_up_build_target(
> >           self, build_target_config: BuildTargetConfiguration
> >       ) -> None:
> > -        """
> > -        This method exists to be optionally overwritten by derived classes and
> > -        is not decorated so that the derived class doesn't have to use the decorator.
> > +        """Optional additional build target setup steps for derived classes.
> > +
> > +        Derived classes should optionally overwrite this
>
> Is there a reason for the "optionally" here when it's absent in all the
> other functions?
>

I've actually caught this as well when addressing the previous
comment. I unified the wording (without "optionally", it's redundant).
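
For context, the pattern these docstrings describe is a public step method delegating to a private hook that derived classes overwrite. A minimal sketch (using a plain dict in place of the real BuildTargetConfiguration, and invented class names for the derived type):

```python
from abc import ABC


class Node(ABC):
    """Sketch of the hook pattern discussed above; method names mirror the patch."""

    def set_up_build_target(self, build_target_config: dict) -> None:
        """Build target setup steps.

        Args:
            build_target_config: The build target configuration according to which
                the setup steps will be taken.
        """
        # Common setup steps (currently none) would go here, then the hook runs.
        self._set_up_build_target(build_target_config)

    def _set_up_build_target(self, build_target_config: dict) -> None:
        """Additional build target setup steps for derived classes.

        Derived classes should overwrite this if they need to add
        additional build target setup steps.
        """


class SutNode(Node):
    def _set_up_build_target(self, build_target_config: dict) -> None:
        # A derived class adds its own steps without touching the public API.
        self.last_config = build_target_config
```

Callers always invoke the public `set_up_build_target`; the derived class never needs to reimplement (or re-decorate) the common steps.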

> > +        if they want to add additional build target setup steps.
> >           """
> >
> >       def tear_down_build_target(self) -> None:
> > -        """
> > -        Perform the build target teardown that will be done after each build target
> > -        tested on this node.
> > +        """Build target teardown steps.
> > +
> > +        There are currently no common build target teardown steps
> > +        common to all DTS node types.
> >           """
> >           self._tear_down_build_target()
> >
> >       def _tear_down_build_target(self) -> None:
> > -        """
> > -        This method exists to be optionally overwritten by derived classes and
> > -        is not decorated so that the derived class doesn't have to use the decorator.
> > +        """Optional additional build target teardown steps for derived classes.
> > +
> > +        Derived classes should overwrite this
> > +        if they want to add additional build target teardown steps.
> >           """
> >
> >       def create_session(self, name: str) -> OSSession:
> > -        """
> > -        Create and return a new OSSession tailored to the remote OS.
> > +        """Create and return a new OS-agnostic remote session.
> > +
> > +        The returned session won't be used by the node creating it.
> > +        The session must be used by the caller.
> > +        Will be cleaned up automatically.
>
> I had a double take reading this before I understood that the subject
> was the previously mentioned session. I'd recommend to either add "it"
> or "the session". Also, it will be cleaned automatically... when? When I
> create a new session? when the node is deleted?
>

The session will be cleaned up as part of the node's cleanup. What
about this new wording:

The returned session won't be used by the node creating it. The session
must be used by the caller. The session will be maintained for the
entire lifecycle of the node object, at the end of which the session
will be cleaned up automatically.

I've added a note:

Note:
    Any number of these supplementary sessions may be created.
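
Put together, the reworded docstring might look something like this (a sketch only; `OSSession` is a placeholder class here and the final wording may differ):

```python
class OSSession:  # placeholder standing in for the real DTS session class
    pass


class Node:
    name = "sut1"

    def create_session(self, name: str) -> OSSession:
        """Create and return a new OS-agnostic remote session.

        The returned session won't be used by the node creating it. The session
        must be used by the caller. The session will be maintained for the
        entire lifecycle of the node object, at the end of which the session
        will be cleaned up automatically.

        Note:
            Any number of these supplementary sessions may be created.

        Args:
            name: The name of the session.

        Returns:
            A new OS-agnostic remote session.
        """
        return OSSession()
```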


> > +
> > +        Args:
> > +            name: The name of the session.
> > +
> > +        Returns:
> > +            A new OS-agnostic remote session.
> >           """
> >           session_name = f"{self.name} {name}"
> >           connection = create_session(
> > @@ -186,14 +232,24 @@ def filter_lcores(
> >           filter_specifier: LogicalCoreCount | LogicalCoreList,
> >           ascending: bool = True,
> >       ) -> list[LogicalCore]:
> > -        """
> > -        Filter the LogicalCores found on the Node according to
> > -        a LogicalCoreCount or a LogicalCoreList.
> > +        """Filter the node's logical cores that DTS can use.
> >
> > -        If ascending is True, use cores with the lowest numerical id first
> > -        and continue in ascending order. If False, start with the highest
> > -        id and continue in descending order. This ordering affects which
> > -        sockets to consider first as well.
> > +        Logical cores that DTS can use are ones that are present on the node,
> > +        but filtered according to user config.
> > +        The filter_specifier will filter cores from those logical cores.
> > +
> > +        Args:
> > +            filter_specifier: Two different filters can be used, one that specifies
> > +                the number of logical cores per core, cores per socket and
> > +                the number of sockets,
> > +                the other that specifies a logical core list.
> > +            ascending: If True, use cores with the lowest numerical id first
> > +                and continue in ascending order. If False, start with the highest
> > +                id and continue in descending order. This ordering affects which
> > +                sockets to consider first as well.
> > +
> > +        Returns:
> > +            A list of logical cores.
> >           """
> >           self._logger.debug(f"Filtering {filter_specifier} from {self.lcores}.")
> >           return lcore_filter(
> > @@ -203,17 +259,14 @@ def filter_lcores(
> >           ).filter()
> >
> >       def _get_remote_cpus(self) -> None:
> > -        """
> > -        Scan CPUs in the remote OS and store a list of LogicalCores.
> > -        """
> > +        """Scan CPUs in the remote OS and store a list of LogicalCores."""
> >           self._logger.info("Getting CPU information.")
> >           self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
> >
> >       def _setup_hugepages(self):
> > -        """
> > -        Setup hugepages on the Node. Different architectures can supply different
> > -        amounts of memory for hugepages and numa-based hugepage allocation may need
> > -        to be considered.
> > +        """Setup hugepages on the Node.
> > +
> > +        Configure the hugepages only if they're specified in user configuration.
> >           """
> >           if self.config.hugepages:
> >               self.main_session.setup_hugepages(
> > @@ -221,8 +274,11 @@ def _setup_hugepages(self):
> >               )
> >
> >       def configure_port_state(self, port: Port, enable: bool = True) -> None:
> > -        """
> > -        Enable/disable port.
> > +        """Enable/disable port.
> > +
> > +        Args:
> > +            port: The port to enable/disable.
> > +            enable: True to enable, false to disable.
> >           """
> >           self.main_session.configure_port_state(port, enable)
> >
> > @@ -232,15 +288,19 @@ def configure_port_ip_address(
> >           port: Port,
> >           delete: bool = False,
> >       ) -> None:
> > -        """
> > -        Configure the IP address of a port on this node.
> > +        """Add an IP address to a port on this node.
> > +
> > +        Args:
> > +            address: The IP address with mask in CIDR format.
> > +                Can be either IPv4 or IPv6.
> > +            port: The port to which to add the address.
> > +            delete: If True, will delete the address from the port
> > +                instead of adding it.
> >           """
> >           self.main_session.configure_port_ip_address(address, port, delete)
> >
> >       def close(self) -> None:
> > -        """
> > -        Close all connections and free other resources.
> > -        """
> > +        """Close all connections and free other resources."""
> >           if self.main_session:
> >               self.main_session.close()
> >           for session in self._other_sessions:
> > @@ -249,6 +309,11 @@ def close(self) -> None:
> >
> >       @staticmethod
> >       def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
> > +        """A decorator that skips the decorated function.
> > +
> > +        When used, the decorator executes an empty lambda function
> > +        instead of the decorated function.
> > +        """
> >           if SETTINGS.skip_setup:
> >               return lambda *args: None
> >           else:
>


* [PATCH v5 00/23] dts: add dts api docs
  2023-08-31 10:04     ` [RFC PATCH v4 " Juraj Linkeš
                         ` (3 preceding siblings ...)
  2023-08-31 10:04       ` [RFC PATCH v4 4/4] dts: format docstrigs to google format Juraj Linkeš
@ 2023-11-06 17:15       ` Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 01/23] dts: code adjustments for doc generation Juraj Linkeš
                           ` (23 more replies)
  4 siblings, 24 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

The commits can be split into groups.

The first commit makes changes to the code. These code changes mainly
change the structure of the code so that the actual API docs generation
works. There are also some code changes which get reflected in the
documentation, such as making functions/methods/attributes private or
public.

The second set of commits (2-21) deal with the actual docstring
documentation (from which the API docs are generated). The format of
docstrings is the Google format [0] with PEP257 [1], plus some additional
guidelines, captured in the last commit of this group, covering what the
Google format doesn't.
The docstring updates are split into many commits to make review
possible. When accepted, these may be squashed (commits 4-21).
The docstrings have been composed in anticipation of [2], adhering to a
maximum line length of 100. We don't have a tool for automatic docstring
formatting, hence the use of 100 right away to save time.

NOTE: The logger.py module is not fully documented, as it's being
refactored and the refactor will be submitted in the near future.
Documenting it now seems unnecessary.

The last two commits comprise the final group, enabling the actual
generation of documentation.
The generation is done with Sphinx, which DPDK already uses, with
slightly modified configuration (the sidebar: unlimited depth and better
collapsing - comments on this are welcome).

The first two groups are the most important to merge, as future
development can't proceed without them. The third group may be
finished/accepted at a later date, as it's fairly independent.

The build requires the same Python version and dependencies as DTS,
because Sphinx imports the Python modules. The modules are imported
individually, requiring the code refactoring mentioned above.
Dependencies are installed
using Poetry from the dts directory:

poetry install --with docs

After installing, enter the Poetry shell:

poetry shell

And then run the build:
ninja -C <meson_build_dir> dts-doc

[0] https://google.github.io/styleguide/pyguide.html#s3.8.4-comments-in-classes
[1] https://peps.python.org/pep-0257/
[2] https://patches.dpdk.org/project/dpdk/list/?series=29844

Juraj Linkeš (23):
  dts: code adjustments for doc generation
  dts: add docstring checker
  dts: add basic developer docs
  dts: exceptions docstring update
  dts: settings docstring update
  dts: logger and settings docstring update
  dts: dts runner and main docstring update
  dts: test suite docstring update
  dts: test result docstring update
  dts: config docstring update
  dts: remote session docstring update
  dts: interactive remote session docstring update
  dts: port and virtual device docstring update
  dts: cpu docstring update
  dts: os session docstring update
  dts: posix and linux sessions docstring update
  dts: node docstring update
  dts: sut and tg nodes docstring update
  dts: base traffic generators docstring update
  dts: scapy tg docstring update
  dts: test suites docstring update
  dts: add doc generation dependencies
  dts: add doc generation

 buildtools/call-sphinx-build.py               |  29 +-
 doc/api/meson.build                           |   1 +
 doc/guides/conf.py                            |  34 +-
 doc/guides/meson.build                        |   1 +
 doc/guides/tools/dts.rst                      | 103 ++++
 dts/doc/conf_yaml_schema.json                 |   1 +
 dts/doc/index.rst                             |  17 +
 dts/doc/meson.build                           |  49 ++
 dts/framework/__init__.py                     |  12 +-
 dts/framework/config/__init__.py              | 379 ++++++++++---
 dts/framework/config/types.py                 | 132 +++++
 dts/framework/dts.py                          | 161 +++++-
 dts/framework/exception.py                    | 156 +++---
 dts/framework/logger.py                       |  72 ++-
 dts/framework/remote_session/__init__.py      |  80 ++-
 .../interactive_remote_session.py             |  36 +-
 .../remote_session/interactive_shell.py       | 152 ++++++
 dts/framework/remote_session/os_session.py    | 284 ----------
 dts/framework/remote_session/python_shell.py  |  32 ++
 .../remote_session/remote/__init__.py         |  27 -
 .../remote/interactive_shell.py               | 133 -----
 .../remote_session/remote/python_shell.py     |  12 -
 .../remote_session/remote/remote_session.py   | 172 ------
 .../remote_session/remote/testpmd_shell.py    |  49 --
 .../remote_session/remote_session.py          | 232 ++++++++
 .../{remote => }/ssh_session.py               |  28 +-
 dts/framework/remote_session/testpmd_shell.py |  86 +++
 dts/framework/settings.py                     | 188 +++++--
 dts/framework/test_result.py                  | 296 +++++++---
 dts/framework/test_suite.py                   | 230 ++++++--
 dts/framework/testbed_model/__init__.py       |  28 +-
 dts/framework/testbed_model/{hw => }/cpu.py   | 209 +++++--
 dts/framework/testbed_model/hw/__init__.py    |  27 -
 dts/framework/testbed_model/hw/port.py        |  60 ---
 .../testbed_model/hw/virtual_device.py        |  16 -
 .../linux_session.py                          |  69 ++-
 dts/framework/testbed_model/node.py           | 217 +++++---
 dts/framework/testbed_model/os_session.py     | 425 +++++++++++++++
 dts/framework/testbed_model/port.py           |  93 ++++
 .../posix_session.py                          |  85 ++-
 dts/framework/testbed_model/sut_node.py       | 227 +++++---
 dts/framework/testbed_model/tg_node.py        |  70 +--
 .../testbed_model/traffic_generator.py        |  72 ---
 .../traffic_generator/__init__.py             |  44 ++
 .../capturing_traffic_generator.py            |  52 +-
 .../{ => traffic_generator}/scapy.py          | 114 ++--
 .../traffic_generator/traffic_generator.py    |  87 +++
 dts/framework/testbed_model/virtual_device.py |  29 +
 dts/framework/utils.py                        | 136 +++--
 dts/main.py                                   |  17 +-
 dts/meson.build                               |  16 +
 dts/poetry.lock                               | 509 +++++++++++++++++-
 dts/pyproject.toml                            |  13 +-
 dts/tests/TestSuite_hello_world.py            |  16 +-
 dts/tests/TestSuite_os_udp.py                 |  16 +-
 dts/tests/TestSuite_smoke_tests.py            |  53 +-
 meson.build                                   |   1 +
 57 files changed, 4181 insertions(+), 1704 deletions(-)
 create mode 120000 dts/doc/conf_yaml_schema.json
 create mode 100644 dts/doc/index.rst
 create mode 100644 dts/doc/meson.build
 create mode 100644 dts/framework/config/types.py
 rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (76%)
 create mode 100644 dts/framework/remote_session/interactive_shell.py
 delete mode 100644 dts/framework/remote_session/os_session.py
 create mode 100644 dts/framework/remote_session/python_shell.py
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 delete mode 100644 dts/framework/remote_session/remote/interactive_shell.py
 delete mode 100644 dts/framework/remote_session/remote/python_shell.py
 delete mode 100644 dts/framework/remote_session/remote/remote_session.py
 delete mode 100644 dts/framework/remote_session/remote/testpmd_shell.py
 create mode 100644 dts/framework/remote_session/remote_session.py
 rename dts/framework/remote_session/{remote => }/ssh_session.py (83%)
 create mode 100644 dts/framework/remote_session/testpmd_shell.py
 rename dts/framework/testbed_model/{hw => }/cpu.py (50%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 delete mode 100644 dts/framework/testbed_model/hw/port.py
 delete mode 100644 dts/framework/testbed_model/hw/virtual_device.py
 rename dts/framework/{remote_session => testbed_model}/linux_session.py (79%)
 create mode 100644 dts/framework/testbed_model/os_session.py
 create mode 100644 dts/framework/testbed_model/port.py
 rename dts/framework/{remote_session => testbed_model}/posix_session.py (74%)
 delete mode 100644 dts/framework/testbed_model/traffic_generator.py
 create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
 rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (66%)
 rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (71%)
 create mode 100644 dts/framework/testbed_model/traffic_generator/traffic_generator.py
 create mode 100644 dts/framework/testbed_model/virtual_device.py
 create mode 100644 dts/meson.build

-- 
2.34.1



* [PATCH v5 01/23] dts: code adjustments for doc generation
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
@ 2023-11-06 17:15         ` Juraj Linkeš
  2023-11-08 13:35           ` Yoan Picchi
  2023-11-06 17:15         ` [PATCH v5 02/23] dts: add docstring checker Juraj Linkeš
                           ` (22 subsequent siblings)
  23 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

The standard Python tool for generating API documentation, Sphinx,
imports modules one-by-one when generating the documentation. This
requires code changes:
* properly guarding argument parsing in the if __name__ == '__main__'
  block,
* the logger used by the DTS runner underwent the same treatment so that
  it doesn't create log files outside of a DTS run,
* however, DTS uses the arguments to construct an object holding global
  variables. The defaults for the global variables needed to be moved
  out of argument parsing,
* importing the remote_session module from framework resulted in
  circular imports because of one module trying to import another
  module. This is fixed by reorganizing the code,
* some code reorganization was done because the resulting structure
  makes more sense, improving documentation clarity.

There are some other changes which are documentation related:
* added missing type annotations so they appear in the generated docs,
* reordered arguments in some methods,
* removed superfluous arguments and attributes,
* changed some functions/methods/attributes from public to private and
  vice-versa.

All of the above appear in the generated documentation and, with these
changes, the documentation is improved.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/config/__init__.py              | 10 ++-
 dts/framework/dts.py                          | 33 +++++--
 dts/framework/exception.py                    | 54 +++++-------
 dts/framework/remote_session/__init__.py      | 41 ++++-----
 .../interactive_remote_session.py             |  0
 .../{remote => }/interactive_shell.py         |  0
 .../{remote => }/python_shell.py              |  0
 .../remote_session/remote/__init__.py         | 27 ------
 .../{remote => }/remote_session.py            |  0
 .../{remote => }/ssh_session.py               | 12 +--
 .../{remote => }/testpmd_shell.py             |  0
 dts/framework/settings.py                     | 87 +++++++++++--------
 dts/framework/test_result.py                  |  4 +-
 dts/framework/test_suite.py                   |  7 +-
 dts/framework/testbed_model/__init__.py       | 12 +--
 dts/framework/testbed_model/{hw => }/cpu.py   | 13 +++
 dts/framework/testbed_model/hw/__init__.py    | 27 ------
 .../linux_session.py                          |  6 +-
 dts/framework/testbed_model/node.py           | 26 ++++--
 .../os_session.py                             | 22 ++---
 dts/framework/testbed_model/{hw => }/port.py  |  0
 .../posix_session.py                          |  4 +-
 dts/framework/testbed_model/sut_node.py       |  8 +-
 dts/framework/testbed_model/tg_node.py        | 30 +------
 .../traffic_generator/__init__.py             | 24 +++++
 .../capturing_traffic_generator.py            |  6 +-
 .../{ => traffic_generator}/scapy.py          | 23 ++---
 .../traffic_generator.py                      | 16 +++-
 .../testbed_model/{hw => }/virtual_device.py  |  0
 dts/framework/utils.py                        | 46 +++-------
 dts/main.py                                   |  9 +-
 31 files changed, 259 insertions(+), 288 deletions(-)
 rename dts/framework/remote_session/{remote => }/interactive_remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/interactive_shell.py (100%)
 rename dts/framework/remote_session/{remote => }/python_shell.py (100%)
 delete mode 100644 dts/framework/remote_session/remote/__init__.py
 rename dts/framework/remote_session/{remote => }/remote_session.py (100%)
 rename dts/framework/remote_session/{remote => }/ssh_session.py (91%)
 rename dts/framework/remote_session/{remote => }/testpmd_shell.py (100%)
 rename dts/framework/testbed_model/{hw => }/cpu.py (95%)
 delete mode 100644 dts/framework/testbed_model/hw/__init__.py
 rename dts/framework/{remote_session => testbed_model}/linux_session.py (97%)
 rename dts/framework/{remote_session => testbed_model}/os_session.py (95%)
 rename dts/framework/testbed_model/{hw => }/port.py (100%)
 rename dts/framework/{remote_session => testbed_model}/posix_session.py (98%)
 create mode 100644 dts/framework/testbed_model/traffic_generator/__init__.py
 rename dts/framework/testbed_model/{ => traffic_generator}/capturing_traffic_generator.py (96%)
 rename dts/framework/testbed_model/{ => traffic_generator}/scapy.py (95%)
 rename dts/framework/testbed_model/{ => traffic_generator}/traffic_generator.py (80%)
 rename dts/framework/testbed_model/{hw => }/virtual_device.py (100%)

diff --git a/dts/framework/config/__init__.py b/dts/framework/config/__init__.py
index cb7e00ba34..2044c82611 100644
--- a/dts/framework/config/__init__.py
+++ b/dts/framework/config/__init__.py
@@ -17,6 +17,7 @@
 import warlock  # type: ignore[import]
 import yaml
 
+from framework.exception import ConfigurationError
 from framework.settings import SETTINGS
 from framework.utils import StrEnum
 
@@ -89,7 +90,7 @@ class TrafficGeneratorConfig:
     traffic_generator_type: TrafficGeneratorType
 
     @staticmethod
-    def from_dict(d: dict):
+    def from_dict(d: dict) -> "ScapyTrafficGeneratorConfig":
         # This looks useless now, but is designed to allow expansion to traffic
         # generators that require more configuration later.
         match TrafficGeneratorType(d["type"]):
@@ -97,6 +98,10 @@ def from_dict(d: dict):
                 return ScapyTrafficGeneratorConfig(
                     traffic_generator_type=TrafficGeneratorType.SCAPY
                 )
+            case _:
+                raise ConfigurationError(
+                    f'Unknown traffic generator type "{d["type"]}".'
+                )
 
 
 @dataclass(slots=True, frozen=True)
@@ -324,6 +329,3 @@ def load_config() -> Configuration:
     config: dict[str, Any] = warlock.model_factory(schema, name="_Config")(config_data)
     config_obj: Configuration = Configuration.from_dict(dict(config))
     return config_obj
-
-
-CONFIGURATION = load_config()
diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index f773f0c38d..4c7fb0c40a 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -6,19 +6,19 @@
 import sys
 
 from .config import (
-    CONFIGURATION,
     BuildTargetConfiguration,
     ExecutionConfiguration,
     TestSuiteConfig,
+    load_config,
 )
 from .exception import BlockingTestSuiteError
 from .logger import DTSLOG, getLogger
 from .test_result import BuildTargetResult, DTSResult, ExecutionResult, Result
 from .test_suite import get_test_suites
 from .testbed_model import SutNode, TGNode
-from .utils import check_dts_python_version
 
-dts_logger: DTSLOG = getLogger("DTSRunner")
+# dummy defaults to satisfy linters
+dts_logger: DTSLOG = None  # type: ignore[assignment]
 result: DTSResult = DTSResult(dts_logger)
 
 
@@ -30,14 +30,18 @@ def run_all() -> None:
     global dts_logger
     global result
 
+    # create a regular DTS logger and create a new result with it
+    dts_logger = getLogger("DTSRunner")
+    result = DTSResult(dts_logger)
+
     # check the python version of the server that run dts
-    check_dts_python_version()
+    _check_dts_python_version()
 
     sut_nodes: dict[str, SutNode] = {}
     tg_nodes: dict[str, TGNode] = {}
     try:
         # for all Execution sections
-        for execution in CONFIGURATION.executions:
+        for execution in load_config().executions:
             sut_node = sut_nodes.get(execution.system_under_test_node.name)
             tg_node = tg_nodes.get(execution.traffic_generator_node.name)
 
@@ -82,6 +86,25 @@ def run_all() -> None:
     _exit_dts()
 
 
+def _check_dts_python_version() -> None:
+    def RED(text: str) -> str:
+        return f"\u001B[31;1m{str(text)}\u001B[0m"
+
+    if sys.version_info.major < 3 or (
+        sys.version_info.major == 3 and sys.version_info.minor < 10
+    ):
+        print(
+            RED(
+                (
+                    "WARNING: DTS execution node's python version is lower than"
+                    "python 3.10, is deprecated and will not work in future releases."
+                )
+            ),
+            file=sys.stderr,
+        )
+        print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+
+
 def _run_execution(
     sut_node: SutNode,
     tg_node: TGNode,
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 001a5a5496..7489c03570 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -42,19 +42,14 @@ class SSHTimeoutError(DTSError):
     Command execution timeout.
     """
 
-    command: str
-    output: str
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+    _command: str
 
-    def __init__(self, command: str, output: str):
-        self.command = command
-        self.output = output
+    def __init__(self, command: str):
+        self._command = command
 
     def __str__(self) -> str:
-        return f"TIMEOUT on {self.command}"
-
-    def get_output(self) -> str:
-        return self.output
+        return f"TIMEOUT on {self._command}"
 
 
 class SSHConnectionError(DTSError):
@@ -62,18 +57,18 @@ class SSHConnectionError(DTSError):
     SSH connection error.
     """
 
-    host: str
-    errors: list[str]
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+    _host: str
+    _errors: list[str]
 
     def __init__(self, host: str, errors: list[str] | None = None):
-        self.host = host
-        self.errors = [] if errors is None else errors
+        self._host = host
+        self._errors = [] if errors is None else errors
 
     def __str__(self) -> str:
-        message = f"Error trying to connect with {self.host}."
-        if self.errors:
-            message += f" Errors encountered while retrying: {', '.join(self.errors)}"
+        message = f"Error trying to connect with {self._host}."
+        if self._errors:
+            message += f" Errors encountered while retrying: {', '.join(self._errors)}"
 
         return message
 
@@ -84,14 +79,14 @@ class SSHSessionDeadError(DTSError):
     It can no longer be used.
     """
 
-    host: str
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
+    _host: str
 
     def __init__(self, host: str):
-        self.host = host
+        self._host = host
 
     def __str__(self) -> str:
-        return f"SSH session with {self.host} has died"
+        return f"SSH session with {self._host} has died"
 
 
 class ConfigurationError(DTSError):
@@ -107,18 +102,18 @@ class RemoteCommandExecutionError(DTSError):
     Raised when a command executed on a Node returns a non-zero exit status.
     """
 
-    command: str
-    command_return_code: int
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+    command: str
+    _command_return_code: int
 
     def __init__(self, command: str, command_return_code: int):
         self.command = command
-        self.command_return_code = command_return_code
+        self._command_return_code = command_return_code
 
     def __str__(self) -> str:
         return (
             f"Command {self.command} returned a non-zero exit code: "
-            f"{self.command_return_code}"
+            f"{self._command_return_code}"
         )
 
 
@@ -143,22 +138,15 @@ class TestCaseVerifyError(DTSError):
     Used in test cases to verify the expected behavior.
     """
 
-    value: str
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
 
-    def __init__(self, value: str):
-        self.value = value
-
-    def __str__(self) -> str:
-        return repr(self.value)
-
 
 class BlockingTestSuiteError(DTSError):
-    suite_name: str
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
+    _suite_name: str
 
     def __init__(self, suite_name: str) -> None:
-        self.suite_name = suite_name
+        self._suite_name = suite_name
 
     def __str__(self) -> str:
-        return f"Blocking suite {self.suite_name} failed."
+        return f"Blocking suite {self._suite_name} failed."
diff --git a/dts/framework/remote_session/__init__.py b/dts/framework/remote_session/__init__.py
index 00b6d1f03a..5e7ddb2b05 100644
--- a/dts/framework/remote_session/__init__.py
+++ b/dts/framework/remote_session/__init__.py
@@ -12,29 +12,24 @@
 
 # pylama:ignore=W0611
 
-from framework.config import OS, NodeConfiguration
-from framework.exception import ConfigurationError
+from framework.config import NodeConfiguration
 from framework.logger import DTSLOG
 
-from .linux_session import LinuxSession
-from .os_session import InteractiveShellType, OSSession
-from .remote import (
-    CommandResult,
-    InteractiveRemoteSession,
-    InteractiveShell,
-    PythonShell,
-    RemoteSession,
-    SSHSession,
-    TestPmdDevice,
-    TestPmdShell,
-)
-
-
-def create_session(
+from .interactive_remote_session import InteractiveRemoteSession
+from .interactive_shell import InteractiveShell
+from .python_shell import PythonShell
+from .remote_session import CommandResult, RemoteSession
+from .ssh_session import SSHSession
+from .testpmd_shell import TestPmdShell
+
+
+def create_remote_session(
     node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> OSSession:
-    match node_config.os:
-        case OS.linux:
-            return LinuxSession(node_config, name, logger)
-        case _:
-            raise ConfigurationError(f"Unsupported OS {node_config.os}")
+) -> RemoteSession:
+    return SSHSession(node_config, name, logger)
+
+
+def create_interactive_session(
+    node_config: NodeConfiguration, logger: DTSLOG
+) -> InteractiveRemoteSession:
+    return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/interactive_remote_session.py b/dts/framework/remote_session/interactive_remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_remote_session.py
rename to dts/framework/remote_session/interactive_remote_session.py
diff --git a/dts/framework/remote_session/remote/interactive_shell.py b/dts/framework/remote_session/interactive_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/interactive_shell.py
rename to dts/framework/remote_session/interactive_shell.py
diff --git a/dts/framework/remote_session/remote/python_shell.py b/dts/framework/remote_session/python_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/python_shell.py
rename to dts/framework/remote_session/python_shell.py
diff --git a/dts/framework/remote_session/remote/__init__.py b/dts/framework/remote_session/remote/__init__.py
deleted file mode 100644
index 06403691a5..0000000000
--- a/dts/framework/remote_session/remote/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-# Copyright(c) 2023 University of New Hampshire
-
-# pylama:ignore=W0611
-
-from framework.config import NodeConfiguration
-from framework.logger import DTSLOG
-
-from .interactive_remote_session import InteractiveRemoteSession
-from .interactive_shell import InteractiveShell
-from .python_shell import PythonShell
-from .remote_session import CommandResult, RemoteSession
-from .ssh_session import SSHSession
-from .testpmd_shell import TestPmdDevice, TestPmdShell
-
-
-def create_remote_session(
-    node_config: NodeConfiguration, name: str, logger: DTSLOG
-) -> RemoteSession:
-    return SSHSession(node_config, name, logger)
-
-
-def create_interactive_session(
-    node_config: NodeConfiguration, logger: DTSLOG
-) -> InteractiveRemoteSession:
-    return InteractiveRemoteSession(node_config, logger)
diff --git a/dts/framework/remote_session/remote/remote_session.py b/dts/framework/remote_session/remote_session.py
similarity index 100%
rename from dts/framework/remote_session/remote/remote_session.py
rename to dts/framework/remote_session/remote_session.py
diff --git a/dts/framework/remote_session/remote/ssh_session.py b/dts/framework/remote_session/ssh_session.py
similarity index 91%
rename from dts/framework/remote_session/remote/ssh_session.py
rename to dts/framework/remote_session/ssh_session.py
index 8d127f1601..cee11d14d6 100644
--- a/dts/framework/remote_session/remote/ssh_session.py
+++ b/dts/framework/remote_session/ssh_session.py
@@ -18,9 +18,7 @@
     SSHException,
 )
 
-from framework.config import NodeConfiguration
 from framework.exception import SSHConnectionError, SSHSessionDeadError, SSHTimeoutError
-from framework.logger import DTSLOG
 
 from .remote_session import CommandResult, RemoteSession
 
@@ -45,14 +43,6 @@ class SSHSession(RemoteSession):
 
     session: Connection
 
-    def __init__(
-        self,
-        node_config: NodeConfiguration,
-        session_name: str,
-        logger: DTSLOG,
-    ):
-        super(SSHSession, self).__init__(node_config, session_name, logger)
-
     def _connect(self) -> None:
         errors = []
         retry_attempts = 10
@@ -117,7 +107,7 @@ def _send_command(
 
         except CommandTimedOut as e:
             self._logger.exception(e)
-            raise SSHTimeoutError(command, e.result.stderr) from e
+            raise SSHTimeoutError(command) from e
 
         return CommandResult(
             self.name, command, output.stdout, output.stderr, output.return_code
diff --git a/dts/framework/remote_session/remote/testpmd_shell.py b/dts/framework/remote_session/testpmd_shell.py
similarity index 100%
rename from dts/framework/remote_session/remote/testpmd_shell.py
rename to dts/framework/remote_session/testpmd_shell.py
diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index cfa39d011b..7f5841d073 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -6,7 +6,7 @@
 import argparse
 import os
 from collections.abc import Callable, Iterable, Sequence
-from dataclasses import dataclass
+from dataclasses import dataclass, field
 from pathlib import Path
 from typing import Any, TypeVar
 
@@ -22,8 +22,8 @@ def __init__(
             option_strings: Sequence[str],
             dest: str,
             nargs: str | int | None = None,
-            const: str | None = None,
-            default: str = None,
+            const: bool | None = None,
+            default: Any = None,
             type: Callable[[str], _T | argparse.FileType | None] = None,
             choices: Iterable[_T] | None = None,
             required: bool = False,
@@ -32,6 +32,12 @@ def __init__(
         ) -> None:
             env_var_value = os.environ.get(env_var)
             default = env_var_value or default
+            if const is not None:
+                nargs = 0
+                default = const if env_var_value else default
+                type = None
+                choices = None
+                metavar = None
             super(_EnvironmentArgument, self).__init__(
                 option_strings,
                 dest,
@@ -52,22 +58,28 @@ def __call__(
             values: Any,
             option_string: str = None,
         ) -> None:
-            setattr(namespace, self.dest, values)
+            if self.const is not None:
+                setattr(namespace, self.dest, self.const)
+            else:
+                setattr(namespace, self.dest, values)
 
     return _EnvironmentArgument
 
 
-@dataclass(slots=True, frozen=True)
-class _Settings:
-    config_file_path: str
-    output_dir: str
-    timeout: float
-    verbose: bool
-    skip_setup: bool
-    dpdk_tarball_path: Path
-    compile_timeout: float
-    test_cases: list
-    re_run: int
+@dataclass(slots=True)
+class Settings:
+    config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+    output_dir: str = "output"
+    timeout: float = 15
+    verbose: bool = False
+    skip_setup: bool = False
+    dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+    compile_timeout: float = 1200
+    test_cases: list[str] = field(default_factory=list)
+    re_run: int = 0
+
+
+SETTINGS: Settings = Settings()
 
 
 def _get_parser() -> argparse.ArgumentParser:
@@ -81,7 +93,8 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--config-file",
         action=_env_arg("DTS_CFG_FILE"),
-        default="conf.yaml",
+        default=SETTINGS.config_file_path,
+        type=Path,
         help="[DTS_CFG_FILE] configuration file that describes the test cases, SUTs "
         "and targets.",
     )
@@ -90,7 +103,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--output-dir",
         "--output",
         action=_env_arg("DTS_OUTPUT_DIR"),
-        default="output",
+        default=SETTINGS.output_dir,
         help="[DTS_OUTPUT_DIR] Output directory where dts logs and results are saved.",
     )
 
@@ -98,7 +111,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "-t",
         "--timeout",
         action=_env_arg("DTS_TIMEOUT"),
-        default=15,
+        default=SETTINGS.timeout,
         type=float,
         help="[DTS_TIMEOUT] The default timeout for all DTS operations except for "
         "compiling DPDK.",
@@ -108,8 +121,9 @@ def _get_parser() -> argparse.ArgumentParser:
         "-v",
         "--verbose",
         action=_env_arg("DTS_VERBOSE"),
-        default="N",
-        help="[DTS_VERBOSE] Set to 'Y' to enable verbose output, logging all messages "
+        default=SETTINGS.verbose,
+        const=True,
+        help="[DTS_VERBOSE] Specify to enable verbose output, logging all messages "
         "to the console.",
     )
 
@@ -117,8 +131,8 @@ def _get_parser() -> argparse.ArgumentParser:
         "-s",
         "--skip-setup",
         action=_env_arg("DTS_SKIP_SETUP"),
-        default="N",
-        help="[DTS_SKIP_SETUP] Set to 'Y' to skip all setup steps on SUT and TG nodes.",
+        const=True,
+        help="[DTS_SKIP_SETUP] Specify to skip all setup steps on SUT and TG nodes.",
     )
 
     parser.add_argument(
@@ -126,7 +140,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--snapshot",
         "--git-ref",
         action=_env_arg("DTS_DPDK_TARBALL"),
-        default="dpdk.tar.xz",
+        default=SETTINGS.dpdk_tarball_path,
         type=Path,
         help="[DTS_DPDK_TARBALL] Path to DPDK source code tarball or a git commit ID, "
         "tag ID or tree ID to test. To test local changes, first commit them, "
@@ -136,7 +150,7 @@ def _get_parser() -> argparse.ArgumentParser:
     parser.add_argument(
         "--compile-timeout",
         action=_env_arg("DTS_COMPILE_TIMEOUT"),
-        default=1200,
+        default=SETTINGS.compile_timeout,
         type=float,
         help="[DTS_COMPILE_TIMEOUT] The timeout for compiling DPDK.",
     )
@@ -153,7 +167,7 @@ def _get_parser() -> argparse.ArgumentParser:
         "--re-run",
         "--re_run",
         action=_env_arg("DTS_RERUN"),
-        default=0,
+        default=SETTINGS.re_run,
         type=int,
         help="[DTS_RERUN] Re-run each test case the specified amount of times "
         "if a test failure occurs",
@@ -162,23 +176,22 @@ def _get_parser() -> argparse.ArgumentParser:
     return parser
 
 
-def _get_settings() -> _Settings:
+def get_settings() -> Settings:
     parsed_args = _get_parser().parse_args()
-    return _Settings(
+    return Settings(
         config_file_path=parsed_args.config_file,
         output_dir=parsed_args.output_dir,
         timeout=parsed_args.timeout,
-        verbose=(parsed_args.verbose == "Y"),
-        skip_setup=(parsed_args.skip_setup == "Y"),
+        verbose=parsed_args.verbose,
+        skip_setup=parsed_args.skip_setup,
         dpdk_tarball_path=Path(
-            DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir)
-        )
-        if not os.path.exists(parsed_args.tarball)
-        else Path(parsed_args.tarball),
+            Path(DPDKGitTarball(parsed_args.tarball, parsed_args.output_dir))
+            if not os.path.exists(parsed_args.tarball)
+            else Path(parsed_args.tarball)
+        ),
         compile_timeout=parsed_args.compile_timeout,
-        test_cases=parsed_args.test_cases.split(",") if parsed_args.test_cases else [],
+        test_cases=(
+            parsed_args.test_cases.split(",") if parsed_args.test_cases else []
+        ),
         re_run=parsed_args.re_run,
     )
-
-
-SETTINGS: _Settings = _get_settings()
diff --git a/dts/framework/test_result.py b/dts/framework/test_result.py
index f0fbe80f6f..603e18872c 100644
--- a/dts/framework/test_result.py
+++ b/dts/framework/test_result.py
@@ -254,7 +254,7 @@ def add_build_target(
         self._inner_results.append(build_target_result)
         return build_target_result
 
-    def add_sut_info(self, sut_info: NodeInfo):
+    def add_sut_info(self, sut_info: NodeInfo) -> None:
         self.sut_os_name = sut_info.os_name
         self.sut_os_version = sut_info.os_version
         self.sut_kernel_version = sut_info.kernel_version
@@ -297,7 +297,7 @@ def add_execution(self, sut_node: NodeConfiguration) -> ExecutionResult:
         self._inner_results.append(execution_result)
         return execution_result
 
-    def add_error(self, error) -> None:
+    def add_error(self, error: Exception) -> None:
         self._errors.append(error)
 
     def process(self) -> None:
diff --git a/dts/framework/test_suite.py b/dts/framework/test_suite.py
index 3b890c0451..d53553bf34 100644
--- a/dts/framework/test_suite.py
+++ b/dts/framework/test_suite.py
@@ -11,7 +11,7 @@
 import re
 from ipaddress import IPv4Interface, IPv6Interface, ip_interface
 from types import MethodType
-from typing import Union
+from typing import Any, Union
 
 from scapy.layers.inet import IP  # type: ignore[import]
 from scapy.layers.l2 import Ether  # type: ignore[import]
@@ -26,8 +26,7 @@
 from .logger import DTSLOG, getLogger
 from .settings import SETTINGS
 from .test_result import BuildTargetResult, Result, TestCaseResult, TestSuiteResult
-from .testbed_model import SutNode, TGNode
-from .testbed_model.hw.port import Port, PortLink
+from .testbed_model import Port, PortLink, SutNode, TGNode
 from .utils import get_packet_summaries
 
 
@@ -453,7 +452,7 @@ def _execute_test_case(
 
 
 def get_test_suites(testsuite_module_path: str) -> list[type[TestSuite]]:
-    def is_test_suite(object) -> bool:
+    def is_test_suite(object: Any) -> bool:
         try:
             if issubclass(object, TestSuite) and object is not TestSuite:
                 return True
diff --git a/dts/framework/testbed_model/__init__.py b/dts/framework/testbed_model/__init__.py
index 5cbb859e47..8ced05653b 100644
--- a/dts/framework/testbed_model/__init__.py
+++ b/dts/framework/testbed_model/__init__.py
@@ -9,15 +9,9 @@
 
 # pylama:ignore=W0611
 
-from .hw import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-    VirtualDevice,
-    lcore_filter,
-)
+from .cpu import LogicalCoreCount, LogicalCoreCountFilter, LogicalCoreList
 from .node import Node
+from .port import Port, PortLink
 from .sut_node import SutNode
 from .tg_node import TGNode
+from .virtual_device import VirtualDevice
diff --git a/dts/framework/testbed_model/hw/cpu.py b/dts/framework/testbed_model/cpu.py
similarity index 95%
rename from dts/framework/testbed_model/hw/cpu.py
rename to dts/framework/testbed_model/cpu.py
index d1918a12dc..8fe785dfe4 100644
--- a/dts/framework/testbed_model/hw/cpu.py
+++ b/dts/framework/testbed_model/cpu.py
@@ -272,3 +272,16 @@ def filter(self) -> list[LogicalCore]:
             )
 
         return filtered_lcores
+
+
+def lcore_filter(
+    core_list: list[LogicalCore],
+    filter_specifier: LogicalCoreCount | LogicalCoreList,
+    ascending: bool,
+) -> LogicalCoreFilter:
+    if isinstance(filter_specifier, LogicalCoreList):
+        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
+    elif isinstance(filter_specifier, LogicalCoreCount):
+        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
+    else:
+        raise ValueError(f"Unsupported filter {filter_specifier}")
diff --git a/dts/framework/testbed_model/hw/__init__.py b/dts/framework/testbed_model/hw/__init__.py
deleted file mode 100644
index 88ccac0b0e..0000000000
--- a/dts/framework/testbed_model/hw/__init__.py
+++ /dev/null
@@ -1,27 +0,0 @@
-# SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2023 PANTHEON.tech s.r.o.
-
-# pylama:ignore=W0611
-
-from .cpu import (
-    LogicalCore,
-    LogicalCoreCount,
-    LogicalCoreCountFilter,
-    LogicalCoreFilter,
-    LogicalCoreList,
-    LogicalCoreListFilter,
-)
-from .virtual_device import VirtualDevice
-
-
-def lcore_filter(
-    core_list: list[LogicalCore],
-    filter_specifier: LogicalCoreCount | LogicalCoreList,
-    ascending: bool,
-) -> LogicalCoreFilter:
-    if isinstance(filter_specifier, LogicalCoreList):
-        return LogicalCoreListFilter(core_list, filter_specifier, ascending)
-    elif isinstance(filter_specifier, LogicalCoreCount):
-        return LogicalCoreCountFilter(core_list, filter_specifier, ascending)
-    else:
-        raise ValueError(f"Unsupported filter r{filter_specifier}")
diff --git a/dts/framework/remote_session/linux_session.py b/dts/framework/testbed_model/linux_session.py
similarity index 97%
rename from dts/framework/remote_session/linux_session.py
rename to dts/framework/testbed_model/linux_session.py
index a3f1a6bf3b..f472bb8f0f 100644
--- a/dts/framework/remote_session/linux_session.py
+++ b/dts/framework/testbed_model/linux_session.py
@@ -9,10 +9,10 @@
 from typing_extensions import NotRequired
 
 from framework.exception import RemoteCommandExecutionError
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
 from framework.utils import expand_range
 
+from .cpu import LogicalCore
+from .port import Port
 from .posix_session import PosixSession
 
 
@@ -64,7 +64,7 @@ def get_remote_cpus(self, use_first_core: bool) -> list[LogicalCore]:
             lcores.append(LogicalCore(lcore, core, socket, node))
         return lcores
 
-    def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+    def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
         return dpdk_prefix
 
     def setup_hugepages(self, hugepage_amount: int, force_first_numa: bool) -> None:
diff --git a/dts/framework/testbed_model/node.py b/dts/framework/testbed_model/node.py
index fc01e0bf8e..7571e7b98d 100644
--- a/dts/framework/testbed_model/node.py
+++ b/dts/framework/testbed_model/node.py
@@ -12,23 +12,26 @@
 from typing import Any, Callable, Type, Union
 
 from framework.config import (
+    OS,
     BuildTargetConfiguration,
     ExecutionConfiguration,
     NodeConfiguration,
 )
+from framework.exception import ConfigurationError
 from framework.logger import DTSLOG, getLogger
-from framework.remote_session import InteractiveShellType, OSSession, create_session
 from framework.settings import SETTINGS
 
-from .hw import (
+from .cpu import (
     LogicalCore,
     LogicalCoreCount,
     LogicalCoreList,
     LogicalCoreListFilter,
-    VirtualDevice,
     lcore_filter,
 )
-from .hw.port import Port
+from .linux_session import LinuxSession
+from .os_session import InteractiveShellType, OSSession
+from .port import Port
+from .virtual_device import VirtualDevice
 
 
 class Node(ABC):
@@ -69,6 +72,7 @@ def __init__(self, node_config: NodeConfiguration):
     def _init_ports(self) -> None:
         self.ports = [Port(self.name, port_config) for port_config in self.config.ports]
         self.main_session.update_ports(self.ports)
+
         for port in self.ports:
             self.configure_port_state(port)
 
@@ -172,9 +176,9 @@ def create_interactive_shell(
 
         return self.main_session.create_interactive_shell(
             shell_cls,
-            app_args,
             timeout,
             privileged,
+            app_args,
         )
 
     def filter_lcores(
@@ -205,7 +209,7 @@ def _get_remote_cpus(self) -> None:
         self._logger.info("Getting CPU information.")
         self.lcores = self.main_session.get_remote_cpus(self.config.use_first_core)
 
-    def _setup_hugepages(self):
+    def _setup_hugepages(self) -> None:
         """
         Setup hugepages on the Node. Different architectures can supply different
         amounts of memory for hugepages and numa-based hugepage allocation may need
@@ -249,3 +253,13 @@ def skip_setup(func: Callable[..., Any]) -> Callable[..., Any]:
             return lambda *args: None
         else:
             return func
+
+
+def create_session(
+    node_config: NodeConfiguration, name: str, logger: DTSLOG
+) -> OSSession:
+    match node_config.os:
+        case OS.linux:
+            return LinuxSession(node_config, name, logger)
+        case _:
+            raise ConfigurationError(f"Unsupported OS {node_config.os}")
diff --git a/dts/framework/remote_session/os_session.py b/dts/framework/testbed_model/os_session.py
similarity index 95%
rename from dts/framework/remote_session/os_session.py
rename to dts/framework/testbed_model/os_session.py
index 8a709eac1c..76e595a518 100644
--- a/dts/framework/remote_session/os_session.py
+++ b/dts/framework/testbed_model/os_session.py
@@ -10,19 +10,19 @@
 
 from framework.config import Architecture, NodeConfiguration, NodeInfo
 from framework.logger import DTSLOG
-from framework.remote_session.remote import InteractiveShell
-from framework.settings import SETTINGS
-from framework.testbed_model import LogicalCore
-from framework.testbed_model.hw.port import Port
-from framework.utils import MesonArgs
-
-from .remote import (
+from framework.remote_session import (
     CommandResult,
     InteractiveRemoteSession,
+    InteractiveShell,
     RemoteSession,
     create_interactive_session,
     create_remote_session,
 )
+from framework.settings import SETTINGS
+from framework.utils import MesonArgs
+
+from .cpu import LogicalCore
+from .port import Port
 
 InteractiveShellType = TypeVar("InteractiveShellType", bound=InteractiveShell)
 
@@ -85,9 +85,9 @@ def send_command(
     def create_interactive_shell(
         self,
         shell_cls: Type[InteractiveShellType],
-        eal_parameters: str,
         timeout: float,
         privileged: bool,
+        app_args: str,
     ) -> InteractiveShellType:
         """
         See "create_interactive_shell" in SutNode
@@ -96,7 +96,7 @@ def create_interactive_shell(
             self.interactive_session.session,
             self._logger,
             self._get_privileged_command if privileged else None,
-            eal_parameters,
+            app_args,
             timeout,
         )
 
@@ -113,7 +113,7 @@ def _get_privileged_command(command: str) -> str:
         """
 
     @abstractmethod
-    def guess_dpdk_remote_dir(self, remote_dir) -> PurePath:
+    def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePath:
         """
         Try to find DPDK remote dir in remote_dir.
         """
@@ -227,7 +227,7 @@ def kill_cleanup_dpdk_apps(self, dpdk_prefix_list: Iterable[str]) -> None:
         """
 
     @abstractmethod
-    def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+    def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
         """
         Get the DPDK file prefix that will be used when running DPDK apps.
         """
diff --git a/dts/framework/testbed_model/hw/port.py b/dts/framework/testbed_model/port.py
similarity index 100%
rename from dts/framework/testbed_model/hw/port.py
rename to dts/framework/testbed_model/port.py
diff --git a/dts/framework/remote_session/posix_session.py b/dts/framework/testbed_model/posix_session.py
similarity index 98%
rename from dts/framework/remote_session/posix_session.py
rename to dts/framework/testbed_model/posix_session.py
index 5da0516e05..1d1d5b1b26 100644
--- a/dts/framework/remote_session/posix_session.py
+++ b/dts/framework/testbed_model/posix_session.py
@@ -32,7 +32,7 @@ def combine_short_options(**opts: bool) -> str:
 
         return ret_opts
 
-    def guess_dpdk_remote_dir(self, remote_dir) -> PurePosixPath:
+    def guess_dpdk_remote_dir(self, remote_dir: str | PurePath) -> PurePosixPath:
         remote_guess = self.join_remote_path(remote_dir, "dpdk-*")
         result = self.send_command(f"ls -d {remote_guess} | tail -1")
         return PurePosixPath(result.stdout)
@@ -219,7 +219,7 @@ def _remove_dpdk_runtime_dirs(
         for dpdk_runtime_dir in dpdk_runtime_dirs:
             self.remove_remote_dir(dpdk_runtime_dir)
 
-    def get_dpdk_file_prefix(self, dpdk_prefix) -> str:
+    def get_dpdk_file_prefix(self, dpdk_prefix: str) -> str:
         return ""
 
     def get_compiler_version(self, compiler_name: str) -> str:
diff --git a/dts/framework/testbed_model/sut_node.py b/dts/framework/testbed_model/sut_node.py
index 202aebfd06..4e33cf02ea 100644
--- a/dts/framework/testbed_model/sut_node.py
+++ b/dts/framework/testbed_model/sut_node.py
@@ -15,12 +15,14 @@
     NodeInfo,
     SutNodeConfiguration,
 )
-from framework.remote_session import CommandResult, InteractiveShellType, OSSession
+from framework.remote_session import CommandResult
 from framework.settings import SETTINGS
 from framework.utils import MesonArgs
 
-from .hw import LogicalCoreCount, LogicalCoreList, VirtualDevice
+from .cpu import LogicalCoreCount, LogicalCoreList
 from .node import Node
+from .os_session import InteractiveShellType, OSSession
+from .virtual_device import VirtualDevice
 
 
 class EalParameters(object):
@@ -289,7 +291,7 @@ def create_eal_parameters(
         prefix: str = "dpdk",
         append_prefix_timestamp: bool = True,
         no_pci: bool = False,
-        vdevs: list[VirtualDevice] = None,
+        vdevs: list[VirtualDevice] | None = None,
         other_eal_param: str = "",
     ) -> "EalParameters":
         """
diff --git a/dts/framework/testbed_model/tg_node.py b/dts/framework/testbed_model/tg_node.py
index 27025cfa31..166eb8430e 100644
--- a/dts/framework/testbed_model/tg_node.py
+++ b/dts/framework/testbed_model/tg_node.py
@@ -16,16 +16,11 @@
 
 from scapy.packet import Packet  # type: ignore[import]
 
-from framework.config import (
-    ScapyTrafficGeneratorConfig,
-    TGNodeConfiguration,
-    TrafficGeneratorType,
-)
-from framework.exception import ConfigurationError
-
-from .capturing_traffic_generator import CapturingTrafficGenerator
-from .hw.port import Port
+from framework.config import TGNodeConfiguration
+
 from .node import Node
+from .port import Port
+from .traffic_generator import CapturingTrafficGenerator, create_traffic_generator
 
 
 class TGNode(Node):
@@ -80,20 +75,3 @@ def close(self) -> None:
         """Free all resources used by the node"""
         self.traffic_generator.close()
         super(TGNode, self).close()
-
-
-def create_traffic_generator(
-    tg_node: TGNode, traffic_generator_config: ScapyTrafficGeneratorConfig
-) -> CapturingTrafficGenerator:
-    """A factory function for creating traffic generator object from user config."""
-
-    from .scapy import ScapyTrafficGenerator
-
-    match traffic_generator_config.traffic_generator_type:
-        case TrafficGeneratorType.SCAPY:
-            return ScapyTrafficGenerator(tg_node, traffic_generator_config)
-        case _:
-            raise ConfigurationError(
-                "Unknown traffic generator: "
-                f"{traffic_generator_config.traffic_generator_type}"
-            )
diff --git a/dts/framework/testbed_model/traffic_generator/__init__.py b/dts/framework/testbed_model/traffic_generator/__init__.py
new file mode 100644
index 0000000000..11bfa1ee0f
--- /dev/null
+++ b/dts/framework/testbed_model/traffic_generator/__init__.py
@@ -0,0 +1,24 @@
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2023 PANTHEON.tech s.r.o.
+
+from framework.config import ScapyTrafficGeneratorConfig, TrafficGeneratorType
+from framework.exception import ConfigurationError
+from framework.testbed_model.node import Node
+
+from .capturing_traffic_generator import CapturingTrafficGenerator
+from .scapy import ScapyTrafficGenerator
+
+
+def create_traffic_generator(
+    tg_node: Node, traffic_generator_config: ScapyTrafficGeneratorConfig
+) -> CapturingTrafficGenerator:
+    """A factory function for creating traffic generator object from user config."""
+
+    match traffic_generator_config.traffic_generator_type:
+        case TrafficGeneratorType.SCAPY:
+            return ScapyTrafficGenerator(tg_node, traffic_generator_config)
+        case _:
+            raise ConfigurationError(
+                "Unknown traffic generator: "
+                f"{traffic_generator_config.traffic_generator_type}"
+            )
diff --git a/dts/framework/testbed_model/capturing_traffic_generator.py b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
similarity index 96%
rename from dts/framework/testbed_model/capturing_traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
index ab98987f8e..e521211ef0 100644
--- a/dts/framework/testbed_model/capturing_traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/capturing_traffic_generator.py
@@ -16,9 +16,9 @@
 from scapy.packet import Packet  # type: ignore[import]
 
 from framework.settings import SETTINGS
+from framework.testbed_model.port import Port
 from framework.utils import get_packet_summaries
 
-from .hw.port import Port
 from .traffic_generator import TrafficGenerator
 
 
@@ -130,7 +130,9 @@ def _send_packets_and_capture(
         for the specified duration. It must be able to handle no received packets.
         """
 
-    def _write_capture_from_packets(self, capture_name: str, packets: list[Packet]):
+    def _write_capture_from_packets(
+        self, capture_name: str, packets: list[Packet]
+    ) -> None:
         file_name = f"{SETTINGS.output_dir}/{capture_name}.pcap"
         self._logger.debug(f"Writing packets to {file_name}.")
         scapy.utils.wrpcap(file_name, packets)
diff --git a/dts/framework/testbed_model/scapy.py b/dts/framework/testbed_model/traffic_generator/scapy.py
similarity index 95%
rename from dts/framework/testbed_model/scapy.py
rename to dts/framework/testbed_model/traffic_generator/scapy.py
index af0d4dbb25..51864b6e6b 100644
--- a/dts/framework/testbed_model/scapy.py
+++ b/dts/framework/testbed_model/traffic_generator/scapy.py
@@ -24,16 +24,15 @@
 from scapy.packet import Packet  # type: ignore[import]
 
 from framework.config import OS, ScapyTrafficGeneratorConfig
-from framework.logger import DTSLOG, getLogger
 from framework.remote_session import PythonShell
 from framework.settings import SETTINGS
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
 
 from .capturing_traffic_generator import (
     CapturingTrafficGenerator,
     _get_default_capture_name,
 )
-from .hw.port import Port
-from .tg_node import TGNode
 
 """
 ========= BEGIN RPC FUNCTIONS =========
@@ -146,7 +145,7 @@ def quit(self) -> None:
         self._BaseServer__shutdown_request = True
         return None
 
-    def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary):
+    def add_rpc_function(self, name: str, function_bytes: xmlrpc.client.Binary) -> None:
         """Add a function to the server.
 
         This is meant to be executed remotely.
@@ -191,15 +190,9 @@ class ScapyTrafficGenerator(CapturingTrafficGenerator):
     session: PythonShell
     rpc_server_proxy: xmlrpc.client.ServerProxy
     _config: ScapyTrafficGeneratorConfig
-    _tg_node: TGNode
-    _logger: DTSLOG
-
-    def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
-        self._config = config
-        self._tg_node = tg_node
-        self._logger = getLogger(
-            f"{self._tg_node.name} {self._config.traffic_generator_type}"
-        )
+
+    def __init__(self, tg_node: Node, config: ScapyTrafficGeneratorConfig):
+        super().__init__(tg_node, config)
 
         assert (
             self._tg_node.config.os == OS.linux
@@ -235,7 +228,7 @@ def __init__(self, tg_node: TGNode, config: ScapyTrafficGeneratorConfig):
             function_bytes = marshal.dumps(function.__code__)
             self.rpc_server_proxy.add_rpc_function(function.__name__, function_bytes)
 
-    def _start_xmlrpc_server_in_remote_python(self, listen_port: int):
+    def _start_xmlrpc_server_in_remote_python(self, listen_port: int) -> None:
         # load the source of the function
         src = inspect.getsource(QuittableXMLRPCServer)
         # Lines with only whitespace break the repl if in the middle of a function
@@ -280,7 +273,7 @@ def _send_packets_and_capture(
         scapy_packets = [Ether(packet.data) for packet in xmlrpc_packets]
         return scapy_packets
 
-    def close(self):
+    def close(self) -> None:
         try:
             self.rpc_server_proxy.quit()
         except ConnectionRefusedError:
diff --git a/dts/framework/testbed_model/traffic_generator.py b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
similarity index 80%
rename from dts/framework/testbed_model/traffic_generator.py
rename to dts/framework/testbed_model/traffic_generator/traffic_generator.py
index 28c35d3ce4..ea7c3963da 100644
--- a/dts/framework/testbed_model/traffic_generator.py
+++ b/dts/framework/testbed_model/traffic_generator/traffic_generator.py
@@ -12,11 +12,12 @@
 
 from scapy.packet import Packet  # type: ignore[import]
 
-from framework.logger import DTSLOG
+from framework.config import TrafficGeneratorConfig
+from framework.logger import DTSLOG, getLogger
+from framework.testbed_model.node import Node
+from framework.testbed_model.port import Port
 from framework.utils import get_packet_summaries
 
-from .hw.port import Port
-
 
 class TrafficGenerator(ABC):
     """The base traffic generator.
@@ -24,8 +25,17 @@ class TrafficGenerator(ABC):
     Defines the few basic methods that each traffic generator must implement.
     """
 
+    _config: TrafficGeneratorConfig
+    _tg_node: Node
     _logger: DTSLOG
 
+    def __init__(self, tg_node: Node, config: TrafficGeneratorConfig):
+        self._config = config
+        self._tg_node = tg_node
+        self._logger = getLogger(
+            f"{self._tg_node.name} {self._config.traffic_generator_type}"
+        )
+
     def send_packet(self, packet: Packet, port: Port) -> None:
         """Send a packet and block until it is fully sent.
 
diff --git a/dts/framework/testbed_model/hw/virtual_device.py b/dts/framework/testbed_model/virtual_device.py
similarity index 100%
rename from dts/framework/testbed_model/hw/virtual_device.py
rename to dts/framework/testbed_model/virtual_device.py
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index d27c2c5b5f..f0c916471c 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -7,7 +7,6 @@
 import json
 import os
 import subprocess
-import sys
 from enum import Enum
 from pathlib import Path
 from subprocess import SubprocessError
@@ -16,35 +15,7 @@
 
 from .exception import ConfigurationError
 
-
-class StrEnum(Enum):
-    @staticmethod
-    def _generate_next_value_(
-        name: str, start: int, count: int, last_values: object
-    ) -> str:
-        return name
-
-    def __str__(self) -> str:
-        return self.name
-
-
-REGEX_FOR_PCI_ADDRESS = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
-
-
-def check_dts_python_version() -> None:
-    if sys.version_info.major < 3 or (
-        sys.version_info.major == 3 and sys.version_info.minor < 10
-    ):
-        print(
-            RED(
-                (
-                    "WARNING: DTS execution node's python version is lower than"
-                    "python 3.10, is deprecated and will not work in future releases."
-                )
-            ),
-            file=sys.stderr,
-        )
-        print(RED("Please use Python >= 3.10 instead"), file=sys.stderr)
+REGEX_FOR_PCI_ADDRESS: str = "/[0-9a-fA-F]{4}:[0-9a-fA-F]{2}:[0-9a-fA-F]{2}.[0-9]{1}/"
 
 
 def expand_range(range_str: str) -> list[int]:
@@ -67,7 +38,7 @@ def expand_range(range_str: str) -> list[int]:
     return expanded_range
 
 
-def get_packet_summaries(packets: list[Packet]):
+def get_packet_summaries(packets: list[Packet]) -> str:
     if len(packets) == 1:
         packet_summaries = packets[0].summary()
     else:
@@ -77,8 +48,15 @@ def get_packet_summaries(packets: list[Packet]):
     return f"Packet contents: \n{packet_summaries}"
 
 
-def RED(text: str) -> str:
-    return f"\u001B[31;1m{str(text)}\u001B[0m"
+class StrEnum(Enum):
+    @staticmethod
+    def _generate_next_value_(
+        name: str, start: int, count: int, last_values: object
+    ) -> str:
+        return name
+
+    def __str__(self) -> str:
+        return self.name
 
 
 class MesonArgs(object):
@@ -225,5 +203,5 @@ def _delete_tarball(self) -> None:
         if self._tarball_path and os.path.exists(self._tarball_path):
             os.remove(self._tarball_path)
 
-    def __fspath__(self):
+    def __fspath__(self) -> str:
         return str(self._tarball_path)
diff --git a/dts/main.py b/dts/main.py
index 43311fa847..5d4714b0c3 100755
--- a/dts/main.py
+++ b/dts/main.py
@@ -10,10 +10,17 @@
 
 import logging
 
-from framework import dts
+from framework import settings
 
 
 def main() -> None:
+    """Set DTS settings, then run DTS.
+
+    The DTS settings are taken from the command line arguments and the environment variables.
+    """
+    settings.SETTINGS = settings.get_settings()
+    from framework import dts
+
     dts.run_all()
 
 
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [PATCH v5 02/23] dts: add docstring checker
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 01/23] dts: code adjustments for doc generation Juraj Linkeš
@ 2023-11-06 17:15         ` Juraj Linkeš
  2023-11-07 17:38           ` Yoan Picchi
  2023-11-06 17:15         ` [PATCH v5 03/23] dts: add basic developer docs Juraj Linkeš
                           ` (21 subsequent siblings)
  23 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

Python docstrings are the in-code way to document the code. The
docstring checker of choice is pydocstyle which we're executing from
Pylama, but the current latest versions are not complatible due to [0],
so pin the pydocstyle version to the latest working version.

[0] https://github.com/klen/pylama/issues/232

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/poetry.lock    | 12 ++++++------
 dts/pyproject.toml |  6 +++++-
 2 files changed, 11 insertions(+), 7 deletions(-)

diff --git a/dts/poetry.lock b/dts/poetry.lock
index f7b3b6d602..a734fa71f0 100644
--- a/dts/poetry.lock
+++ b/dts/poetry.lock
@@ -489,20 +489,20 @@ files = [
 
 [[package]]
 name = "pydocstyle"
-version = "6.3.0"
+version = "6.1.1"
 description = "Python docstring style checker"
 optional = false
 python-versions = ">=3.6"
 files = [
-    {file = "pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019"},
-    {file = "pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1"},
+    {file = "pydocstyle-6.1.1-py3-none-any.whl", hash = "sha256:6987826d6775056839940041beef5c08cc7e3d71d63149b48e36727f70144dc4"},
+    {file = "pydocstyle-6.1.1.tar.gz", hash = "sha256:1d41b7c459ba0ee6c345f2eb9ae827cab14a7533a88c5c6f7e94923f72df92dc"},
 ]
 
 [package.dependencies]
-snowballstemmer = ">=2.2.0"
+snowballstemmer = "*"
 
 [package.extras]
-toml = ["tomli (>=1.2.3)"]
+toml = ["toml"]
 
 [[package]]
 name = "pyflakes"
@@ -837,4 +837,4 @@ jsonschema = ">=4,<5"
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.10"
-content-hash = "0b1e4a1cb8323e17e5ee5951c97e74bde6e60d0413d7b25b1803d5b2bab39639"
+content-hash = "3501e97b3dadc19fe8ae179fe21b1edd2488001da9a8e86ff2bca0b86b99b89b"
diff --git a/dts/pyproject.toml b/dts/pyproject.toml
index 6762edfa6b..3943c87c87 100644
--- a/dts/pyproject.toml
+++ b/dts/pyproject.toml
@@ -25,6 +25,7 @@ PyYAML = "^6.0"
 types-PyYAML = "^6.0.8"
 fabric = "^2.7.1"
 scapy = "^2.5.0"
+pydocstyle = "6.1.1"
 
 [tool.poetry.group.dev.dependencies]
 mypy = "^0.961"
@@ -39,10 +40,13 @@ requires = ["poetry-core>=1.0.0"]
 build-backend = "poetry.core.masonry.api"
 
 [tool.pylama]
-linters = "mccabe,pycodestyle,pyflakes"
+linters = "mccabe,pycodestyle,pydocstyle,pyflakes"
 format = "pylint"
 max_line_length = 88 # https://black.readthedocs.io/en/stable/the_black_code_style/current_style.html#line-length
 
+[tool.pylama.linter.pydocstyle]
+convention = "google"
+
 [tool.mypy]
 python_version = "3.10"
 enable_error_code = ["ignore-without-code"]
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [PATCH v5 03/23] dts: add basic developer docs
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 01/23] dts: code adjustments for doc generation Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 02/23] dts: add docstring checker Juraj Linkeš
@ 2023-11-06 17:15         ` Juraj Linkeš
  2023-11-07 14:39           ` Yoan Picchi
  2023-11-06 17:15         ` [PATCH v5 04/23] dts: exceptions docstring update Juraj Linkeš
                           ` (20 subsequent siblings)
  23 siblings, 1 reply; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

Expand the framework contribution guidelines and add how to document the
code with Python docstrings.
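As a rough illustration of the Google docstring style these guidelines describe, a documented class might look like the following (a simplified, self-contained sketch; the ``Counter`` class and its names are hypothetical and not part of DTS):

```python
class Counter:
    """A simple counter.

    Attributes:
        count: The current value of the counter.
    """

    #: The default starting value, documented as a class variable with ``#:``.
    DEFAULT_START: int = 0

    def __init__(self, start: int = DEFAULT_START):
        """Initialize the counter, documented separately from the class docstring.

        Args:
            start: The initial value of `count`.
        """
        self.count = start

    def increment(self, by: int = 1) -> int:
        """Increase the counter.

        Args:
            by: How much to add to `count`.

        Returns:
            The new value of `count`.
        """
        self.count += by
        return self.count
```

Sections such as ``Args:`` and ``Returns:`` are what Sphinx's Google-style parser picks up when generating the API docs.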

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 doc/guides/tools/dts.rst | 73 ++++++++++++++++++++++++++++++++++++++++
 1 file changed, 73 insertions(+)

diff --git a/doc/guides/tools/dts.rst b/doc/guides/tools/dts.rst
index 32c18ee472..b1e99107c3 100644
--- a/doc/guides/tools/dts.rst
+++ b/doc/guides/tools/dts.rst
@@ -264,6 +264,65 @@ which can be changed with the ``--output-dir`` command line argument.
 The results contain basic statistics of passed/failed test cases and DPDK version.
 
 
+Contributing to DTS
+-------------------
+
+There are two areas of contribution: The DTS framework and DTS test suites.
+
+The framework contains the logic needed to run test cases, such as connecting to nodes,
+running DPDK apps and collecting results.
+
+The test cases call APIs from the framework to test their scenarios. Adding test cases may
+require adding code to the framework as well.
+
+
+Framework Coding Guidelines
+~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+When adding code to the DTS framework, pay attention to the rest of the code
+and try not to divert much from it. The :ref:`DTS developer tools <dts_dev_tools>` will issue
+warnings when some of the basics are not met.
+
+The code must be properly documented with docstrings. The style must conform to
+the `Google style <https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings>`_.
+See an example of the style
+`here <https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html>`_.
+For cases which are not covered by the Google style, refer
+to `PEP 257 <https://peps.python.org/pep-0257/>`_. There are some cases which are not covered by
+the two style guides, where we deviate or where some additional clarification is helpful:
+
+   * The __init__() methods of classes are documented separately from the docstring of the class
+     itself.
+   * The docstrings of implemented abstract methods should refer to the superclass's definition
+     if there's no deviation.
+   * Instance variables/attributes should be documented in the docstring of the class
+     in the ``Attributes:`` section.
+   * The dataclass.dataclass decorator changes how the attributes are processed. The dataclass
+     attributes which result in instance variables/attributes should also be recorded
+     in the ``Attributes:`` section.
+   * Class variables/attributes, on the other hand, should be documented with ``#:`` above
+     the type annotated line. The description may be omitted if the meaning is obvious.
+   * The Enum and TypedDict also process the attributes in particular ways and should be documented
+     with ``#:`` as well. This is mainly so that the autogenerated docs contain the assigned value.
+   * When referencing a parameter of a function or a method in their docstring, don't use
+     any articles and put the parameter into single backticks. This mimics the style of
+     `Python's documentation <https://docs.python.org/3/index.html>`_.
+   * When specifying a value, use double backticks::
+
+        def foo(greet: bool) -> None:
+            """Demonstration of single and double backticks.
+
+            `greet` controls whether ``Hello World`` is printed.
+
+            Args:
+               greet: Whether to print the ``Hello World`` message.
+            """
+            if greet:
+               print("Hello World")
+
+   * The docstring maximum line length is the same as the code maximum line length.
+
+
 How To Write a Test Suite
 -------------------------
 
@@ -293,6 +352,18 @@ There are four types of methods that comprise a test suite:
    | These methods don't need to be implemented if there's no need for them in a test suite.
      In that case, nothing will happen when they're executed.
 
+#. **Configuration, traffic and other logic**
+
+   The ``TestSuite`` class contains a variety of methods for anything that
+   test suite setup, test suite teardown, or a test case may need to do.
+
+   The test suites also frequently use a DPDK app, such as testpmd, in interactive mode
+   and use the interactive shell instances directly.
+
+   These are the two main ways to call the framework logic in test suites. If there's any
+   functionality or logic missing from the framework, it should be implemented so that
+   the test suites can use one of these two ways.
+
 #. **Test case verification**
 
    Test case verification should be done with the ``verify`` method, which records the result.
@@ -308,6 +379,8 @@ There are four types of methods that comprise a test suite:
    and used by the test suite via the ``sut_node`` field.
 
 
+.. _dts_dev_tools:
+
 DTS Developer Tools
 -------------------
 
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread

* [PATCH v5 04/23] dts: exceptions docstring update
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
                           ` (2 preceding siblings ...)
  2023-11-06 17:15         ` [PATCH v5 03/23] dts: add basic developer docs Juraj Linkeš
@ 2023-11-06 17:15         ` Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 05/23] dts: settings " Juraj Linkeš
                           ` (19 subsequent siblings)
  23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

Format according to the Google format and PEP257, with slight
deviations.
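The severity mechanism that these docstrings describe, where each exception carries a class-level severity and the highest severity raised becomes the DTS exit code, can be sketched as follows (a simplified standalone example, not the actual DTS code):

```python
from enum import IntEnum, unique
from typing import ClassVar


@unique
class ErrorSeverity(IntEnum):
    NO_ERR = 0
    GENERIC_ERR = 1
    SSH_ERR = 4


class DTSError(Exception):
    #: Subclasses override this; the most severe error raised is the exit code.
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR


class SSHTimeoutError(DTSError):
    #:
    severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR


def exit_code(raised: list[DTSError]) -> int:
    # The exit code is the highest severity seen, or 0 if nothing was raised.
    return max((e.severity for e in raised), default=ErrorSeverity.NO_ERR)
```

Because ``ErrorSeverity`` is an ``IntEnum``, severities compare and convert as plain integers, which is what makes the ``max()`` reduction to an exit code work.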

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/__init__.py  |  12 ++++-
 dts/framework/exception.py | 106 +++++++++++++++++++++++++------------
 2 files changed, 83 insertions(+), 35 deletions(-)

diff --git a/dts/framework/__init__.py b/dts/framework/__init__.py
index d551ad4bf0..662e6ccad2 100644
--- a/dts/framework/__init__.py
+++ b/dts/framework/__init__.py
@@ -1,3 +1,13 @@
 # SPDX-License-Identifier: BSD-3-Clause
-# Copyright(c) 2022 PANTHEON.tech s.r.o.
+# Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022 University of New Hampshire
+
+"""Libraries and utilities for running DPDK Test Suite (DTS).
+
+The various modules in the DTS framework offer:
+
+* Connections to nodes, both interactive and non-interactive,
+* A straightforward way to add support for different operating systems of remote nodes,
+* Test suite setup, execution and teardown, along with test case setup, execution and teardown,
+* Pre-test suite setup and post-test suite teardown.
+"""
diff --git a/dts/framework/exception.py b/dts/framework/exception.py
index 7489c03570..ee1562c672 100644
--- a/dts/framework/exception.py
+++ b/dts/framework/exception.py
@@ -3,8 +3,10 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
-"""
-User-defined exceptions used across the framework.
+"""DTS exceptions.
+
+The exceptions all have different severities expressed as an integer.
+The highest severity of all raised exception is used as the exit code of DTS.
 """
 
 from enum import IntEnum, unique
@@ -13,59 +15,79 @@
 
 @unique
 class ErrorSeverity(IntEnum):
-    """
-    The severity of errors that occur during DTS execution.
+    """The severity of errors that occur during DTS execution.
+
     All exceptions are caught and the most severe error is used as return code.
     """
 
+    #:
     NO_ERR = 0
+    #:
     GENERIC_ERR = 1
+    #:
     CONFIG_ERR = 2
+    #:
     REMOTE_CMD_EXEC_ERR = 3
+    #:
     SSH_ERR = 4
+    #:
     DPDK_BUILD_ERR = 10
+    #:
     TESTCASE_VERIFY_ERR = 20
+    #:
     BLOCKING_TESTSUITE_ERR = 25
 
 
 class DTSError(Exception):
-    """
-    The base exception from which all DTS exceptions are derived.
-    Stores error severity.
+    """The base exception from which all DTS exceptions are subclassed.
+
+    Do not use this exception, only use subclassed exceptions.
     """
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR
 
 
 class SSHTimeoutError(DTSError):
-    """
-    Command execution timeout.
-    """
+    """The SSH execution of a command timed out."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
     _command: str
 
     def __init__(self, command: str):
+        """Define the meaning of the first argument.
+
+        Args:
+            command: The executed command.
+        """
         self._command = command
 
     def __str__(self) -> str:
-        return f"TIMEOUT on {self._command}"
+        """Add some context to the string representation."""
+        return f"{self._command} execution timed out."
 
 
 class SSHConnectionError(DTSError):
-    """
-    SSH connection error.
-    """
+    """An unsuccessful SSH connection."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
     _host: str
     _errors: list[str]
 
     def __init__(self, host: str, errors: list[str] | None = None):
+        """Define the meaning of the first two arguments.
+
+        Args:
+            host: The hostname to which we're trying to connect.
+            errors: Any errors that occurred during the connection attempt.
+        """
         self._host = host
         self._errors = [] if errors is None else errors
 
     def __str__(self) -> str:
+        """Include the errors in the string representation."""
         message = f"Error trying to connect with {self._host}."
         if self._errors:
             message += f" Errors encountered while retrying: {', '.join(self._errors)}"
@@ -74,43 +96,53 @@ def __str__(self) -> str:
 
 
 class SSHSessionDeadError(DTSError):
-    """
-    SSH session is not alive.
-    It can no longer be used.
-    """
+    """The SSH session is no longer alive."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.SSH_ERR
     _host: str
 
     def __init__(self, host: str):
+        """Define the meaning of the first argument.
+
+        Args:
+            host: The hostname of the disconnected node.
+        """
         self._host = host
 
     def __str__(self) -> str:
-        return f"SSH session with {self._host} has died"
+        """Add some context to the string representation."""
+        return f"SSH session with {self._host} has died."
 
 
 class ConfigurationError(DTSError):
-    """
-    Raised when an invalid configuration is encountered.
-    """
+    """An invalid configuration."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.CONFIG_ERR
 
 
 class RemoteCommandExecutionError(DTSError):
-    """
-    Raised when a command executed on a Node returns a non-zero exit status.
-    """
+    """An unsuccessful execution of a remote command."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
+    #: The executed command.
     command: str
     _command_return_code: int
 
     def __init__(self, command: str, command_return_code: int):
+        """Define the meaning of the first two arguments.
+
+        Args:
+            command: The executed command.
+            command_return_code: The return code of the executed command.
+        """
         self.command = command
         self._command_return_code = command_return_code
 
     def __str__(self) -> str:
+        """Include both the command and return code in the string representation."""
         return (
             f"Command {self.command} returned a non-zero exit code: "
             f"{self._command_return_code}"
@@ -118,35 +150,41 @@ def __str__(self) -> str:
 
 
 class RemoteDirectoryExistsError(DTSError):
-    """
-    Raised when a remote directory to be created already exists.
-    """
+    """A directory that exists on a remote node."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.REMOTE_CMD_EXEC_ERR
 
 
 class DPDKBuildError(DTSError):
-    """
-    Raised when DPDK build fails for any reason.
-    """
+    """A DPDK build failure."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.DPDK_BUILD_ERR
 
 
 class TestCaseVerifyError(DTSError):
-    """
-    Used in test cases to verify the expected behavior.
-    """
+    """A test case failure."""
 
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR
 
 
 class BlockingTestSuiteError(DTSError):
+    """A failure in a blocking test suite."""
+
+    #:
     severity: ClassVar[ErrorSeverity] = ErrorSeverity.BLOCKING_TESTSUITE_ERR
     _suite_name: str
 
     def __init__(self, suite_name: str) -> None:
+        """Define the meaning of the first argument.
+
+        Args:
+            suite_name: The blocking test suite.
+        """
         self._suite_name = suite_name
 
     def __str__(self) -> str:
+        """Add some context to the string representation."""
         return f"Blocking suite {self._suite_name} failed."
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread
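The exception classes in the patch above all follow one pattern: a short one-line docstring plus a class-level ``severity`` annotated with ``ClassVar``, so the severity can be read without instantiating the exception. A minimal runnable sketch of that pattern (severity values simplified; the real framework defines more members):

```python
from enum import IntEnum
from typing import ClassVar


class ErrorSeverity(IntEnum):
    """Simplified severity levels; the actual framework defines more."""

    GENERIC_ERR = 1
    CONFIG_ERR = 2
    TESTCASE_VERIFY_ERR = 3


class DTSError(Exception):
    """Base class: subclasses override the class-level severity."""

    severity: ClassVar[ErrorSeverity] = ErrorSeverity.GENERIC_ERR


class TestCaseVerifyError(DTSError):
    """A test case failure."""

    severity: ClassVar[ErrorSeverity] = ErrorSeverity.TESTCASE_VERIFY_ERR


try:
    raise TestCaseVerifyError("packet count mismatch")
except DTSError as e:
    # The severity is read from the class, not the instance.
    print(type(e).__name__, int(type(e).severity))  # prints: TestCaseVerifyError 3
```

Because ``severity`` is a ``ClassVar``, tools such as Sphinx document it as class data rather than an instance attribute, which is what the ``#:`` comments in the patch annotate.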

* [PATCH v5 05/23] dts: settings docstring update
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
                           ` (3 preceding siblings ...)
  2023-11-06 17:15         ` [PATCH v5 04/23] dts: exceptions docstring update Juraj Linkeš
@ 2023-11-06 17:15         ` Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 06/23] dts: logger and " Juraj Linkeš
                           ` (18 subsequent siblings)
  23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

Format according to the Google format and PEP257, with slight
deviations.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/settings.py | 101 +++++++++++++++++++++++++++++++++++++-
 1 file changed, 100 insertions(+), 1 deletion(-)

diff --git a/dts/framework/settings.py b/dts/framework/settings.py
index 7f5841d073..787db7c198 100644
--- a/dts/framework/settings.py
+++ b/dts/framework/settings.py
@@ -3,6 +3,70 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022 University of New Hampshire
 
+"""Environment variables and command line arguments parsing.
+
+This is a simple module utilizing the built-in argparse module to parse command line arguments,
+augment them with values from environment variables and make them available across the framework.
+
+The command line value takes precedence, followed by the environment variable value,
+followed by the default value defined in this module.
+
+The command line arguments along with the supported environment variables are:
+
+.. option:: --config-file
+.. envvar:: DTS_CFG_FILE
+
+    The path to the YAML test run configuration file.
+
+.. option:: --output-dir, --output
+.. envvar:: DTS_OUTPUT_DIR
+
+    The directory where DTS logs and results are saved.
+
+.. option:: --compile-timeout
+.. envvar:: DTS_COMPILE_TIMEOUT
+
+    The timeout for compiling DPDK.
+
+.. option:: -t, --timeout
+.. envvar:: DTS_TIMEOUT
+
+    The timeout for all DTS operations except for compiling DPDK.
+
+.. option:: -v, --verbose
+.. envvar:: DTS_VERBOSE
+
+    Set to any value to enable logging everything to the console.
+
+.. option:: -s, --skip-setup
+.. envvar:: DTS_SKIP_SETUP
+
+    Set to any value to skip building DPDK.
+
+.. option:: --tarball, --snapshot, --git-ref
+.. envvar:: DTS_DPDK_TARBALL
+
+    The path to a DPDK tarball, git commit ID, tag ID or tree ID to test.
+
+.. option:: --test-cases
+.. envvar:: DTS_TESTCASES
+
+    A comma-separated list of test cases to execute. Unknown test cases will be silently ignored.
+
+.. option:: --re-run, --re_run
+.. envvar:: DTS_RERUN
+
+    Re-run each test case this many times in case of a failure.
+
+Attributes:
+    SETTINGS: The module level variable storing framework-wide DTS settings.
+
+Typical usage example::
+
+  from framework.settings import SETTINGS
+  foo = SETTINGS.foo
+"""
+
 import argparse
 import os
 from collections.abc import Callable, Iterable, Sequence
@@ -16,6 +80,23 @@
 
 
 def _env_arg(env_var: str) -> Any:
+    """A helper method augmenting the argparse Action with environment variables.
+
+    If the supplied environment variable is defined, then the default value
+    of the argument is modified. This satisfies the priority order of
+    command line argument > environment variable > default value.
+
+    Arguments with no values (flags) should be defined using the const keyword argument
+    (True or False). When the argument is specified, it will be set to const, if not specified,
+    the default will be stored (possibly modified by the corresponding environment variable).
+
+    Other arguments work the same as default argparse arguments, that is using
+    the default 'store' action.
+
+    Returns:
+          The modified argparse.Action.
+    """
+
     class _EnvironmentArgument(argparse.Action):
         def __init__(
             self,
@@ -68,14 +149,28 @@ def __call__(
 
 @dataclass(slots=True)
 class Settings:
+    """Default framework-wide user settings.
+
+    The defaults may be modified at the start of the run.
+    """
+
+    #:
     config_file_path: Path = Path(__file__).parent.parent.joinpath("conf.yaml")
+    #:
     output_dir: str = "output"
+    #:
     timeout: float = 15
+    #:
     verbose: bool = False
+    #:
     skip_setup: bool = False
+    #:
     dpdk_tarball_path: Path | str = "dpdk.tar.xz"
+    #:
     compile_timeout: float = 1200
+    #:
     test_cases: list[str] = field(default_factory=list)
+    #:
     re_run: int = 0
 
 
@@ -169,7 +264,7 @@ def _get_parser() -> argparse.ArgumentParser:
         action=_env_arg("DTS_RERUN"),
         default=SETTINGS.re_run,
         type=int,
-        help="[DTS_RERUN] Re-run each test case the specified amount of times "
+        help="[DTS_RERUN] Re-run each test case the specified number of times "
         "if a test failure occurs",
     )
 
@@ -177,6 +272,10 @@ def _get_parser() -> argparse.ArgumentParser:
 
 
 def get_settings() -> Settings:
+    """Create new settings with inputs from the user.
+
+    The inputs are taken from the command line and from environment variables.
+    """
     parsed_args = _get_parser().parse_args()
     return Settings(
         config_file_path=parsed_args.config_file,
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread
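The `_env_arg` helper documented in the settings patch above wires environment variables into argparse defaults. A minimal runnable sketch of that precedence (command line > environment variable > default), using `--output-dir`/`DTS_OUTPUT_DIR` as the example; the real implementation also handles flag (const) arguments:

```python
import argparse
import os
from typing import Any


def _env_arg(env_var: str) -> Any:
    """Return an argparse Action class whose default is seeded from `env_var`."""

    class _EnvironmentArgument(argparse.Action):
        def __init__(self, *args: Any, **kwargs: Any) -> None:
            # If the environment variable is set, it replaces the default,
            # giving the order: command line > environment variable > default.
            if env_var in os.environ:
                kwargs["default"] = os.environ[env_var]
            super().__init__(*args, **kwargs)

        def __call__(self, parser, namespace, values, option_string=None) -> None:
            setattr(namespace, self.dest, values)

    return _EnvironmentArgument


# The variable must be set before add_argument(), since the Action's
# __init__ (where the default is replaced) runs at registration time.
os.environ["DTS_OUTPUT_DIR"] = "/tmp/dts"
parser = argparse.ArgumentParser()
parser.add_argument("--output-dir", action=_env_arg("DTS_OUTPUT_DIR"), default="output")

assert parser.parse_args([]).output_dir == "/tmp/dts"  # env var beats the default
assert parser.parse_args(["--output-dir", "/custom"]).output_dir == "/custom"  # CLI wins
```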

* [PATCH v5 06/23] dts: logger and settings docstring update
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
                           ` (4 preceding siblings ...)
  2023-11-06 17:15         ` [PATCH v5 05/23] dts: settings " Juraj Linkeš
@ 2023-11-06 17:15         ` Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 07/23] dts: dts runner and main " Juraj Linkeš
                           ` (17 subsequent siblings)
  23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

Format according to the Google format and PEP257, with slight
deviations.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/logger.py | 72 +++++++++++++++++++++----------
 dts/framework/utils.py  | 96 ++++++++++++++++++++++++++++++-----------
 2 files changed, 121 insertions(+), 47 deletions(-)

diff --git a/dts/framework/logger.py b/dts/framework/logger.py
index bb2991e994..d3eb75a4e4 100644
--- a/dts/framework/logger.py
+++ b/dts/framework/logger.py
@@ -3,9 +3,9 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
-"""
-DTS logger module with several log level. DTS framework and TestSuite logs
-are saved in different log files.
+"""DTS logger module.
+
+DTS framework and TestSuite logs are saved in different log files.
 """
 
 import logging
@@ -18,19 +18,21 @@
 stream_fmt = "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
 
 
-class LoggerDictType(TypedDict):
-    logger: "DTSLOG"
-    name: str
-    node: str
-
+class DTSLOG(logging.LoggerAdapter):
+    """DTS logger adapter class for framework and testsuites.
 
-# List for saving all using loggers
-Loggers: list[LoggerDictType] = []
+    The :option:`--verbose` command line argument and the :envvar:`DTS_VERBOSE` environment
+    variable control the verbosity of output. If enabled, all messages will be emitted to the
+    console.
 
+    The :option:`--output` command line argument and the :envvar:`DTS_OUTPUT_DIR` environment
+    variable modify the directory where the logs will be stored.
 
-class DTSLOG(logging.LoggerAdapter):
-    """
-    DTS log class for framework and testsuite.
+    Attributes:
+        node: The additional identifier. Currently unused.
+        sh: The handler which emits logs to console.
+        fh: The handler which emits logs to a file.
+        verbose_fh: The same as fh, but using a more verbose format.
     """
 
     _logger: logging.Logger
@@ -40,6 +42,15 @@ class DTSLOG(logging.LoggerAdapter):
     verbose_fh: logging.FileHandler
 
     def __init__(self, logger: logging.Logger, node: str = "suite"):
+        """Extend the constructor with additional handlers.
+
+        One handler logs to the console, the other one to a file, with either a regular or verbose
+        format.
+
+        Args:
+            logger: The logger from which to create the logger adapter.
+            node: An additional identifier. Currently unused.
+        """
         self._logger = logger
         # 1 means log everything, this will be used by file handlers if their level
         # is not set
@@ -92,26 +103,43 @@ def __init__(self, logger: logging.Logger, node: str = "suite"):
         super(DTSLOG, self).__init__(self._logger, dict(node=self.node))
 
     def logger_exit(self) -> None:
-        """
-        Remove stream handler and logfile handler.
-        """
+        """Remove the stream handler and the logfile handler."""
         for handler in (self.sh, self.fh, self.verbose_fh):
             handler.flush()
             self._logger.removeHandler(handler)
 
 
+class _LoggerDictType(TypedDict):
+    logger: DTSLOG
+    name: str
+    node: str
+
+
+# List for saving all loggers in use
+_Loggers: list[_LoggerDictType] = []
+
+
 def getLogger(name: str, node: str = "suite") -> DTSLOG:
+    """Get DTS logger adapter identified by name and node.
+
+    An existing logger will be returned if one with the exact name and node already exists.
+    A new one will be created and stored otherwise.
+
+    Args:
+        name: The name of the logger.
+        node: An additional identifier for the logger.
+
+    Returns:
+        A logger uniquely identified by both name and node.
     """
-    Get logger handler and if there's no handler for specified Node will create one.
-    """
-    global Loggers
+    global _Loggers
     # return saved logger
-    logger: LoggerDictType
-    for logger in Loggers:
+    logger: _LoggerDictType
+    for logger in _Loggers:
         if logger["name"] == name and logger["node"] == node:
             return logger["logger"]
 
     # return new logger
     dts_logger: DTSLOG = DTSLOG(logging.getLogger(name), node)
-    Loggers.append({"logger": dts_logger, "name": name, "node": node})
+    _Loggers.append({"logger": dts_logger, "name": name, "node": node})
     return dts_logger
diff --git a/dts/framework/utils.py b/dts/framework/utils.py
index f0c916471c..0613adf7ad 100644
--- a/dts/framework/utils.py
+++ b/dts/framework/utils.py
@@ -3,6 +3,16 @@
 # Copyright(c) 2022-2023 PANTHEON.tech s.r.o.
 # Copyright(c) 2022-2023 University of New Hampshire
 
+"""Various utility classes and functions.
+
+These are used in multiple modules across the framework. They're here because
+they provide some non-specific functionality, greatly simplify imports or just don't
+fit elsewhere.
+
+Attributes:
+    REGEX_FOR_PCI_ADDRESS: The regex representing a PCI address, e.g. ``0000:00:08.0``.
+"""
+
 import atexit
 import json
 import os
@@ -19,12 +29,20 @@
 
 
 def expand_range(range_str: str) -> list[int]:
-    """
-    Process range string into a list of integers. There are two possible formats:
-    n - a single integer
-    n-m - a range of integers
+    """Process `range_str` into a list of integers.
+
+    There are two possible formats of `range_str`:
+
+        * ``n`` - a single integer,
+        * ``n-m`` - a range of integers.
 
-    The returned range includes both n and m. Empty string returns an empty list.
+    The returned range includes both ``n`` and ``m``. Empty string returns an empty list.
+
+    Args:
+        range_str: The range to expand.
+
+    Returns:
+        All the numbers from the range.
     """
     expanded_range: list[int] = []
     if range_str:
@@ -39,6 +57,14 @@ def expand_range(range_str: str) -> list[int]:
 
 
 def get_packet_summaries(packets: list[Packet]) -> str:
+    """Format a string summary from `packets`.
+
+    Args:
+        packets: The packets to format.
+
+    Returns:
+        The summary of `packets`.
+    """
     if len(packets) == 1:
         packet_summaries = packets[0].summary()
     else:
@@ -49,6 +75,8 @@ def get_packet_summaries(packets: list[Packet]) -> str:
 
 
 class StrEnum(Enum):
+    """Enum with members stored as strings."""
+
     @staticmethod
     def _generate_next_value_(
         name: str, start: int, count: int, last_values: object
@@ -56,22 +84,29 @@ def _generate_next_value_(
         return name
 
     def __str__(self) -> str:
+        """The string representation is the name of the member."""
         return self.name
 
 
 class MesonArgs(object):
-    """
-    Aggregate the arguments needed to build DPDK:
-    default_library: Default library type, Meson allows "shared", "static" and "both".
-               Defaults to None, in which case the argument won't be used.
-    Keyword arguments: The arguments found in meson_options.txt in root DPDK directory.
-               Do not use -D with them, for example:
-               meson_args = MesonArgs(enable_kmods=True).
-    """
+    """Aggregate the arguments needed to build DPDK."""
 
     _default_library: str
 
     def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
+        """Initialize the meson arguments.
+
+        Args:
+            default_library: The default library type, Meson supports ``shared``, ``static`` and
+                ``both``. Defaults to :data:`None`, in which case the argument won't be used.
+            dpdk_args: The arguments found in ``meson_options.txt`` in root DPDK directory.
+                Do not use ``-D`` with them.
+
+        Example:
+            ::
+
+                meson_args = MesonArgs(enable_kmods=True)
+        """
         self._default_library = (
             f"--default-library={default_library}" if default_library else ""
         )
@@ -83,6 +118,7 @@ def __init__(self, default_library: str | None = None, **dpdk_args: str | bool):
         )
 
     def __str__(self) -> str:
+        """The actual args."""
         return " ".join(f"{self._default_library} {self._dpdk_args}".split())
 
 
@@ -93,35 +129,33 @@ class _TarCompressionFormat(StrEnum):
     and Enum values are the associated file extensions.
     """
 
+    #:
     gzip = "gz"
+    #:
     compress = "Z"
+    #:
     bzip2 = "bz2"
+    #:
     lzip = "lz"
+    #:
     lzma = "lzma"
+    #:
     lzop = "lzo"
+    #:
     xz = "xz"
+    #:
     zstd = "zst"
 
 
 class DPDKGitTarball(object):
-    """Create a compressed tarball of DPDK from the repository.
-
-    The DPDK version is specified with git object git_ref.
-    The tarball will be compressed with _TarCompressionFormat,
-    which must be supported by the DTS execution environment.
-    The resulting tarball will be put into output_dir.
+    """Compressed tarball of DPDK from the repository.
 
-    The class supports the os.PathLike protocol,
+    The class supports the :class:`os.PathLike` protocol,
     which is used to get the Path of the tarball::
 
         from pathlib import Path
         tarball = DPDKGitTarball("HEAD", "output")
         tarball_path = Path(tarball)
-
-    Arguments:
-        git_ref: A git commit ID, tag ID or tree ID.
-        output_dir: The directory where to put the resulting tarball.
-        tar_compression_format: The compression format to use.
     """
 
     _git_ref: str
@@ -136,6 +170,17 @@ def __init__(
         output_dir: str,
         tar_compression_format: _TarCompressionFormat = _TarCompressionFormat.xz,
     ):
+        """Create the tarball during initialization.
+
+        The DPDK version is specified with `git_ref`. The tarball will be compressed with
+        `tar_compression_format`, which must be supported by the DTS execution environment.
+        The resulting tarball will be put into `output_dir`.
+
+        Args:
+            git_ref: A git commit ID, tag ID or tree ID.
+            output_dir: The directory where to put the resulting tarball.
+            tar_compression_format: The compression format to use.
+        """
         self._git_ref = git_ref
         self._tar_compression_format = tar_compression_format
 
@@ -204,4 +249,5 @@ def _delete_tarball(self) -> None:
             os.remove(self._tarball_path)
 
     def __fspath__(self) -> str:
+        """The os.PathLike protocol implementation."""
         return str(self._tarball_path)
-- 
2.34.1


^ permalink raw reply	[flat|nested] 393+ messages in thread
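The updated `expand_range` docstring in the patch above fully specifies the helper's contract. A sketch reconstructed from that description (not the exact framework code) shows why both formats fall out of a single `split`:

```python
def expand_range(range_str: str) -> list[int]:
    """Expand ``n`` or ``n-m`` (inclusive) into a list of integers.

    An empty string returns an empty list.
    """
    expanded: list[int] = []
    if range_str:
        boundaries = range_str.split("-")
        # "n" splits to ["n"], so boundaries[0] == boundaries[-1] and the
        # range covers just n; "n-m" covers n..m. range() excludes its end,
        # hence the + 1 to keep the documented inclusive behavior.
        expanded.extend(range(int(boundaries[0]), int(boundaries[-1]) + 1))
    return expanded


print(expand_range("3"))    # [3]
print(expand_range("0-3"))  # [0, 1, 2, 3]
print(expand_range(""))     # []
```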

* [PATCH v5 07/23] dts: dts runner and main docstring update
  2023-11-06 17:15       ` [PATCH v5 00/23] dts: add dts api docs Juraj Linkeš
                           ` (5 preceding siblings ...)
  2023-11-06 17:15         ` [PATCH v5 06/23] dts: logger and " Juraj Linkeš
@ 2023-11-06 17:15         ` Juraj Linkeš
  2023-11-06 17:15         ` [PATCH v5 08/23] dts: test suite " Juraj Linkeš
                           ` (16 subsequent siblings)
  23 siblings, 0 replies; 393+ messages in thread
From: Juraj Linkeš @ 2023-11-06 17:15 UTC (permalink / raw)
  To: thomas, Honnappa.Nagarahalli, bruce.richardson, jspewock, probb,
	paul.szczepanek, yoan.picchi
  Cc: dev, Juraj Linkeš

Format according to the Google format and PEP257, with slight
deviations.

Signed-off-by: Juraj Linkeš <juraj.linkes@pantheon.tech>
---
 dts/framework/dts.py | 128 ++++++++++++++++++++++++++++++++++++-------
 dts/main.py          |   8 ++-
 2 files changed, 112 insertions(+), 24 deletions(-)

diff --git a/dts/framework/dts.py b/dts/framework/dts.py
index 4c7fb0c40a..331fed7dc4 100644
--- a/dts/framework/dts.py
+++ b/dts/framework/dts.py
@@ -3,6 +3,3