DPDK patches and discussions
From: Anatoly Burakov <anatoly.burakov@intel.com>
To: dev@dpdk.org, Bruce Richardson <bruce.richardson@intel.com>
Cc: john.mcnamara@intel.com
Subject: [RFC PATCH v3 1/1] buildtools: add vscode configuration generator
Date: Wed, 31 Jul 2024 14:33:49 +0100	[thread overview]
Message-ID: <de9fbb94025e11cae08cf3f343f673d9e635ceb3.1722432797.git.anatoly.burakov@intel.com> (raw)
In-Reply-To: <cover.1722432796.git.anatoly.burakov@intel.com>

A lot of developers use Visual Studio Code as their primary IDE. This
script is called from within the meson build process and generates
configuration files for VSCode that set up basic build tasks, launch
tasks, and C/C++ code analysis settings that take into account the
compile_commands.json file automatically generated by meson.

Files generated by the script:
 - .vscode/settings.json: stores variables needed by other files
 - .vscode/tasks.json: defines build tasks
 - .vscode/launch.json: defines launch tasks
 - .vscode/c_cpp_properties.json: defines code analysis settings

Multiple build directories, including out-of-source-tree ones, are
supported, and the script generates separate configuration items for each
build directory created by the user, tagging them for convenience.
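
As an illustration, for a build directory named "build", the generated
configuration might contain entries roughly like the following (a sketch
only; actual labels and paths depend on the name and location of the
build directory):

    .vscode/settings.json (excerpt):
    {
        "build.builddir": "/path/to/dpdk/build"
    }

    .vscode/tasks.json (excerpt):
    {
        "version": "2.0.0",
        "tasks": [
            {
                "label": "[build] Compile",
                "type": "shell",
                "command": "meson compile",
                "options": { "cwd": "${config:build.builddir}" },
                "group": "build"
            }
        ]
    }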

Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
---

Notes:
    RFCv2 -> RFCv3:
    - Following feedback from Bruce, reworked to be a minimal script run from meson
    - Moved to buildtools
    - Support for multiple build directories is now the default
    - All targets are automatically added to all configuration files
    
    RFCv1 -> RFCv2:
    
    - No longer disable apps and drivers if nothing was specified via the command
      line or TUI, and warn the user about things being built by default
    - Generate app launch configuration by default for when no apps are selected
    - Added parameters:
      - --force to avoid overwriting existing config
      - --common-conf to specify global meson flags applicable to all configs
      - --gdbsudo/--no-gdbsudo to specify gdbsudo behavior
    - Autodetect gdbsudo/gdb from UID
    - Updated comments, error messages, fixed issues with user interaction
    - Improved handling of wildcards and driver dependencies
    - Fixed a few bugs in dependency detection due to incorrect parsing
    - [Stephen] flake8 is happy

 app/meson.build               |  12 +-
 buildtools/gen-vscode-conf.py | 442 ++++++++++++++++++++++++++++++++++
 buildtools/meson.build        |   5 +
 examples/meson.build          |  13 +-
 meson.build                   |  11 +
 5 files changed, 481 insertions(+), 2 deletions(-)
 create mode 100755 buildtools/gen-vscode-conf.py

diff --git a/app/meson.build b/app/meson.build
index 5b2c80c7a1..cf0eda3d5f 100644
--- a/app/meson.build
+++ b/app/meson.build
@@ -114,7 +114,17 @@ foreach app:apps
         link_libs = dpdk_static_libraries + dpdk_drivers
     endif
 
-    exec = executable('dpdk-' + name,
+    # add to Visual Studio Code launch configuration
+    exe_name = 'dpdk-' + name
+    launch_path = join_paths(meson.current_build_dir(), exe_name)
+    # we don't want to block the build if this command fails
+    result = run_command(vscode_conf_gen_cmd + ['--launch', launch_path], check: false)
+    if result.returncode() != 0
+        warning('Failed to generate Visual Studio Code launch configuration for "' + name + '"')
+        message(result.stderr())
+    endif
+
+    exec = executable(exe_name,
             sources,
             c_args: cflags,
             link_args: ldflags,
diff --git a/buildtools/gen-vscode-conf.py b/buildtools/gen-vscode-conf.py
new file mode 100755
index 0000000000..fcc6469065
--- /dev/null
+++ b/buildtools/gen-vscode-conf.py
@@ -0,0 +1,442 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 Intel Corporation
+#
+
+"""Visual Studio Code configuration generator script."""
+
+# This script is meant to be run by the meson build system to generate build
+# and launch commands for a specific build directory for Visual Studio Code.
+#
+# Even though this script will generate settings/tasks/launch/code analysis
+# configuration for VSCode, we can't actually just regenerate the files,
+# because we want to support multiple build directories, as well as not
+# destroy any configuration the user created between runs of this script.
+# Therefore, we need some config file handling infrastructure. Luckily, VSCode
+# configs are all JSON, so we can just use json module to handle them. Of
+# course, we will lose any user comments in the files, but that's a small price
+# to pay for this sort of automation.
+#
+# Since this script will be run by meson, we can forego any parsing or anything
+# to do with the build system, and just rely on the fact that we get all of our
+# configuration from command-line.
+
+import argparse
+import ast
+import json
+import os
+import shutil
+from collections import OrderedDict
+from sys import stderr, exit as _exit
+from typing import List, Dict, Any
+
+
+class ConfigCtx:
+    """POD class to keep data associated with config."""
+    def __init__(self, build_dir: str, source_dir: str, launch: List[str]):
+        self.build_dir = build_dir
+        self.source_dir = source_dir
+        self.config_dir = os.path.join(source_dir, '.vscode')
+        # we don't have any mechanism to label things, so we're just going to
+        # use the build dir basename as the label, and hope the user doesn't
+        # create different build directories with the same name
+        self.label = os.path.basename(build_dir)
+        self.builddir_var = f'{self.label}.builddir'
+        self.launch = launch
+
+        settings_fname = 'settings.json'
+        tasks_fname = 'tasks.json'
+        launch_fname = 'launch.json'
+        analysis_fname = 'c_cpp_properties.json'
+        settings_tmp_fname = f'.{settings_fname}.{self.label}.tmp'
+        tasks_tmp_fname = f'.{tasks_fname}.{self.label}.tmp'
+        launch_tmp_fname = f'.{launch_fname}.{self.label}.tmp'
+        analysis_tmp_fname = f'.{analysis_fname}.{self.label}.tmp'
+
+        self.settings_path = os.path.join(self.config_dir, settings_fname)
+        self.tasks_path = os.path.join(self.config_dir, tasks_fname)
+        self.launch_path = os.path.join(self.config_dir, launch_fname)
+        self.analysis_path = os.path.join(self.config_dir, analysis_fname)
+
+        # we want to write into temporary files at first
+        self.settings_tmp = os.path.join(self.config_dir, settings_tmp_fname)
+        self.tasks_tmp = os.path.join(self.config_dir, tasks_tmp_fname)
+        self.launch_tmp = os.path.join(self.config_dir, launch_tmp_fname)
+        self.analysis_tmp = os.path.join(self.config_dir, analysis_tmp_fname)
+
+        # we don't want to mess with files if we didn't change anything
+        self.settings_changed = False
+        self.tasks_changed = False
+        self.launch_changed = False
+        self.analysis_changed = False
+
+
+class Boolifier(ast.NodeTransformer):
+    """Replace JSON "true" with Python "True"."""
+    def visit_Name(self, node: ast.Name) -> ast.Constant:
+        """Visitor for Name nodes."""
+        if node.id == 'true':
+            return ast.Constant(value=True)
+        elif node.id == 'false':
+            return ast.Constant(value=False)
+        return node
+
+
+def _parse_eval(data: str) -> Dict[str, Any]:
+    """Use AST and literal_eval to parse JSON."""
+    # JSON syntax is, for the most part, a valid Python dictionary literal,
+    # aside from boolean capitalization ("true"/"false" vs "True"/"False").
+    # so, we parse the JSON into an AST, replace the lowercase boolean names
+    # with proper boolean constants, and unparse it back into source text
+    parsed = ast.parse(data)
+    unparsed = ast.unparse(Boolifier().visit(parsed))
+    # the unparsed source now contains Python boolean literals, so it can
+    # be safely evaluated as a Python literal
+    ast_data = ast.literal_eval(unparsed)
+    return ast_data
+
+
+def _load_json(file: str) -> Dict[str, Any]:
+    """Load JSON file."""
+    with open(file, 'r', encoding='utf-8') as f:
+        data = f.read()
+        try:
+            return json.loads(data)
+        except json.JSONDecodeError:
+            # Python's JSON parser doesn't like trailing commas but VSCode's
+            # JSON parser does not consider them to be syntax errors, so they
+            # may be present in user's configuration files. we can try to parse
+            # JSON as Python dictionary literal, and see if it works. if it
+            # doesn't, there's probably a syntax error anyway, so re-raise.
+            try:
+                return _parse_eval(data)
+            except (ValueError, TypeError, SyntaxError,
+                    MemoryError, RecursionError):
+                pass
+            raise
+
+
+def _dump_json(file: str, obj: Dict[str, Any]) -> None:
+    """Write JSON file."""
+    with open(file, 'w') as f:
+        json.dump(obj, f, indent=4)
+
+
+def _overwrite(src: str, dst: str) -> None:
+    """Overwrite dst file with src file."""
+    shutil.copyfile(src, dst)
+    # unlink src
+    os.unlink(src)
+
+
+def _gen_sorter(order: List[str]) -> Any:
+    """Sort dictionary by order."""
+
+    # JSON doesn't have sort order, but we want to be user friendly and display
+    # certain properties above others as they're more important. This function
+    # will return a closure that can be used to re-sort a specific object using
+    # OrderedDict and an ordered list of properties.
+    def _sorter(obj: Dict[str, Any]) -> OrderedDict[str, Any]:
+        d = OrderedDict()
+        # step 1: go through all properties in order and re-add them
+        for prop in order:
+            if prop in obj:
+                d[prop] = obj[prop]
+        # step 2: get all properties of the object, remove those that we have
+        #         already added, and sort them alphabetically
+        for prop in sorted(set(obj.keys()) - set(order)):
+            d[prop] = obj[prop]
+        # we're done: now all objects will have vaguely constant sort order
+        return d
+    return _sorter
+
+
+def _add_to_obj_list(obj_list: List[Dict[str, Any]],
+                     key: str, obj: Dict[str, Any]) -> bool:
+    """Add object to list if it doesn't already exist."""
+    for o in obj_list:
+        if o[key] == obj[key]:
+            return False
+    obj_list.append(obj)
+    return True
+
+
+def _process_settings(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update settings.json."""
+    try:
+        settings_obj = _load_json(ctx.settings_path)
+    except FileNotFoundError:
+        settings_obj = {}
+
+    # add build to settings if it doesn't exist
+    if ctx.builddir_var not in settings_obj:
+        ctx.settings_changed = True
+    settings_obj.setdefault(ctx.builddir_var, ctx.build_dir)
+
+    # add path ignore setting if it's inside the source dir
+    cpath = os.path.commonpath([ctx.source_dir, ctx.build_dir])
+    if cpath == ctx.source_dir:
+        # find path within source tree
+        relpath = os.path.relpath(ctx.build_dir, ctx.source_dir) + os.sep
+
+        # note if we need to change anything
+        if 'files.exclude' not in settings_obj:
+            ctx.settings_changed = True
+        elif relpath not in settings_obj['files.exclude']:
+            ctx.settings_changed = True
+
+        exclude = settings_obj.setdefault('files.exclude', {})
+        exclude.setdefault(relpath, True)
+        settings_obj['files.exclude'] = exclude
+
+    return settings_obj
+
+
+def _process_tasks(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update tasks.json."""
+    try:
+        outer_tasks_obj = _load_json(ctx.tasks_path)
+    except FileNotFoundError:
+        outer_tasks_obj = {
+            "version": "2.0.0",
+            "tasks": [],
+            "inputs": []
+        }
+    inner_tasks_obj = outer_tasks_obj.setdefault('tasks', [])
+    inputs_obj = outer_tasks_obj.setdefault('inputs', [])
+
+    # generate task object sorter
+    _sort_task = _gen_sorter(['label', 'detail', 'type', 'command', 'args',
+                              'options', 'problemMatcher', 'group'])
+
+    # generate our would-be configuration
+
+    # first, we need a build task
+    build_task = {
+        "label": f"[{ctx.label}] Compile",
+        "detail": f"Run `ninja` command for {ctx.label}",
+        "type": "shell",
+        "command": "meson compile",
+        "options": {
+            "cwd": f'${{config:{ctx.builddir_var}}}'
+        },
+        "problemMatcher": {
+            "base": "$gcc",
+            "fileLocation": ["relative", f"${{config:{ctx.builddir_var}}}"]
+        },
+        "group": "build"
+    }
+    # we also need a meson configure task with input
+    configure_task = {
+        "label": f"[{ctx.label}] Configure",
+        "detail": f"Run `meson configure` command for {ctx.label}",
+        "type": "shell",
+        "command": "meson configure ${input:mesonConfigureArg}",
+        "options": {
+            "cwd": f'${{config:{ctx.builddir_var}}}'
+        },
+        "problemMatcher": [],
+        "group": "build"
+    }
+    # finally, add input object
+    input_arg = {
+        "id": "mesonConfigureArg",
+        "type": "promptString",
+        "description": "Enter meson configure arguments",
+        "default": ""
+    }
+
+    # sort our tasks
+    build_task = _sort_task(build_task)
+    configure_task = _sort_task(configure_task)
+
+    # add only if task doesn't already exist
+    ctx.tasks_changed |= _add_to_obj_list(inner_tasks_obj, 'label',
+                                          build_task)
+    ctx.tasks_changed |= _add_to_obj_list(inner_tasks_obj, 'label',
+                                          configure_task)
+    ctx.tasks_changed |= _add_to_obj_list(inputs_obj, 'id', input_arg)
+
+    # replace nodes
+    outer_tasks_obj['tasks'] = inner_tasks_obj
+    outer_tasks_obj['inputs'] = inputs_obj
+
+    # we're ready
+    return outer_tasks_obj
+
+
+def _process_launch(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update launch.json."""
+    try:
+        launch_obj = _load_json(ctx.launch_path)
+    except FileNotFoundError:
+        launch_obj = {
+            "version": "0.2.0",
+            "configurations": []
+        }
+    configurations_obj = launch_obj.setdefault('configurations', [])
+
+    # generate launch task sorter
+    _sort_launch = _gen_sorter(['name', 'type', 'request', 'program', 'cwd',
+                                'preLaunchTask', 'environment', 'args',
+                                'MIMode', 'miDebuggerPath', 'setupCommands'])
+
+    gdb_path = shutil.which('gdb')
+    for target in ctx.launch:
+        # target will be a full path; we need it relative to the build dir
+        exe_path = os.path.relpath(target, ctx.build_dir)
+        name = f"[{ctx.label}] Launch {exe_path}"
+        # generate config from template
+        launch_config = {
+            "name": name,
+            "type": "cppdbg",
+            "request": "launch",
+            "program": f"${{config:{ctx.builddir_var}}}/{exe_path}",
+            "args": [],
+            "cwd": "${workspaceFolder}",
+            "environment": [],
+            "MIMode": "gdb",
+            "miDebuggerPath": gdb_path,
+            "preLaunchTask": f"[{ctx.label}] Compile",
+            "setupCommands": [
+                {
+                    "description": "Enable pretty-printing for gdb",
+                    "text": "-gdb-set print pretty on",
+                    "ignoreFailures": True
+                }
+            ],
+        }
+        # sort keys
+        launch_config = _sort_launch(launch_config)
+        # add to configurations
+        ctx.launch_changed |= _add_to_obj_list(configurations_obj, 'name',
+                                               launch_config)
+
+    # replace the configuration object
+    launch_obj['configurations'] = configurations_obj
+
+    # we're ready
+    return launch_obj
+
+
+def _process_analysis(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update c_cpp_properties.json."""
+    try:
+        analysis_obj = _load_json(ctx.analysis_path)
+    except FileNotFoundError:
+        analysis_obj = {
+            "version": 4,
+            "configurations": []
+        }
+    configurations_obj = analysis_obj.setdefault('configurations', [])
+
+    # generate analysis config sorter
+    _sort_analysis = _gen_sorter(['name', 'includePath', 'compilerPath',
+                                  'cStandard', 'cppStandard',
+                                  'intelliSenseMode', 'compileCommands'])
+
+    # TODO: pick up more configuration from meson (e.g. OS, platform, compiler)
+
+    config_obj = {
+        "name": "Linux",
+        "includePath": [
+                f"${{config:{ctx.builddir_var}}}/",
+                # hardcode everything to x86/Linux for now
+                "${workspaceFolder}/lib/eal/x86",
+                "${workspaceFolder}/lib/eal/linux",
+                "${workspaceFolder}/**"
+        ],
+        "compilerPath": "/usr/bin/gcc",
+        "cStandard": "c99",
+        "cppStandard": "c++17",
+        "intelliSenseMode": "${default}",
+        "compileCommands":
+        f"${{config:{ctx.builddir_var}}}/compile_commands.json"
+    }
+    # sort configuration
+    config_obj = _sort_analysis(config_obj)
+
+    # add it to config obj
+    ctx.analysis_changed |= _add_to_obj_list(configurations_obj, 'name',
+                                             config_obj)
+
+    # we're done
+    analysis_obj['configurations'] = configurations_obj
+
+    return analysis_obj
+
+
+def _gen_config(ctx: ConfigCtx) -> None:
+    """Generate all config files."""
+    # ensure config dir exists
+    os.makedirs(ctx.config_dir, exist_ok=True)
+
+    # generate all JSON objects and write them to temp files
+    settings_obj = _process_settings(ctx)
+    _dump_json(ctx.settings_tmp, settings_obj)
+
+    tasks_obj = _process_tasks(ctx)
+    _dump_json(ctx.tasks_tmp, tasks_obj)
+
+    launch_obj = _process_launch(ctx)
+    _dump_json(ctx.launch_tmp, launch_obj)
+
+    analysis_obj = _process_analysis(ctx)
+    _dump_json(ctx.analysis_tmp, analysis_obj)
+
+
+def _main() -> int:
+    parser = argparse.ArgumentParser(
+        description='Generate VSCode configuration')
+    # where we are being called from
+    parser.add_argument('--build-dir', required=True, help='Build directory')
+    # where the sources are
+    parser.add_argument('--source-dir', required=True, help='Source directory')
+    # launch configuration item, can be multiple
+    parser.add_argument('--launch', action='append',
+                        help='Launch path for executable')
+    parser.epilog = "This script is not meant to be run manually."
+    # parse arguments
+    args = parser.parse_args()
+
+    # canonicalize all paths
+    build_dir = os.path.realpath(args.build_dir)
+    source_dir = os.path.realpath(args.source_dir)
+    if args.launch:
+        launch = [os.path.realpath(lp) for lp in args.launch]
+    else:
+        launch = []
+
+    ctx = ConfigCtx(build_dir, source_dir, launch)
+
+    try:
+        _gen_config(ctx)
+        # we finished configuration successfully, update if needed
+        update_dict = {
+            ctx.settings_path: (ctx.settings_tmp, ctx.settings_changed),
+            ctx.tasks_path: (ctx.tasks_tmp, ctx.tasks_changed),
+            ctx.launch_path: (ctx.launch_tmp, ctx.launch_changed),
+            ctx.analysis_path: (ctx.analysis_tmp, ctx.analysis_changed)
+        }
+        for path, t in update_dict.items():
+            tmp_path, changed = t
+            if changed:
+                _overwrite(tmp_path, path)
+            else:
+                os.unlink(tmp_path)
+
+        return 0
+    except json.JSONDecodeError as e:
+        # remove all temporary files we may have created
+        for tmp in [ctx.settings_tmp, ctx.tasks_tmp, ctx.launch_tmp,
+                    ctx.analysis_tmp]:
+            if os.path.exists(tmp):
+                os.unlink(tmp)
+        # if we fail to load JSON, output error
+        print(f"Error: {e}", file=stderr)
+
+        return 1
+
+
+if __name__ == '__main__':
+    _exit(_main())
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..7d2dc501d6 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,11 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
 get_test_suites_cmd = py3 + files('get-test-suites.py')
 has_hugepages_cmd = py3 + files('has-hugepages.py')
 cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+# Visual Studio Code conf generator always requires build and source root
+vscode_conf_gen_cmd = py3 + files('gen-vscode-conf.py') + [
+        '--build-dir', dpdk_build_root,
+        '--source-dir', dpdk_source_root
+    ]
 
 # install any build tools that end-users might want also
 install_data([
diff --git a/examples/meson.build b/examples/meson.build
index 8e8968a1fa..9e59223d3f 100644
--- a/examples/meson.build
+++ b/examples/meson.build
@@ -124,7 +124,18 @@ foreach example: examples
     if allow_experimental_apis
         cflags += '-DALLOW_EXPERIMENTAL_API'
     endif
-    executable('dpdk-' + name, sources,
+
+    # add to Visual Studio Code launch configuration
+    exe_name = 'dpdk-' + name
+    launch_path = join_paths(meson.current_build_dir(), exe_name)
+    # we don't want to block the build if this command fails
+    result = run_command(vscode_conf_gen_cmd + ['--launch', launch_path], check: false)
+    if result.returncode() != 0
+        warning('Failed to generate Visual Studio Code launch configuration for "' + name + '"')
+        message(result.stderr())
+    endif
+
+    executable(exe_name, sources,
             include_directories: includes,
             link_whole: link_whole_libs,
             link_args: ldflags,
diff --git a/meson.build b/meson.build
index 8b248d4505..df6115d098 100644
--- a/meson.build
+++ b/meson.build
@@ -117,6 +117,17 @@ if meson.is_subproject()
     subdir('buildtools/subproject')
 endif
 
+# if no apps or examples were enabled, no Visual Studio Code config was
+# generated, but we still need build, code analysis etc. configuration to be
+# present, so generate it just in case (it will have no effect if the
+# configuration was already generated by apps/examples). also, when running
+# this command, we don't want to block the build if it fails.
+result = run_command(vscode_conf_gen_cmd, check: false)
+if result.returncode() != 0
+    warning('Failed to generate Visual Studio Code configuration')
+    message(result.stderr())
+endif
+
 # Final output, list all the parts to be built.
 # This does not affect any part of the build, for information only.
 output_message = '\n=================\nApplications Enabled\n=================\n'
-- 
2.43.5


Thread overview: 21+ messages
2024-07-26 12:42 [RFC PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
2024-07-26 12:42 ` [RFC PATCH v1 1/1] devtools: add vscode configuration generator Anatoly Burakov
2024-07-26 15:36   ` Stephen Hemminger
2024-07-26 16:05     ` Burakov, Anatoly
2024-07-29 13:05 ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Anatoly Burakov
2024-07-29 13:05   ` [RFC PATCH v2 1/1] devtools: add vscode configuration generator Anatoly Burakov
2024-07-29 13:14     ` Bruce Richardson
2024-07-29 13:17       ` Burakov, Anatoly
2024-07-29 14:30     ` Bruce Richardson
2024-07-29 16:16       ` Burakov, Anatoly
2024-07-29 16:41         ` Bruce Richardson
2024-07-30  9:21           ` Burakov, Anatoly
2024-07-30 10:31             ` Bruce Richardson
2024-07-30 10:50               ` Burakov, Anatoly
2024-07-30 15:01   ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Bruce Richardson
2024-07-30 15:14     ` Burakov, Anatoly
2024-07-30 15:19       ` Bruce Richardson
2024-07-31 13:33 ` [RFC PATCH v3 " Anatoly Burakov
2024-07-31 13:33   ` Anatoly Burakov [this message]
2024-09-02 12:17 ` [PATCH v1 " Anatoly Burakov
2024-09-02 12:17   ` [PATCH v1 1/1] buildtools: add VSCode configuration generator Anatoly Burakov
