DPDK patches and discussions
* [RFC PATCH v1 0/1] Add Visual Studio Code configuration script
@ 2024-07-26 12:42 Anatoly Burakov
  2024-07-26 12:42 ` [RFC PATCH v1 1/1] devtools: add vscode configuration generator Anatoly Burakov
                   ` (3 more replies)
  0 siblings, 4 replies; 21+ messages in thread
From: Anatoly Burakov @ 2024-07-26 12:42 UTC (permalink / raw)
  To: dev; +Cc: john.mcnamara

Lots of developers (myself included) use Visual Studio Code as their primary
IDE for DPDK development. I have been successfully using various incarnations
of this script internally to quickly set up my development trees whenever I
need a new configuration, so this script is being shared in the hope that it
will be useful both to new developers starting with DPDK and to seasoned DPDK
developers who are already using Visual Studio Code. It makes getting started
with DPDK in Visual Studio Code so much easier!

The philosophy behind this script is as follows:

- The assumption is made that a developer will not be using wildly different
  configurations from build to build - usually, they build the same things,
  work with the same set of apps/drivers for a while, then switch to something
  else, at which point a new configuration is needed
- Some configurations I consider to be "common" are included: debug build, debug
  optimized build, release build with docs, and ASan build
  (feel free to make suggestions here!)
- By default, the script will suggest enabling test, testpmd, and the helloworld
  example
- No drivers are enabled by default - the user needs to explicitly enable them
  (another option could be to leave things as default and build everything, but I
  prefer minimalistic builds as they're faster to compile, and it would be
  semantically weird to have no drivers selected yet all of them being built)
- All parameters that can be adjusted by TUI are also available as command line
  arguments, so while user interaction is the default (using whiptail), it's
  actually not required and can be bypassed.
- I usually work as a local user rather than as root, so by default the script
  will attempt to use "gdbsudo" (a "sudo gdb $@" script in /usr/local/bin) for
  launch tasks, and stop if it is not available (see the snippet right after
  this list).
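
For reference, the "gdbsudo" helper the script expects is just a thin wrapper
(the script prints setup instructions when it cannot find one): a
/usr/local/bin/gdbsudo script containing

  #!/usr/bin/bash
  sudo gdb $@

marked executable with "sudo chmod a+x /usr/local/bin/gdbsudo".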

Currently, it is only possible to define custom per-build configurations, while
any "global" meson settings would have to involve editing the settings.json file.
This can be changed easily if required, but I've never needed this functionality.
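
For illustration, here are a couple of possible invocations (flag names as
defined in the patch below; the exact values are just an example):

  # interactive setup using whiptail dialogs, with default suggestions
  ./devtools/gen-vscode-config.py

  # fully scripted setup: no TUI, gdb used directly, one custom build config
  ./devtools/gen-vscode-config.py --no-ui --no-gdbsudo -B build \
      -a test,testpmd -e helloworld -d net/ice \
      -b "debug,Debug build,--buildtype=debug"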

Please feel free to make any suggestions!

Anatoly Burakov (1):
  devtools: add vscode configuration generator

 devtools/gen-vscode-config.py | 640 ++++++++++++++++++++++++++++++++++
 1 file changed, 640 insertions(+)
 create mode 100755 devtools/gen-vscode-config.py

-- 
2.43.5



* [RFC PATCH v1 1/1] devtools: add vscode configuration generator
  2024-07-26 12:42 [RFC PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
@ 2024-07-26 12:42 ` Anatoly Burakov
  2024-07-26 15:36   ` Stephen Hemminger
  2024-07-29 13:05 ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Anatoly Burakov
                   ` (2 subsequent siblings)
  3 siblings, 1 reply; 21+ messages in thread
From: Anatoly Burakov @ 2024-07-26 12:42 UTC (permalink / raw)
  To: dev; +Cc: john.mcnamara

A lot of developers use Visual Studio Code as their primary IDE. This
script generates a configuration file for VSCode that sets up basic build
tasks, launch tasks, as well as C/C++ code analysis settings that will
take into account compile_commands.json that is automatically generated
by meson.

Files generated by script:
 - .vscode/settings.json: stores variables needed by other files
 - .vscode/tasks.json: defines build tasks
 - .vscode/launch.json: defines launch tasks
 - .vscode/c_cpp_properties.json: defines code analysis settings

The script uses a combination of globbing and meson file parsing to
discover available apps, examples, and drivers, and generates a
project-wide settings file, so that the user can later switch between
debug/release/etc. configurations while keeping their desired apps,
examples, and drivers, built by meson, and ensuring launch configurations
still work correctly whatever the configuration selected.

This script uses whiptail as TUI, which is expected to be universally
available as it is shipped by default on most major distributions.
However, the script is also designed to be scriptable and can be run
without user interaction, and have its configuration supplied from
command-line arguments.

Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
---
 devtools/gen-vscode-config.py | 640 ++++++++++++++++++++++++++++++++++
 1 file changed, 640 insertions(+)
 create mode 100755 devtools/gen-vscode-config.py

diff --git a/devtools/gen-vscode-config.py b/devtools/gen-vscode-config.py
new file mode 100755
index 0000000000..0d291b6c17
--- /dev/null
+++ b/devtools/gen-vscode-config.py
@@ -0,0 +1,640 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 Intel Corporation
+#
+
+"""Visual Studio Code configuration generator script."""
+
+import os
+import json
+import argparse
+import fnmatch
+import shutil
+from typing import List, Dict, Tuple, Any
+from sys import exit as _exit, stderr
+from subprocess import run, CalledProcessError, PIPE
+from mesonbuild import mparser
+from mesonbuild.mesonlib import MesonException
+
+
+class DPDKBuildTask:
+    """A build task for DPDK"""
+
+    def __init__(self, label: str, description: str, param: str):
+        # label as it appears in build configuration
+        self.label = label
+        # description to be given in menu
+        self.description = description
+        # task-specific configuration parameters
+        self.param = param
+
+    def to_json_dict(self) -> Dict[str, Any]:
+        """Generate JSON dictionary for this task"""
+        return {
+            "label": f"Configure {self.label}",
+            "detail": self.description,
+            "type": "shell",
+            "dependsOn": "Remove builddir",
+            "command": f"meson setup ${{config:BUILDCONFIG}} {self.param} ${{config:BUILDDIR}}",
+            "problemMatcher": [],
+            "group": "build"
+        }
+
+
+class CmdlineCtx:
+    """POD class to set up command line parameters"""
+
+    def __init__(self):
+        self.use_ui = False
+        self.use_gdbsudo = False
+        self.build_dir: str = ""
+        self.dpdk_dir: str = ""
+        self.gdb_path: str = ""
+
+        self.avail_configs: List[Tuple[str, str, str]] = []
+        self.avail_apps: List[str] = []
+        self.avail_examples: List[str] = []
+        self.avail_drivers: List[str] = []
+
+        self.enabled_configs: List[Tuple[str, str, str]] = []
+        self.enabled_apps: List[str] = []
+        self.enabled_examples: List[str] = []
+        self.enabled_drivers: List[str] = []
+
+        self.driver_dep_map: Dict[str, List[str]] = {}
+
+
+class DPDKLaunchTask:
+    """A launch task for DPDK"""
+
+    def __init__(self, label: str, exe: str, gdb_path: str):
+        # label as it appears in launch configuration
+        self.label = label
+        # path to executable
+        self.exe = exe
+        self.gdb_path = gdb_path
+
+    def to_json_dict(self) -> Dict[str, Any]:
+        """Generate JSON dictionary for this task"""
+        return {
+            "name": f"Run {self.label}",
+            "type": "cppdbg",
+            "request": "launch",
+            "program": f"${{config:BUILDDIR}}/{self.exe}",
+            "args": [],
+            "stopAtEntry": False,
+            "cwd": "${workspaceFolder}",
+            "externalConsole": False,
+            "preLaunchTask": "Build",
+            "MIMode": "gdb",
+            "miDebuggerPath": self.gdb_path,
+            "setupCommands": [
+                {
+                    "description": "Enable pretty-printing for gdb",
+                    "text": "-gdb-set print pretty on",
+                    "ignoreFailures": True
+                }
+            ]
+        }
+
+
+class VSCodeConfig:
+    """Configuration for VSCode"""
+
+    def __init__(self, builddir: str, commoncfg: str):
+        # where will our build dir be located
+        self.builddir = builddir
+        # meson configuration common to all configs
+        self.commonconfig = commoncfg
+        # meson build configurations
+        self.build_tasks: List[DPDKBuildTask] = []
+        # meson launch configurations
+        self.launch_tasks: List[DPDKLaunchTask] = []
+
+    def settings_to_json_dict(self) -> Dict[str, Any]:
+        """Generate settings.json"""
+        return {
+            "BUILDDIR": self.builddir,
+            "BUILDCONFIG": self.commonconfig,
+        }
+
+    def tasks_to_json_dict(self) -> Dict[str, Any]:
+        """Generate tasks.json"""
+        # generate outer layer
+        build_tasks: Dict[str, Any] = {
+            "version": "2.0.0",
+            "tasks": []
+        }
+        # generate inner layer
+        tasks = build_tasks["tasks"]
+        # add common tasks
+        tasks.append({
+            "label": "Remove builddir",
+            "type": "shell",
+            "command": "rm -rf ${config:BUILDDIR}",
+        })
+        tasks.append({
+            "label": "Build",
+            "detail": "Run build command",
+            "type": "shell",
+            "command": "ninja",
+            "options": {
+                "cwd": "${config:BUILDDIR}"
+            },
+            "problemMatcher": {
+                "base": "$gcc",
+                "fileLocation": ["relative", "${config:BUILDDIR}"]
+            },
+            "group": "build"
+        })
+        # now, add generated tasks
+        tasks.extend([task.to_json_dict() for task in self.build_tasks])
+
+        # we're done
+        return build_tasks
+
+    def launch_to_json_dict(self) -> Dict[str, Any]:
+        """Generate launch.json"""
+        return {
+            "version": "0.2.0",
+            "configurations": [task.to_json_dict() for task in self.launch_tasks]
+        }
+
+    def c_cpp_properties_to_json_dict(self) -> Dict[str, Any]:
+        """Generate c_cpp_properties.json"""
+        return {
+            "configurations": [
+                {
+                    "name": "Linux",
+                    "includePath": [
+                        "${config:BUILDDIR}/",
+                        "${workspaceFolder}/lib/eal/x86",
+                        "${workspaceFolder}/lib/eal/linux",
+                        "${workspaceFolder}/**"
+                    ],
+                    "compilerPath": "/usr/bin/gcc",
+                    "cStandard": "c99",
+                    "cppStandard": "c++17",
+                    "intelliSenseMode": "${default}",
+                    "compileCommands": "${config:BUILDDIR}/compile_commands.json"
+                }
+            ],
+            "version": 4
+        }
+
+
+def _whiptail_checklist(prompt: str, labels: List[str],
+                        descriptions: List[str],
+                        checked: List[bool]) -> List[str]:
+    """Display a checklist and get user input."""
+    # build whiptail checklist
+    checklist = [
+        (label, desc, "on" if checked[i] else "off")
+        for i, (label, desc) in enumerate(zip(labels, descriptions))
+    ]
+    # flatten the list
+    checklist = [item for sublist in checklist for item in sublist]
+    # build whiptail arguments
+    args = [
+        "whiptail", "--separate-output", "--checklist",
+        prompt, "15", "80", "10"
+    ] + checklist
+
+    try:
+        result = run(args, stderr=PIPE, check=True)
+    except CalledProcessError:
+        # user probably pressed cancel, so bail out
+        _exit(1)
+    # capture selected options
+    selected = result.stderr.decode().strip().split()
+    return selected
+
+
+def _whiptail_inputbox(prompt: str, default: str = "") -> str:
+    """Display an input box and get user input."""
+    args = [
+        "whiptail", "--inputbox",
+        prompt, "10", "70", default
+    ]
+    result = run(args, stderr=PIPE, check=True)
+    return result.stderr.decode().strip()
+
+
+def _get_enabled_configurations(configs: List[Tuple[str, str, str]],
+                                enabled: List[Tuple[str, str, str]]) \
+        -> List[Tuple[str, str, str]]:
+    """Ask user which build configurations they want."""
+    stop = False
+    while not stop:
+        labels = [task[0] for task in configs]
+        descriptions = [task[1] for task in configs]
+        checked = [c in enabled for c in configs]
+        # when interacting using UI, allow user to specify one custom meson
+        # item
+        labels += ["add"]
+        descriptions += ["Add new option"]
+        checked += [False]
+
+        # ask user to select options
+        selected = _whiptail_checklist("Select build configurations to enable:",
+                                       labels, descriptions, checked)
+
+        # enable all previously existing selected configs
+        enabled.clear()
+        for task in configs:
+            if task[0] in selected:
+                # enable this task
+                enabled.append(task)
+        # if user selected "add", ask for custom meson configuration
+        if "add" in selected:
+            custom_label = _whiptail_inputbox(
+                "Enter custom meson configuration label:")
+            custom_description = _whiptail_inputbox(
+                "Enter custom meson configuration description:")
+            custom_mesonstr = _whiptail_inputbox(
+                "Enter custom meson configuration string:")
+            new_task = (custom_label, custom_description, custom_mesonstr)
+            configs += [new_task]
+            # enable the new configuration
+            enabled += [new_task]
+        else:
+            stop = True
+    # return our list of enabled configurations
+    return enabled
+
+
+def _get_enabled_list(apps: List[str], enabled: List[str]) -> List[str]:
+    """Display a list of items, optionally some enabled by default."""
+    checked = [app in enabled for app in apps]
+
+    # ask user to select options
+    selected = _whiptail_checklist("Select apps to enable:",
+                                   apps, ["" for _ in apps], checked)
+
+    return selected
+
+
+def _extract_var(path: str, var: str) -> Any:
+    """Extract a variable from a meson.build file."""
+    try:
+        # we don't want to deal with multiline variable assignments
+        # so just read entire file in one go
+        with open(path, 'r', encoding='utf-8') as file:
+            content = file.read()
+        parser = mparser.Parser(content, path)
+        ast = parser.parse()
+
+        for node in ast.lines:
+            # we're only interested in variable assignments
+            if not isinstance(node, mparser.AssignmentNode):
+                continue
+            # we're only interested in the variable we're looking for
+            if node.var_name.value != var:
+                continue
+            # we're expecting string or array
+            if isinstance(node.value, mparser.StringNode):
+                return node.value.value
+            if isinstance(node.value, mparser.ArrayNode):
+                return [item.value for item in node.value.args.arguments]
+    except (MesonException, FileNotFoundError):
+        return []
+    return None
+
+
+def _update_ctx_from_ui(ctx: CmdlineCtx) -> int:
+    """Use whiptail dialogs to update context contents."""
+    try:
+        # update build dir
+        ctx.build_dir = _whiptail_inputbox(
+            "Enter build directory:", ctx.build_dir)
+
+        # update configs
+        ctx.enabled_configs = _get_enabled_configurations(
+            ctx.avail_configs, ctx.enabled_configs)
+
+        # update enabled apps, examples, and drivers
+        ctx.enabled_apps = _get_enabled_list(ctx.avail_apps, ctx.enabled_apps)
+        ctx.enabled_examples = _get_enabled_list(
+            ctx.avail_examples, ctx.enabled_examples)
+        ctx.enabled_drivers = _get_enabled_list(
+            ctx.avail_drivers, ctx.enabled_drivers)
+
+        return 0
+    except CalledProcessError:
+        # user probably pressed cancel, so bail out
+        return 1
+
+
+def _build_configs(ctx: CmdlineCtx) -> None:
+    # if builddir is a relative path, make it absolute from DPDK root
+    if not os.path.isabs(ctx.build_dir):
+        ctx.build_dir = os.path.realpath(
+            os.path.join(ctx.dpdk_dir, ctx.build_dir))
+
+    # first, build our common meson param string
+    common_param = ""
+    # if no apps enabled, disable all apps, otherwise they get built by default
+    if not ctx.enabled_apps:
+        common_param += " -Ddisable_apps=*"
+    else:
+        common_param += f" -Denable_apps={','.join(ctx.enabled_apps)}"
+    # examples don't get built unless the user asks
+    if ctx.enabled_examples:
+        common_param += f" -Dexamples={','.join(ctx.enabled_examples)}"
+    # if no drivers enabled, disable all drivers, otherwise they get built by
+    # default
+    if not ctx.enabled_drivers:
+        common_param += " -Ddisable_drivers=*/*"
+    else:
+        common_param += f" -Denable_drivers={','.join(ctx.enabled_drivers)}"
+
+    # create build tasks
+    build_tasks = [DPDKBuildTask(l, d, m) for l, d, m in ctx.enabled_configs]
+
+    # create launch tasks
+    launch_tasks: List[DPDKLaunchTask] = []
+    for app in ctx.enabled_apps:
+        label = app
+        exe = os.path.join("app", f"dpdk-{app}")
+        launch_tasks.append(DPDKLaunchTask(label, exe, ctx.gdb_path))
+    for app in ctx.enabled_examples:
+        # examples may have complex paths but they always flatten
+        label = os.path.basename(app)
+        exe = os.path.join("examples", f"dpdk-{label}")
+        launch_tasks.append(DPDKLaunchTask(label, exe, ctx.gdb_path))
+
+    # build our config
+    vscode_cfg = VSCodeConfig(ctx.build_dir, common_param)
+    vscode_cfg.build_tasks = build_tasks
+    vscode_cfg.launch_tasks = launch_tasks
+
+    # we're done! now, create .vscode directory
+    os.makedirs(os.path.join(ctx.dpdk_dir, ".vscode"), exist_ok=True)
+
+    # ...and create VSCode configuration
+    print("Creating VSCode configuration files...")
+    config_root = os.path.join(ctx.dpdk_dir, ".vscode")
+    func_map = {
+        "settings.json": vscode_cfg.settings_to_json_dict,
+        "tasks.json": vscode_cfg.tasks_to_json_dict,
+        "launch.json": vscode_cfg.launch_to_json_dict,
+        "c_cpp_properties.json": vscode_cfg.c_cpp_properties_to_json_dict
+    }
+    for filename, func in func_map.items():
+        with open(os.path.join(config_root, filename), "w", encoding="utf-8") as f:
+            print(f"Writing {filename}...")
+            f.write(json.dumps(func(), indent=4))
+    print("Done!")
+
+
+def _process_ctx(ctx: CmdlineCtx) -> None:
+    """Map command-line enabled options to available options."""
+    # for each enabled app, see if it's a wildcard and if so, do a wildcard
+    # match
+    for app in ctx.enabled_apps[:]:
+        if "*" in app:
+            ctx.enabled_apps.remove(app)
+            ctx.enabled_apps.extend(fnmatch.filter(ctx.avail_apps, app))
+    # do the same with examples
+    for example in ctx.enabled_examples[:]:
+        if "*" in example:
+            ctx.enabled_examples.remove(example)
+            ctx.enabled_examples.extend(
+                fnmatch.filter(ctx.avail_examples, example))
+    # do the same with drivers
+    for driver in ctx.enabled_drivers[:]:
+        if "*" in driver:
+            ctx.enabled_drivers.remove(driver)
+            ctx.enabled_drivers.extend(
+                fnmatch.filter(ctx.avail_drivers, driver))
+
+    # due to wildcard, there may be dupes, so sort(set()) everything
+    ctx.enabled_apps = sorted(set(ctx.enabled_apps))
+    ctx.enabled_examples = sorted(set(ctx.enabled_examples))
+    ctx.enabled_drivers = sorted(set(ctx.enabled_drivers))
+
+
+def _resolve_deps(ctx: CmdlineCtx) -> None:
+    """Resolve driver dependencies."""
+    for driver in ctx.enabled_drivers[:]:
+        ctx.enabled_drivers.extend(ctx.driver_dep_map.get(driver, []))
+    ctx.enabled_drivers = sorted(set(ctx.enabled_drivers))
+
+
+def _discover_ctx(ctx: CmdlineCtx) -> int:
+    """Discover available apps/drivers etc. from DPDK."""
+    # find out where DPDK root is located
+    _self = os.path.realpath(__file__)
+    dpdk_root = os.path.realpath(os.path.join(os.path.dirname(_self), ".."))
+    ctx.dpdk_dir = dpdk_root
+
+    # find gdb path
+    if ctx.use_gdbsudo:
+        gdb = "gdbsudo"
+    else:
+        gdb = "gdb"
+    ctx.gdb_path = shutil.which(gdb)
+    if not ctx.gdb_path:
+        print(f"Error: Cannot find {gdb} in PATH!", file=stderr)
+        return 1
+
+    # we want to extract information from DPDK build files, but we don't have a
+    # good way of doing it without already having a meson build directory. for
+    # some things we can use meson AST parsing to extract this information, but
+    # for drivers extracting this information is not straightforward because
+    # they have complex build-time logic to determine which drivers need to be
+    # built (e.g. qat). so, we'll use meson AST for apps and examples, but for
+    # drivers we'll do it the old-fashioned way: by globbing directories.
+
+    apps: List[str] = []
+    examples: List[str] = []
+    drivers: List[str] = []
+
+    app_root = os.path.join(dpdk_root, "app")
+    examples_root = os.path.join(dpdk_root, "examples")
+    drivers_root = os.path.join(dpdk_root, "drivers")
+
+    apps = _extract_var(os.path.join(app_root, "meson.build"), "apps")
+    # special case for apps: test isn't added by default
+    apps.append("test")
+    # some apps will have overridden names using 'name' variable, extract it
+    for i, app in enumerate(apps[:]):
+        new_name = _extract_var(os.path.join(
+            app_root, app, "meson.build"), "name")
+        if new_name:
+            apps[i] = new_name
+
+    # examples don't have any special cases
+    examples = _extract_var(os.path.join(
+        examples_root, "meson.build"), "all_examples")
+
+    for root, _, _ in os.walk(drivers_root):
+        # some directories are drivers, while some are there simply to
+        # organize source in a certain way (e.g. base drivers), so we're
+        # going to cheat a little and only consider directories that have
+        # exactly two levels (e.g. net/ixgbe) and no others.
+        if root == drivers_root:
+            continue
+        rel_root = os.path.relpath(root, drivers_root)
+        if len(rel_root.split(os.sep)) != 2:
+            continue
+        category = os.path.dirname(rel_root)
+        # see if there's a name override
+        name = os.path.basename(rel_root)
+        new_name = _extract_var(os.path.join(root, "meson.build"), "name")
+        if new_name:
+            name = new_name
+        driver_name = os.path.join(category, name)
+        drivers.append(driver_name)
+
+        # some drivers depend on other drivers, so parse these dependencies
+        # using the "deps" variable
+        deps: List[str] = _extract_var(
+            os.path.join(root, "meson.build"), "deps")
+        if not deps:
+            continue
+        for dep in deps:
+            # by convention, drivers are named as <category>_<name>, so we can
+            # infer that dependency is a driver if it has an underscore
+            if not "_" in dep:
+                continue
+            dep_driver = dep.replace("_", "/")
+            ctx.driver_dep_map.setdefault(driver_name, []).append(dep_driver)
+
+    # sort all lists alphabetically
+    apps.sort()
+    examples.sort()
+    drivers.sort()
+
+    # save all of this information into our context
+    ctx.avail_apps = apps
+    ctx.avail_examples = examples
+    ctx.avail_drivers = drivers
+
+    return 0
+
+
+def _main() -> int:
+    """Parse command line arguments and direct program flow."""
+    # this is primarily a TUI script, but we also want to be able to automate
+    # everything, or set defaults to enhance user interaction and
+    # customization.
+
+    # valid parameters:
+    # --no-ui: run without any user interaction
+    # --no-gdbsudo: set up launch targets to use gdb directly
+    # --no-defaults: do not add default build configurations
+    # --help: show help message
+    # -B/--build-dir: set build directory
+    # -b/--build-configs: set default build configurations
+    #                     format: <label> <description> <meson-param>
+    #                     can be specified multiple times
+    # -a/--apps: comma-separated list of enabled apps
+    # -e/--examples: comma-separated list of enabled examples
+    # -d/--drivers: comma-separated list of enabled drivers
+    ap = argparse.ArgumentParser(
+        description="Generate VSCode configuration for DPDK")
+    ap.add_argument("--no-ui", action="store_true",
+                    help="Run without any user interaction")
+    ap.add_argument("--no-gdbsudo", action="store_true",
+                    help="Set up launch targets to use gdb directly")
+    ap.add_argument("--no-defaults", action="store_true",
+                    help="Do not enable built-in build configurations")
+    ap.add_argument("-B", "--build-dir", default="build",
+                    help="Set build directory")
+    ap.add_argument("-b", "--build-config", action="append", default=[],
+                    help="Comma-separated build task configuration of format [label,description,meson setup arguments]")
+    ap.add_argument("-a", "--apps", default="",
+                    help="Comma-separated list of enabled apps (wildcards accepted)")
+    ap.add_argument("-e", "--examples", default="",
+                    help="Comma-separated list of enabled examples (wildcards accepted)")
+    ap.add_argument("-d", "--drivers", default="",
+                    help="Comma-separated list of enabled drivers (wildcards accepted)")
+    ap.epilog = """\
+When script is run in interactive mode, parameters will be used to set up dialog defaults. \
+Otherwise, they will be used to create configuration directly."""
+    args = ap.parse_args()
+
+    def_configs = [
+        ("debug", "Debug build", "--buildtype=debug"),
+        ("debugopt", "Debug optimized build", "--buildtype=debugoptimized"),
+        ("release", "Release build", "--buildtype=release -Denable_docs=true"),
+        ("asan", "Address sanitizer build",
+         "--buildtype=debugoptimized -Db_sanitize=address -Db_lundef=false"),
+    ]
+    def_apps = [
+        "test", "testpmd"
+    ]
+    def_examples = [
+        "helloworld"
+    ]
+    # parse build configs
+    arg_configs = []
+    for c in args.build_config:
+        parts = c.split(",")
+        if len(parts) != 3:
+            print(
+                f"Error: Invalid build configuration format: {c}", file=stderr)
+            return 1
+        arg_configs.append(tuple(parts))
+
+    # set up command line context. all wildcards will be passed directly to _main, and will be
+    # resolved later, when we have a list of things to enable/disable.
+    ctx = CmdlineCtx()
+    ctx.use_ui = not args.no_ui
+    ctx.use_gdbsudo = not args.no_gdbsudo
+    ctx.build_dir = args.build_dir
+    ctx.enabled_apps = args.apps.split(",") if args.apps else []
+    ctx.enabled_examples = args.examples.split(",") if args.examples else []
+    ctx.enabled_drivers = args.drivers.split(",") if args.drivers else []
+    ctx.enabled_configs = arg_configs
+    ctx.avail_configs = def_configs + ctx.enabled_configs
+
+    if not args.no_defaults:
+        # enable default configs
+        ctx.enabled_configs.extend(def_configs)
+
+        # for apps and examples, we only want to add defaults if
+        # user didn't directly specify anything
+        if not ctx.enabled_apps:
+            ctx.enabled_apps.extend(def_apps)
+        if not ctx.enabled_examples:
+            ctx.enabled_examples.extend(def_examples)
+
+    # if UI interaction is requested, check if whiptail is installed
+    if ctx.use_ui and os.system("which whiptail > /dev/null 2>&1") != 0:
+        print(
+            "whiptail is not installed! Please install it and try again.",
+            file=stderr)
+        return 1
+
+    # check if gdbsudo is available
+    if ctx.use_gdbsudo and os.system("which gdbsudo > /dev/null 2>&1") != 0:
+        print(
+            "Generated configuration will use gdbsudo script to run applications.",
+            file=stderr)
+        print(
+            "If you want to use gdb directly, please run with --no-gdbsudo argument.",
+            file=stderr)
+        print(
+            "Otherwise, run the following snippet in your terminal and try again:",
+            file=stderr)
+        print("""sudo tee <<EOF /usr/local/bin/gdbsudo &> /dev/null
+        #!/usr/bin/bash
+        sudo gdb $@
+        EOF
+        sudo chmod a+x /usr/local/bin/gdbsudo""", file=stderr)
+        return 1
+
+    _discover_ctx(ctx)
+    _process_ctx(ctx)
+    if ctx.use_ui and _update_ctx_from_ui(ctx):
+        return 1
+    _resolve_deps(ctx)
+    _build_configs(ctx)
+
+    return 0
+
+
+if __name__ == "__main__":
+    _exit(_main())
-- 
2.43.5



* Re: [RFC PATCH v1 1/1] devtools: add vscode configuration generator
  2024-07-26 12:42 ` [RFC PATCH v1 1/1] devtools: add vscode configuration generator Anatoly Burakov
@ 2024-07-26 15:36   ` Stephen Hemminger
  2024-07-26 16:05     ` Burakov, Anatoly
  0 siblings, 1 reply; 21+ messages in thread
From: Stephen Hemminger @ 2024-07-26 15:36 UTC (permalink / raw)
  To: Anatoly Burakov; +Cc: dev, john.mcnamara

On Fri, 26 Jul 2024 13:42:56 +0100
Anatoly Burakov <anatoly.burakov@intel.com> wrote:

> A lot of developers use Visual Studio Code as their primary IDE. This
> script generates a configuration file for VSCode that sets up basic build
> tasks, launch tasks, as well as C/C++ code analysis settings that will
> take into account compile_commands.json that is automatically generated
> by meson.
> 
> Files generated by script:
>  - .vscode/settings.json: stores variables needed by other files
>  - .vscode/tasks.json: defines build tasks
>  - .vscode/launch.json: defines launch tasks
>  - .vscode/c_cpp_properties.json: defines code analysis settings
> 
> The script uses a combination of globbing and meson file parsing to
> discover available apps, examples, and drivers, and generates a
> project-wide settings file, so that the user can later switch between
> debug/release/etc. configurations while keeping their desired apps,
> examples, and drivers, built by meson, and ensuring launch configurations
> still work correctly whatever the configuration selected.
> 
> This script uses whiptail as TUI, which is expected to be universally
> available as it is shipped by default on most major distributions.
> However, the script is also designed to be scriptable and can be run
> without user interaction, and have its configuration supplied from
> command-line arguments.
> 
> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>

The TUI doesn't matter much since I would expect this gets run
100% on Windows.

In general looks good, you might want to address
$ flake8 ./devtools/gen-vscode-config.py  --max-line 100
./devtools/gen-vscode-config.py:352:47: E741 ambiguous variable name 'l'
./devtools/gen-vscode-config.py:499:16: E713 test for membership should be 'not in'
./devtools/gen-vscode-config.py:546:101: E501 line too long (120 > 100 characters)


* Re: [RFC PATCH v1 1/1] devtools: add vscode configuration generator
  2024-07-26 15:36   ` Stephen Hemminger
@ 2024-07-26 16:05     ` Burakov, Anatoly
  0 siblings, 0 replies; 21+ messages in thread
From: Burakov, Anatoly @ 2024-07-26 16:05 UTC (permalink / raw)
  To: Stephen Hemminger; +Cc: dev, john.mcnamara

On 7/26/2024 5:36 PM, Stephen Hemminger wrote:
> On Fri, 26 Jul 2024 13:42:56 +0100
> Anatoly Burakov <anatoly.burakov@intel.com> wrote:
> 
>> A lot of developers use Visual Studio Code as their primary IDE. This
>> script generates a configuration file for VSCode that sets up basic build
>> tasks, launch tasks, as well as C/C++ code analysis settings that will
>> take into account compile_commands.json that is automatically generated
>> by meson.
>>
>> Files generated by script:
>>   - .vscode/settings.json: stores variables needed by other files
>>   - .vscode/tasks.json: defines build tasks
>>   - .vscode/launch.json: defines launch tasks
>>   - .vscode/c_cpp_properties.json: defines code analysis settings
>>
>> The script uses a combination of globbing and meson file parsing to
>> discover available apps, examples, and drivers, and generates a
>> project-wide settings file, so that the user can later switch between
>> debug/release/etc. configurations while keeping their desired apps,
>> examples, and drivers, built by meson, and ensuring launch configurations
>> still work correctly whatever the configuration selected.
>>
>> This script uses whiptail as TUI, which is expected to be universally
>> available as it is shipped by default on most major distributions.
>> However, the script is also designed to be scriptable and can be run
>> without user interaction, and have its configuration supplied from
>> command-line arguments.
>>
>> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
> 
> The TUI doesn't matter much since I would expect this gets run
> 100% on Windows.

I run it on Linux using Remote SSH, and that's the primary target 
audience as far as I'm concerned (a lot of people do the same at our 
office). Just in case it wasn't clear, this is not for *Visual Studio* 
the Windows IDE, this is for *Visual Studio Code* the cross-platform 
code editor.

I didn't actually think of testing this on Windows. I assume Windows 
doesn't have whiptail, so this will most likely refuse to run in TUI 
mode (unless run under WSL - I assume WSL ships whiptail).

> 
> In general looks good, you might want to address
> $ flake8 ./devtools/gen-vscode-config.py  --max-line 100
> ./devtools/gen-vscode-config.py:352:47: E741 ambiguous variable name 'l'
> ./devtools/gen-vscode-config.py:499:16: E713 test for membership should be 'not in'
> ./devtools/gen-vscode-config.py:546:101: E501 line too long (120 > 100 characters)

Thanks, I had Pylance linter but not flake8.

-- 
Thanks,
Anatoly



* [RFC PATCH v2 0/1] Add Visual Studio Code configuration script
  2024-07-26 12:42 [RFC PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
  2024-07-26 12:42 ` [RFC PATCH v1 1/1] devtools: add vscode configuration generator Anatoly Burakov
@ 2024-07-29 13:05 ` Anatoly Burakov
  2024-07-29 13:05   ` [RFC PATCH v2 1/1] devtools: add vscode configuration generator Anatoly Burakov
  2024-07-30 15:01   ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Bruce Richardson
  2024-07-31 13:33 ` [RFC PATCH v3 " Anatoly Burakov
  2024-09-02 12:17 ` [PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
  3 siblings, 2 replies; 21+ messages in thread
From: Anatoly Burakov @ 2024-07-29 13:05 UTC (permalink / raw)
  To: dev; +Cc: john.mcnamara

Lots of developers (myself included) use Visual Studio Code as their primary
IDE for DPDK development. I have been successfully using various incarnations of
this script internally to quickly set up my development trees whenever I need a
new configuration, so this script is being shared in the hope that it will be
useful both to new developers starting with DPDK and to seasoned DPDK
developers who are already using Visual Studio Code. It makes getting started
with DPDK in Visual Studio Code so much easier!

** NOTE: Currently, only x86 configuration is generated as I have no way to test
   the code analysis configuration on any other platforms.

** NOTE 2: this is not for *Visual Studio* the Windows IDE, this is for *Visual
   Studio Code* the cross-platform code editor. Specifically, the main target
   audience for this script is people who either run DPDK directly on their
   Linux machine, or who use Remote SSH functionality to connect to a remote
   Linux machine and set up their VSCode build there. No other OSes are
   currently supported by the script.

(if you're unaware of what Remote SSH is, I highly suggest checking it out [1])

The philosophy behind this script is as follows:

- The assumption is made that a developer will not be using wildly different
  configurations from build to build - usually, they build the same things, work
  with the same set of apps/drivers for a while, then switch to something else,
  at which point a new configuration is needed

- Some configurations I consider to be "common" are included: debug build, debug
  optimized build, release build with docs, and ASan build (feel free to make
  suggestions here!)

- By default, the script will not add any meson flags unless the user requests
  them; however, it will create launch configurations for all apps, because not
  specifying any flags leads to all apps being enabled

- All parameters that can be adjusted by TUI are also available as command line
  arguments, so while user interaction is the default (using whiptail), it's
  actually not required and can be bypassed

- I usually work as a local user rather than as root, so by default the script
  will attempt to use "gdbsudo" (a "sudo gdb $@" script in /usr/local/bin) for
  launch tasks, unless the user specifically requests using gdb or is running
  as root

With the above assumptions, the script is expected to work for basic DPDK
configuration, as well as allow scripting any and all parts of it:

- Any build configurations specified on the command-line will be added to the
  interactive configuration selector

- Any apps/examples/drivers specified on the command-line will be selected by
  default in the interactive configuration selector

- Any custom meson strings will also be filled in by default

- When --no-ui is passed, all of the above parameters will be used to generate
  the config as if the user had selected them (see the example below)
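
As an example, a fully scripted run might look like this (an illustrative
invocation; flag names as defined in the patch below):

  ./devtools/gen-vscode-config.py --no-ui -B build \
      -a test,testpmd -e helloworld -d "net/*"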

There are a few usability issues with the script that are very difficult, if
not impossible, to solve for all cases, so a few other decisions have been made
to hopefully address these issues for most cases.

For starters, we can only know which apps/examples will get built after we
create a build directory and process dependencies; we cannot infer this from our
script. Therefore, by default all apps will get launch configurations created,
even though these apps might not be built later. We could add another step after
the build to check the config for missing apps and remove them, but it's
difficult to do that without destroying any custom configuration the user might
have created - we already overwrite the generated configuration as it is!

More importantly, our build system supports wildcards - for example, we can
request "net/*" drivers to be built, and they will be built on a best-effort
basis; that is, if a driver lacks a dependency, it will get ignored without
any errors. For this script, a design decision has been made to handle internal
driver dependencies (such as "net/ice" depending on "common/iavf") automatically
to enhance user experience, until such time as the build system can do it
better. However, the downside of this decision is that we need to handle
dependencies between drivers when the user specifies wildcards on the
command-line, which can be solved in three ways:

* Not allow wildcards at all
* Allow wildcards and always expand them
* Allow wildcards and be smarter about handling them

The first option is, IMO, the inferior one: wildcards are convenient and quick
to work with, so I would rather leave them as an option available to the user.

The second option is nice, but it has a fundamental issue: when we expand very
broad wildcards like "net/*", we will inevitably cover a lot of drivers for
which we do not have the dependencies, but since we're now explicitly requesting
them, the build will error out if we requested something we didn't necessarily
expect to be built. For this reason, I decided against this option.

The third option is what I went with, with "smarter" being defined as follows:

* The user is allowed to use dialogs to edit the configuration generated from
  parsing the wildcard: if the user changed something, we cannot keep the
  wildcard any more, and we assume the user knows what they're doing and is OK
  with explicitly requesting compilation for the drivers they selected. So, if
  the user didn't change anything in the dialog, we keep the wildcard; otherwise
  we expand it.

* If, by the time we get to resolving driver dependencies, we have wildcards in
  our driver param string, we see which drivers match each wildcard, and add
  wildcards for their dependencies. For example, if "net/ice" requires
  "common/iavf", and we have a "net/*" wildcard, one of the dependencies that we
  will add is "common/*". This behavior is, IMO, far better than the default one
  from our build system, where if a driver matches a wildcard but cannot be
  built due to another internal dependency not being enabled (e.g. if "net/ice"
  is requested but "common/iavf" isn't), the build will fail to configure even
  though it would've been possible to build it otherwise (see the illustration
  right after this list)
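
To make the wildcard example above concrete: with an untouched "net/*" wildcard,
the generated meson string would end up containing something along the lines of
(illustrative, not verbatim script output):

  -Denable_drivers=net/*,common/*

whereas explicitly selecting "net/ice" would pull in its dependency explicitly:

  -Denable_drivers=net/ice,common/iavf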

So, explicitly enabled drivers get explicit dependencies, and implicitly enabled
drivers get implicit dependencies. The resulting build will be bigger than when
using the meson command line directly, but if the user is worried about build
size, they can customize it via common meson parameters, as well as by being
more granular about requested apps/examples/drivers. Thus, we address the
"simple" use case of "let's build everything by default", we handle some common
use cases smarter than we otherwise would have, and we allow the user to be as
in-depth as they want by letting them specify explicit meson command strings. I
feel like this is a good compromise between usability and robustness.

Please feel free to make any suggestions!

[1] https://code.visualstudio.com/docs/remote/ssh

Anatoly Burakov (1):
  devtools: add vscode configuration generator

 devtools/gen-vscode-config.py | 871 ++++++++++++++++++++++++++++++++++
 1 file changed, 871 insertions(+)
 create mode 100755 devtools/gen-vscode-config.py

-- 
2.43.5



* [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-29 13:05 ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Anatoly Burakov
@ 2024-07-29 13:05   ` Anatoly Burakov
  2024-07-29 13:14     ` Bruce Richardson
  2024-07-29 14:30     ` Bruce Richardson
  2024-07-30 15:01   ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Bruce Richardson
  1 sibling, 2 replies; 21+ messages in thread
From: Anatoly Burakov @ 2024-07-29 13:05 UTC (permalink / raw)
  To: dev; +Cc: john.mcnamara

A lot of developers use Visual Studio Code as their primary IDE. This
script generates a configuration file for VSCode that sets up basic build
tasks, launch tasks, as well as C/C++ code analysis settings that will
take into account compile_commands.json that is automatically generated
by meson.

Files generated by script:
 - .vscode/settings.json: stores variables needed by other files
 - .vscode/tasks.json: defines build tasks
 - .vscode/launch.json: defines launch tasks
 - .vscode/c_cpp_properties.json: defines code analysis settings

The script uses a combination of globbing and meson file parsing to
discover available apps, examples, and drivers, and generates a
project-wide settings file, so that the user can later switch between
debug/release/etc. configurations while keeping their desired apps,
examples, and drivers, built by meson, and ensuring launch configurations
still work correctly whatever the configuration selected.

This script uses whiptail as TUI, which is expected to be universally
available as it is shipped by default on most major distributions.
However, the script is also designed to be scriptable and can be run
without user interaction, and have its configuration supplied from
command-line arguments.

Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
---

Notes:
    RFCv1 -> RFCv2:
    
    - No longer disable apps and drivers if nothing was specified via command line
      or TUI, and warn the user about things being built by default
    - Generate app launch configuration by default for when no apps are selected
    - Added parameters:
      - --force to avoid overwriting existing config
      - --common-conf to specify global meson flags applicable to all configs
      - --gdbsudo/--no-gdbsudo to specify gdbsudo behavior
    - Autodetect gdbsudo/gdb from UID
    - Updated comments, error messages, fixed issues with user interaction
    - Improved handling of wildcards and driver dependencies
    - Fixed a few bugs in dependency detection due to incorrect parsing
    - [Stephen] flake8 is happy

 devtools/gen-vscode-config.py | 871 ++++++++++++++++++++++++++++++++++
 1 file changed, 871 insertions(+)
 create mode 100755 devtools/gen-vscode-config.py

diff --git a/devtools/gen-vscode-config.py b/devtools/gen-vscode-config.py
new file mode 100755
index 0000000000..f0d6044c1b
--- /dev/null
+++ b/devtools/gen-vscode-config.py
@@ -0,0 +1,871 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 Intel Corporation
+#
+
+"""Visual Studio Code configuration generator script."""
+
+import os
+import json
+import argparse
+import fnmatch
+import shutil
+from typing import List, Dict, Tuple, Any
+from sys import exit as _exit, stderr
+from subprocess import run, CalledProcessError, PIPE
+from mesonbuild import mparser
+from mesonbuild.mesonlib import MesonException
+
+
+class DPDKBuildTask:
+    """A build task for DPDK"""
+
+    def __init__(self, label: str, description: str, param: str):
+        # label as it appears in build configuration
+        self.label = label
+        # description to be given in menu
+        self.description = description
+        # task-specific configuration parameters
+        self.param = param
+
+    def to_json_dict(self) -> Dict[str, Any]:
+        """Generate JSON dictionary for this task"""
+        return {
+            "label": f"Configure {self.label}",
+            "detail": self.description,
+            "type": "shell",
+            "dependsOn": "Remove builddir",
+            # take configuration from settings.json using config: namespace
+            "command": f"meson setup ${{config:BUILDCONFIG}} " \
+                       f"{self.param} ${{config:BUILDDIR}}",
+            "problemMatcher": [],
+            "group": "build"
+        }
+
+
+class DPDKLaunchTask:
+    """A launch task for DPDK"""
+
+    def __init__(self, label: str, exe: str, gdb_path: str):
+        # label as it appears in launch configuration
+        self.label = label
+        # path to executable
+        self.exe = exe
+        self.gdb_path = gdb_path
+
+    def to_json_dict(self) -> Dict[str, Any]:
+        """Generate JSON dictionary for this task"""
+        return {
+            "name": f"Run {self.label}",
+            "type": "cppdbg",
+            "request": "launch",
+            # take configuration from settings.json using config: namespace
+            "program": f"${{config:BUILDDIR}}/{self.exe}",
+            "args": [],
+            "stopAtEntry": False,
+            "cwd": "${workspaceFolder}",
+            "externalConsole": False,
+            "preLaunchTask": "Build",
+            "MIMode": "gdb",
+            "miDebuggerPath": self.gdb_path,
+            "setupCommands": [
+                {
+                    "description": "Enable pretty-printing for gdb",
+                    "text": "-gdb-set print pretty on",
+                    "ignoreFailures": True
+                }
+            ]
+        }
+
+
+class VSCodeConfig:
+    """Configuration for VSCode"""
+
+    def __init__(self, builddir: str, commoncfg: str):
+        # where will our build dir be located
+        self.builddir = builddir
+        # meson configuration common to all configs
+        self.commonconfig = commoncfg
+        # meson build configurations
+        self.build_tasks: List[DPDKBuildTask] = []
+        # meson launch configurations
+        self.launch_tasks: List[DPDKLaunchTask] = []
+
+    def settings_to_json_dict(self) -> Dict[str, Any]:
+        """Generate settings.json"""
+        return {
+            "BUILDDIR": self.builddir,
+            "BUILDCONFIG": self.commonconfig,
+        }
+
+    def tasks_to_json_dict(self) -> Dict[str, Any]:
+        """Generate tasks.json"""
+        # generate outer layer
+        build_tasks: Dict[str, Any] = {
+            "version": "2.0.0",
+            "tasks": []
+        }
+        # generate inner layer
+        tasks = build_tasks["tasks"]
+        # add common tasks
+        tasks.append({
+            "label": "Remove builddir",
+            "type": "shell",
+            "command": "rm -rf ${config:BUILDDIR}",
+        })
+        tasks.append({
+            "label": "Build",
+            "detail": "Run build command",
+            "type": "shell",
+            "command": "ninja",
+            "options": {
+                "cwd": "${config:BUILDDIR}"
+            },
+            "problemMatcher": {
+                "base": "$gcc",
+                "fileLocation": ["relative", "${config:BUILDDIR}"]
+            },
+            "group": "build"
+        })
+        # now, add generated tasks
+        tasks.extend([task.to_json_dict() for task in self.build_tasks])
+
+        # we're done
+        return build_tasks
+
+    def launch_to_json_dict(self) -> Dict[str, Any]:
+        """Generate launch.json"""
+        return {
+            "version": "0.2.0",
+            "configurations": [task.to_json_dict()
+                               for task in self.launch_tasks]
+        }
+
+    def c_cpp_properties_to_json_dict(self) -> Dict[str, Any]:
+        """Generate c_cpp_properties.json"""
+        return {
+            "configurations": [
+                {
+                    "name": "Linux",
+                    "includePath": [
+                        "${config:BUILDDIR}/",
+                        "${workspaceFolder}/lib/eal/x86",
+                        "${workspaceFolder}/lib/eal/linux",
+                        "${workspaceFolder}/**"
+                    ],
+                    "compilerPath": "/usr/bin/gcc",
+                    "cStandard": "c99",
+                    "cppStandard": "c++17",
+                    "intelliSenseMode": "${default}",
+                    "compileCommands":
+                        "${config:BUILDDIR}/compile_commands.json"
+                }
+            ],
+            "version": 4
+        }
+
+
+class CmdlineCtx:
+    """POD class to set up command line parameters"""
+
+    def __init__(self):
+        self.use_ui = False
+        self.use_gdbsudo = False
+        self.force_overwrite = False
+        self.build_dir = ""
+        self.dpdk_dir = ""
+        self.gdb_path = ""
+
+        self.avail_configs: List[Tuple[str, str, str]] = []
+        self.avail_apps: List[str] = []
+        self.avail_examples: List[str] = []
+        self.avail_drivers: List[str] = []
+
+        self.enabled_configs_str = ""
+        self.enabled_apps_str = ""
+        self.enabled_examples_str = ""
+        self.enabled_drivers_str = ""
+        self.enabled_configs: List[Tuple[str, str, str]] = []
+        self.enabled_apps: List[str] = []
+        self.enabled_examples: List[str] = []
+        self.enabled_drivers: List[str] = []
+
+        self.driver_dep_map: Dict[str, List[str]] = {}
+        self.common_conf = ""
+
+        # this is only used by TUI to decide which windows to show
+        self.show_apps = False
+        self.show_examples = False
+        self.show_drivers = False
+        self.show_configs = False
+        self.show_common_config = False
+
+
+def _whiptail_msgbox(message: str) -> None:
+    """Display a message box."""
+    args = ["whiptail", "--msgbox", message, "10", "70"]
+    run(args, check=True)
+
+
+def _whiptail_checklist(title: str, prompt: str,
+                        options: List[Tuple[str, str]],
+                        checked: List[str]) -> List[str]:
+    """Display a checklist and get user input."""
+    # build whiptail checklist
+    checklist = [
+        (label, desc, "on" if label in checked else "off")
+        for label, desc in options
+    ]
+    # flatten the list
+    flat = [item for sublist in checklist for item in sublist]
+    # build whiptail arguments
+    args = [
+        "whiptail", "--separate-output", "--checklist",
+        "--title", title, prompt, "15", "80", "8"
+    ] + flat
+
+    result = run(args, stderr=PIPE, check=True)
+    # capture selected options
+    return result.stderr.decode().strip().split()
+
+
+def _whiptail_inputbox(title: str, prompt: str, default: str = "") -> str:
+    """Display an input box and get user input."""
+    args = [
+        "whiptail", "--inputbox",
+        "--title", title,
+        prompt, "10", "70", default
+    ]
+    result = run(args, stderr=PIPE, check=True)
+    return result.stderr.decode().strip()
+
+
+def _get_enabled_configurations(configs: List[Tuple[str, str, str]],
+                                checked: List[str]) \
+        -> List[Tuple[str, str, str]]:
+    """Ask user which build configurations they want."""
+    stop = False
+    while not stop:
+        opts = [
+            (c[0], c[1]) for c in configs
+        ]
+        # when interacting using UI, allow adding items
+        opts += [("add", "Add new option")]
+
+        # ask user to select options
+        checked = _whiptail_checklist(
+            "Build configurations", "Select build configurations to enable:",
+            opts, checked)
+
+        # if user selected "add", ask for custom meson configuration
+        if "add" in checked:
+            # remove "add" from checked because it's a special option
+            checked.remove("add")
+            while True:
+                custom_label = _whiptail_inputbox(
+                    "Configuration name",
+                    "Enter custom meson configuration label:")
+                custom_description = _whiptail_inputbox(
+                    "Configuration description",
+                    "Enter custom meson configuration description:")
+                custom_mesonstr = _whiptail_inputbox(
+                    "Configuration parameters",
+                    "Enter custom meson configuration string:")
+                # do we have meaningful label?
+                if not custom_label:
+                    _whiptail_msgbox("Configuration label cannot be empty!")
+                    continue
+                # don't allow "add", don't allow duplicates
+                existing = [task[0] for task in configs] + ["add"]
+                if custom_label in existing:
+                    _whiptail_msgbox(
+                        f"Label '{custom_label}' is not allowed!")
+                    continue
+                # we passed all checks, stop
+                break
+            new_task = (custom_label, custom_description, custom_mesonstr)
+            configs += [new_task]
+            # enable new configuration
+            checked += [custom_label]
+        else:
+            stop = True
+    # return our list of enabled configurations
+    return [
+        c for c in configs if c[0] in checked
+    ]
+
+
+def _select_from_list(title: str, prompt: str, items: List[str],
+                      enabled: List[str]) -> List[str]:
+    """Display a list of items, optionally some enabled by default."""
+    opts = [
+        (item, "") for item in items
+    ]
+    # ask user to select options
+    return _whiptail_checklist(title, prompt, opts, enabled)
+
+
+def _extract_var(path: str, var: str) -> Any:
+    """Extract a variable from a meson.build file."""
+    try:
+        # we don't want to deal with multiline variable assignments
+        # so just read entire file in one go
+        with open(path, 'r', encoding='utf-8') as file:
+            content = file.read()
+        parser = mparser.Parser(content, path)
+        ast = parser.parse()
+
+        for node in ast.lines:
+            # we're only interested in variable assignments
+            if not isinstance(node, mparser.AssignmentNode):
+                continue
+            # we're only interested in the variable we're looking for
+            if node.var_name.value != var:
+                continue
+            # we're expecting string or array
+            if isinstance(node.value, mparser.StringNode):
+                return node.value.value
+            if isinstance(node.value, mparser.ArrayNode):
+                return [item.value for item in node.value.args.arguments]
+    except (MesonException, FileNotFoundError):
+        return None
+    return None
+
+
+def _pick_ui_options(ctx: CmdlineCtx) -> None:
+    """Use whiptail dialogs to decide which setup options to show."""
+    opts = [
+        ("config", "Select build configurations to enable"),
+        ("common", "Customize meson flags"),
+        ("apps", "Select apps to enable"),
+        ("examples", "Select examples to enable"),
+        ("drivers", "Select drivers to enable"),
+    ]
+    # which options start out enabled depends on what the user has specified
+    # on the command line; examples are also enabled by default
+    checked_opts = ["examples"]
+    if ctx.enabled_configs_str:
+        checked_opts.append("config")
+    if ctx.enabled_apps_str:
+        checked_opts.append("apps")
+    if ctx.enabled_drivers_str:
+        checked_opts.append("drivers")
+    if ctx.common_conf:
+        checked_opts.append("common")
+
+    enabled = _whiptail_checklist(
+        "Options",
+        "Select options to configure (deselecting will pick defaults):",
+        opts, checked_opts)
+    for opt in enabled:
+        if opt == "config":
+            ctx.show_configs = True
+        elif opt == "common":
+            ctx.show_common_config = True
+        elif opt == "apps":
+            ctx.show_apps = True
+        elif opt == "examples":
+            ctx.show_examples = True
+        elif opt == "drivers":
+            ctx.show_drivers = True
+
+
+def _build_configs(ctx: CmdlineCtx) -> int:
+    """Build VSCode configuration files."""
+    # if builddir is a relative path, make it absolute
+    if not os.path.isabs(ctx.build_dir):
+        ctx.build_dir = os.path.realpath(ctx.build_dir)
+
+    # first, build our common meson param string
+    force_apps = False
+    force_drivers = False
+    common_param = ctx.common_conf
+
+    # if no apps are specified, all apps are built, so enable all of them. this
+    # isn't ideal because some of them may not actually get built due to missing
+    # dependencies, and so won't be runnable. however, the alternative is to not
+    # generate any apps in the launch configuration at all, which is worse than
+    # having some apps defined in the config but not available.
+    if ctx.enabled_apps_str:
+        common_param += f" -Denable_apps={ctx.enabled_apps_str}"
+    else:
+        # special case: user might have specified -Dtests or app flags in the
+        # common param. if so, assume the user knows what they're doing: don't
+        # display any warnings about enabling apps, don't enable them in the
+        # launch config, and leave it up to the user to handle.
+        avoid_opts = ['-Dtests=', '-Denable_apps=', '-Ddisable_apps=']
+        if not any(opt in common_param for opt in avoid_opts):
+            force_apps = True
+            ctx.enabled_apps = ctx.avail_apps
+
+    # examples don't get built unless the user asks
+    if ctx.enabled_examples_str:
+        common_param += f" -Dexamples={ctx.enabled_examples_str}"
+
+    # if no drivers are enabled, let the user know they will all be built anyway
+    if ctx.enabled_drivers_str:
+        common_param += f" -Denable_drivers={ctx.enabled_drivers_str}"
+    else:
+        avoid_opts = ['-Denable_drivers=', '-Ddisable_drivers=']
+        if not any(opt in common_param for opt in avoid_opts):
+            # special case: user might have specified driver flags in common
+            # param, so if the user did that, assume user knows what they're
+            # doing and don't display any warnings about enabling drivers.
+            force_drivers = True
+
+    if force_drivers or force_apps:
+        ena: List[str] = []
+        dis: List[str] = []
+        if force_apps:
+            ena += ["apps"]
+            dis += ["-Ddisable_apps=*"]
+        if force_drivers:
+            ena += ["drivers"]
+            dis += ["-Ddisable_drivers=*/*"]
+        ena_str = " or ".join(ena)
+        dis_str = " or ".join(dis)
+        msg = f"""\
+No {ena_str} are specified in configuration, so all of them will be built. \
+To disable {ena_str}, add {dis_str} to common meson flags."""
+
+        _whiptail_msgbox(msg)
+
+    # create build tasks
+    build_tasks = [DPDKBuildTask(n, d, p) for n, d, p in ctx.enabled_configs]
+
+    # create launch tasks
+    launch_tasks: List[DPDKLaunchTask] = []
+    for app in ctx.enabled_apps:
+        label = app
+        exe = os.path.join("app", f"dpdk-{app}")
+        launch_tasks.append(DPDKLaunchTask(label, exe, ctx.gdb_path))
+    for app in ctx.enabled_examples:
+        # examples may have nested source paths, but their binaries are flattened
+        label = os.path.basename(app)
+        exe = os.path.join("examples", f"dpdk-{label}")
+        launch_tasks.append(DPDKLaunchTask(label, exe, ctx.gdb_path))
+
+    # build our config
+    vscode_cfg = VSCodeConfig(ctx.build_dir, common_param)
+    vscode_cfg.build_tasks = build_tasks
+    vscode_cfg.launch_tasks = launch_tasks
+
+    # we're done! now, create .vscode directory
+    config_root = os.path.join(ctx.dpdk_dir, ".vscode")
+    os.makedirs(config_root, exist_ok=True)
+
+    # ...and create VSCode configuration
+    print("Creating VSCode configuration files...")
+    func_map = {
+        "settings.json": vscode_cfg.settings_to_json_dict,
+        "tasks.json": vscode_cfg.tasks_to_json_dict,
+        "launch.json": vscode_cfg.launch_to_json_dict,
+        "c_cpp_properties.json": vscode_cfg.c_cpp_properties_to_json_dict
+    }
+    # check if any of the files exist, and refuse to overwrite them unless
+    # --force was specified on the command line
+    for filename in func_map.keys():
+        fpath = os.path.join(config_root, filename)
+        if os.path.exists(fpath) and not ctx.force_overwrite:
+            print(f"Error: {filename} already exists! \
+                Use --force to overwrite.", file=stderr)
+            return 1
+    for filename, func in func_map.items():
+        with open(os.path.join(config_root, filename),
+                  "w", encoding="utf-8") as f:
+            print(f"Writing {filename}...")
+            f.write(json.dumps(func(), indent=4))
+    print("Done!")
+    return 0
+
+
+def _resolve_deps(ctx: CmdlineCtx) -> None:
+    """Resolve driver dependencies."""
+    # resolving dependencies is not straightforward, because DPDK build system
+    # treats wildcards differently from explicitly requested drivers: namely,
+    # it will treat wildcard-matched drivers on a best-effort basis, and will
+    # silently skip them if their dependencies aren't met. in contrast, when a
+    # driver is explicitly requested, unmet dependencies will cause an error.
+    #
+    # to resolve this, we need to be smarter about how we add dependencies.
+    # specifically, when we're dealing with wildcards, we will need to add
+    # wildcard dependencies, whereas when we're dealing with explicitly
+    # requested drivers, we will add explicit dependencies. for example,
+    # requesting net/ice will add common/iavf, but requesting net/*ce will
+    # add common/* as a dependency. We will build more than we would have
+    # otherwise, but that's an acceptable compromise to enable as many drivers
+    # as we can while avoiding build errors due to erroneous wildcard matches.
+    new_deps: List[str] = []
+    for driver in ctx.enabled_drivers_str.split(","):
+        # is this a wildcard?
+        if "*" in driver:
+            # find all drivers matching this wildcard, figure out which
+            # category (bus, common, etc.) of driver they request as
+            # dependency, and add a wildcarded match on that category
+            wc_matches = fnmatch.filter(ctx.avail_drivers, driver)
+            # find all of their dependencies
+            deps = [d
+                    for dl in wc_matches
+                    for d in ctx.driver_dep_map.get(dl, [])]
+            categories: List[str] = []
+            for d in deps:
+                category, _ = d.split("/")
+                categories += [category]
+            # find all categories we've added
+            categories = sorted(set(categories))
+            # add them as dependencies
+            new_deps += [f"{cat}/*" for cat in categories]
+            continue
+        # this is a driver, so add its dependencies explicitly
+        new_deps += ctx.driver_dep_map.get(driver, [])
+
+    # add them to enabled_drivers_str, this will be resolved later
+    if new_deps:
+        # this might add some dupes but we don't really care
+        ctx.enabled_drivers_str += f',{",".join(new_deps)}'
+
+
+def _update_ctx_from_ui(ctx: CmdlineCtx) -> int:
+    """Use whiptail dialogs to update context contents."""
+    try:
+        # update build dir
+        ctx.build_dir = _whiptail_inputbox(
+            "Build directory", "Enter build directory:", ctx.build_dir)
+
+        # first, decide what we are going to set up
+        _pick_ui_options(ctx)
+
+        # update configs
+        if ctx.show_configs:
+            ctx.enabled_configs = _get_enabled_configurations(
+                ctx.avail_configs, [c[0] for c in ctx.enabled_configs])
+
+        # update common config
+        if ctx.show_common_config:
+            ctx.common_conf = _whiptail_inputbox(
+                "Meson configuration",
+                "Enter common meson configuration flags (if any):",
+                ctx.common_conf)
+
+        # when user interaction is requested, we cannot really keep any values
+        # we got from arguments, because if user has changed something in those
+        # checklists, any wildcards will become invalid. however, we can do a
+        # heuristic: if user didn't *change* anything, we can infer that
+        # they're happy with the configuration they have picked, so we will
+        # only update meson param strings if the user has changed the
+        # configuration from TUI, or if we didn't have any to begin with
+
+        old_enabled_apps = ctx.enabled_apps.copy()
+        old_enabled_examples = ctx.enabled_examples.copy()
+        old_enabled_drivers = ctx.enabled_drivers.copy()
+        if ctx.show_apps:
+            ctx.enabled_apps = _select_from_list(
+                "Apps", "Select apps to enable:",
+                ctx.avail_apps, ctx.enabled_apps)
+        if ctx.show_examples:
+            ctx.enabled_examples = _select_from_list(
+                "Examples", "Select examples to enable:",
+                ctx.avail_examples, ctx.enabled_examples)
+        if ctx.show_drivers:
+            ctx.enabled_drivers = _select_from_list(
+                "Drivers", "Select drivers to enable:",
+                ctx.avail_drivers, ctx.enabled_drivers)
+
+        # did we change anything, assuming we even had anything at all?
+        if not ctx.enabled_apps_str or \
+                set(old_enabled_apps) != set(ctx.enabled_apps):
+            ctx.enabled_apps_str = ",".join(ctx.enabled_apps)
+        if not ctx.enabled_examples_str or \
+                set(old_enabled_examples) != set(ctx.enabled_examples):
+            ctx.enabled_examples_str = ",".join(ctx.enabled_examples)
+        if not ctx.enabled_drivers_str or \
+                set(old_enabled_drivers) != set(ctx.enabled_drivers):
+            ctx.enabled_drivers_str = ",".join(ctx.enabled_drivers)
+
+        return 0
+    except CalledProcessError:
+        # user probably pressed cancel, so bail out
+        return 1
+
+
+def _resolve_ctx(ctx: CmdlineCtx) -> int:
+    """Map command-line enabled options to available options."""
+    # for each enabled app, see if it's a wildcard and if so, do a wildcard
+    # match
+    for app in ctx.enabled_apps_str.split(","):
+        if "*" in app:
+            ctx.enabled_apps.extend(fnmatch.filter(ctx.avail_apps, app))
+        elif app in ctx.avail_apps:
+            ctx.enabled_apps.append(app)
+        elif app:
+            print(f"Error: Unknown app: {app}", file=stderr)
+            return 1
+
+    # do the same with examples
+    for example in ctx.enabled_examples_str.split(","):
+        if "*" in example:
+            ctx.enabled_examples.extend(
+                fnmatch.filter(ctx.avail_examples, example))
+        elif example in ctx.avail_examples:
+            ctx.enabled_examples.append(example)
+        elif example:
+            print(f"Error: Unknown example: {example}", file=stderr)
+            return 1
+
+    # do the same with drivers
+    for driver in ctx.enabled_drivers_str.split(","):
+        if "*" in driver:
+            ctx.enabled_drivers.extend(
+                fnmatch.filter(ctx.avail_drivers, driver))
+        elif driver in ctx.avail_drivers:
+            ctx.enabled_drivers.append(driver)
+        elif driver:
+            print(f"Error: Unknown driver: {driver}", file=stderr)
+            return 1
+
+    # due to wildcard, there may be dupes, so sort(set()) everything
+    ctx.enabled_apps = sorted(set(ctx.enabled_apps))
+    ctx.enabled_examples = sorted(set(ctx.enabled_examples))
+    ctx.enabled_drivers = sorted(set(ctx.enabled_drivers))
+
+    return 0
+
+
+def _discover_ctx(ctx: CmdlineCtx) -> int:
+    """Discover available apps/drivers etc. from DPDK."""
+    # find out where DPDK root is located
+    _self = os.path.realpath(__file__)
+    dpdk_root = os.path.realpath(os.path.join(os.path.dirname(_self), ".."))
+    ctx.dpdk_dir = dpdk_root
+
+    # find gdb path
+    if ctx.use_gdbsudo:
+        gdb = "gdbsudo"
+    else:
+        gdb = "gdb"
+    ctx.gdb_path = shutil.which(gdb)
+    if not ctx.gdb_path:
+        print(f"Error: Cannot find {gdb} in PATH!", file=stderr)
+        return 1
+
+    # we want to extract information from DPDK build files, but we don't have a
+    # good way of doing it without already having a meson build directory. for
+    # some things we can use meson AST parsing to extract this information, but
+    # for drivers extracting this information is not straightforward because
+    # they have complex build-time logic to determine which drivers need to be
+    # built (e.g. qat). so, we'll use meson AST for apps and examples, but for
+    # drivers we'll do it the old-fashioned way: by globbing directories.
+
+    apps: List[str] = []
+    examples: List[str] = []
+    drivers: List[str] = []
+
+    app_root = os.path.join(dpdk_root, "app")
+    examples_root = os.path.join(dpdk_root, "examples")
+    drivers_root = os.path.join(dpdk_root, "drivers")
+
+    apps = _extract_var(os.path.join(app_root, "meson.build"), "apps")
+    # special case for apps: test isn't added by default
+    apps.append("test")
+    # some apps will have overridden names using 'name' variable, extract it
+    for i, app in enumerate(apps[:]):
+        new_name = _extract_var(os.path.join(
+            app_root, app, "meson.build"), "name")
+        if new_name:
+            apps[i] = new_name
+
+    # examples don't have any special cases
+    examples = _extract_var(os.path.join(
+        examples_root, "meson.build"), "all_examples")
+
+    for root, _, _ in os.walk(drivers_root):
+        # some directories are drivers, while some are there simply to
+        # organize source in a certain way (e.g. base drivers), so we're
+        # going to cheat a little and only consider directories that have
+        # exactly two levels (e.g. net/ixgbe) and no others.
+        if root == drivers_root:
+            continue
+        rel_root = os.path.relpath(root, drivers_root)
+        if len(rel_root.split(os.sep)) != 2:
+            continue
+        category = os.path.dirname(rel_root)
+        # see if there's a name override
+        name = os.path.basename(rel_root)
+        new_name = _extract_var(os.path.join(root, "meson.build"), "name")
+        if new_name:
+            name = new_name
+        driver_name = os.path.join(category, name)
+        drivers.append(driver_name)
+
+        # some drivers depend on other drivers, so parse these dependencies
+        # using the "deps" variable
+        deps: Any = _extract_var(
+            os.path.join(root, "meson.build"), "deps")
+        if not deps:
+            continue
+        # occasionally, deps will be a string, so convert it to a list
+        if isinstance(deps, str):
+            deps = [deps]
+        for dep in deps:
+            # by convention, drivers are named as <category>_<name>, so we can
+            # infer that dependency is a driver if it has an underscore
+            if "_" not in dep:
+                continue
+            dep_driver = dep.replace("_", "/", 1)
+            ctx.driver_dep_map.setdefault(driver_name, []).append(dep_driver)
+
+    # sort all lists alphabetically
+    apps.sort()
+    examples.sort()
+    drivers.sort()
+
+    # save all of this information into our context
+    ctx.avail_apps = apps
+    ctx.avail_examples = examples
+    ctx.avail_drivers = drivers
+
+    return 0
+
+
+def _main() -> int:
+    """Parse command line arguments and direct program flow."""
+    # this is primarily a TUI script, but we also want to be able to automate
+    # everything, or set defaults to enhance user interaction and
+    # customization.
+
+    # valid parameters:
+    # --no-ui: run without any user interaction
+    # --no-gdbsudo: set up launch targets to use gdb directly
+    # --gdbsudo: set up launch targets to use gdbsudo
+    # --no-defaults: do not enable built-in build configurations
+    # --help: show help message
+    # -B/--build-dir: set build directory
+    # -b/--build-config: set default build configurations
+    #                    format: <label>,<description>,<meson-param>
+    #                    can be specified multiple times
+    # -c/--common-conf: additional configuration common to all build tasks
+    # -a/--apps: comma-separated list of enabled apps
+    # -e/--examples: comma-separated list of enabled examples
+    # -d/--drivers: comma-separated list of enabled drivers
+    # -f/--force: overwrite existing configuration
+    ap = argparse.ArgumentParser(
+        description="Generate VSCode configuration for DPDK")
+    ap.add_argument("--no-ui", action="store_true",
+                    help="Run without any user interaction")
+    gdbgrp = ap.add_mutually_exclusive_group()
+    gdbgrp.add_argument("--no-gdbsudo", action="store_true",
+                        help="Set up launch targets to use gdb directly")
+    gdbgrp.add_argument("--gdbsudo", action="store_true",
+                        help="Set up launch targets to use gdbsudo")
+    ap.add_argument("--no-defaults", action="store_true",
+                    help="Do not enable built-in build configurations")
+    ap.add_argument("-B", "--build-dir", default="build",
+                    help="Set build directory")
+    ap.add_argument("-b", "--build-config", action="append", default=[],
+                    help="Comma-separated build task configuration of format \
+                        [label,description,meson setup arguments]")
+    ap.add_argument("-c", "--common-conf",
+                    help="Additional configuration common to all build tasks",
+                    default="")
+    ap.add_argument("-a", "--apps", default="",
+                    help="Comma-separated list of enabled apps \
+                        (wildcards accepted)")
+    ap.add_argument("-e", "--examples", default="",
+                    help="Comma-separated list of enabled examples \
+                        (wildcards accepted)")
+    ap.add_argument("-d", "--drivers", default="",
+                    help="Comma-separated list of enabled drivers \
+                        (wildcards accepted)")
+    ap.add_argument("-f", "--force", action="store_true",
+                    help="Overwrite existing configuration")
+    ap.epilog = """\
+When the script is run in interactive mode, parameters will be \
+used to set up dialog defaults. Otherwise, they will be used \
+to create the configuration directly."""
+    args = ap.parse_args()
+
+    def_configs = [
+        ("debug", "Debug build", "--buildtype=debug"),
+        ("debugopt", "Debug build with optimizations",
+         "--buildtype=debugoptimized"),
+        ("release", "Release build with documentation",
+         "--buildtype=release -Denable_docs=true"),
+        ("asan", "Address Sanitizer build",
+         "--buildtype=debugoptimized -Db_sanitize=address -Db_lundef=false"),
+    ]
+    # parse build configs
+    arg_configs: List[Tuple[str, str, str]] = []
+    for c in args.build_config:
+        parts: List[str] = c.split(",")
+        if len(parts) != 3:
+            print(
+                f"Error: Invalid build configuration format: {c}", file=stderr)
+            return 1
+        arg_configs.append(tuple(parts))
+
+    # set up command line context. all wildcards will be passed directly to
+    # _main, and will be resolved later, when we have a list of things to
+    # enable/disable.
+    ctx = CmdlineCtx()
+    ctx.use_ui = not args.no_ui
+    ctx.force_overwrite = args.force
+    ctx.build_dir = args.build_dir
+    ctx.common_conf = args.common_conf
+    ctx.enabled_configs_str = args.build_config
+    ctx.enabled_apps_str = args.apps
+    ctx.enabled_examples_str = args.examples
+    ctx.enabled_drivers_str = args.drivers
+    ctx.enabled_configs = arg_configs
+    ctx.avail_configs = def_configs + ctx.enabled_configs
+
+    # if user has specified gdbsudo argument, use that
+    if args.gdbsudo or args.no_gdbsudo:
+        ctx.use_gdbsudo = args.gdbsudo or not args.no_gdbsudo
+    else:
+        # use gdb if we're root
+        ctx.use_gdbsudo = os.geteuid() != 0
+        print(f"Autodetected gdbsudo usage: {ctx.use_gdbsudo}")
+
+    if not args.no_defaults:
+        # enable default configs
+        ctx.enabled_configs.extend(def_configs)
+
+    # if UI interaction is requested, check if whiptail is installed
+    if ctx.use_ui and not shutil.which("whiptail"):
+        print("whiptail is not installed! Please install it and try again.",
+              file=stderr)
+        return 1
+
+    # check if gdbsudo is available
+    if ctx.use_gdbsudo and not shutil.which("gdbsudo"):
+        print("Generated configuration will use the "
+              "gdbsudo script to run applications.", file=stderr)
+        print("If you want to use gdb directly, "
+              "please run with the --no-gdbsudo argument.", file=stderr)
+        print("Otherwise, run the following snippet "
+              "in your terminal and try again:", file=stderr)
+        print("""\
+sudo tee <<EOF /usr/local/bin/gdbsudo &> /dev/null
+#!/usr/bin/bash
+sudo gdb "$@"
+EOF
+sudo chmod a+x /usr/local/bin/gdbsudo
+""", file=stderr)
+        return 1
+
+    if _discover_ctx(ctx):
+        return 1
+    if _resolve_ctx(ctx):
+        return 1
+    if ctx.use_ui and _update_ctx_from_ui(ctx):
+        return 1
+    _resolve_deps(ctx)
+    # resolve again because we might have added some dependencies
+    if _resolve_ctx(ctx):
+        return 1
+    return _build_configs(ctx)
+
+
+if __name__ == "__main__":
+    _exit(_main())
-- 
2.43.5


^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-29 13:05   ` [RFC PATCH v2 1/1] devtools: add vscode configuration generator Anatoly Burakov
@ 2024-07-29 13:14     ` Bruce Richardson
  2024-07-29 13:17       ` Burakov, Anatoly
  2024-07-29 14:30     ` Bruce Richardson
  1 sibling, 1 reply; 21+ messages in thread
From: Bruce Richardson @ 2024-07-29 13:14 UTC (permalink / raw)
  To: Anatoly Burakov; +Cc: dev, john.mcnamara

On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
> A lot of developers use Visual Studio Code as their primary IDE. This
> script generates a configuration file for VSCode that sets up basic build
> tasks, launch tasks, as well as C/C++ code analysis settings that will
> take into account compile_commands.json that is automatically generated
> by meson.
> 
> Files generated by script:
>  - .vscode/settings.json: stores variables needed by other files
>  - .vscode/tasks.json: defines build tasks
>  - .vscode/launch.json: defines launch tasks
>  - .vscode/c_cpp_properties.json: defines code analysis settings
> 
> The script uses a combination of globbing and meson file parsing to
> discover available apps, examples, and drivers, and generates a
> project-wide settings file, so that the user can later switch between
> debug/release/etc. configurations while keeping their desired apps,
> examples, and drivers, built by meson, and ensuring launch configurations
> still work correctly whatever the configuration selected.
> 
> This script uses whiptail as TUI, which is expected to be universally
> available as it is shipped by default on most major distributions.
> However, the script is also designed to be scriptable and can be run
> without user interaction, and have its configuration supplied from
> command-line arguments.
> 
> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
> ---
> 
Not sure where it would go - contributors guide probably - but I think this
script could do with some docs, especially a quick-setup example on how to
use. Having it in the docs will also make it more likely someone will use
this.

/Bruce

^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-29 13:14     ` Bruce Richardson
@ 2024-07-29 13:17       ` Burakov, Anatoly
  0 siblings, 0 replies; 21+ messages in thread
From: Burakov, Anatoly @ 2024-07-29 13:17 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev, john.mcnamara

On 7/29/2024 3:14 PM, Bruce Richardson wrote:
> On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
>> A lot of developers use Visual Studio Code as their primary IDE. This
>> script generates a configuration file for VSCode that sets up basic build
>> tasks, launch tasks, as well as C/C++ code analysis settings that will
>> take into account compile_commands.json that is automatically generated
>> by meson.
>>
>> Files generated by script:
>>   - .vscode/settings.json: stores variables needed by other files
>>   - .vscode/tasks.json: defines build tasks
>>   - .vscode/launch.json: defines launch tasks
>>   - .vscode/c_cpp_properties.json: defines code analysis settings
>>
>> The script uses a combination of globbing and meson file parsing to
>> discover available apps, examples, and drivers, and generates a
>> project-wide settings file, so that the user can later switch between
>> debug/release/etc. configurations while keeping their desired apps,
>> examples, and drivers, built by meson, and ensuring launch configurations
>> still work correctly whatever the configuration selected.
>>
>> This script uses whiptail as TUI, which is expected to be universally
>> available as it is shipped by default on most major distributions.
>> However, the script is also designed to be scriptable and can be run
>> without user interaction, and have its configuration supplied from
>> command-line arguments.
>>
>> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
>> ---
>>
> Not sure where it would go - contributors guide probably - but I think this
> script could do with some docs, especially a quick-setup example on how to
> use. Having it in the docs will also make it more likely someone will use
> this.
> 
> /Bruce

Yep, this is the next step :) I've left it until v1 to add docs.

-- 
Thanks,
Anatoly


^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-29 13:05   ` [RFC PATCH v2 1/1] devtools: add vscode configuration generator Anatoly Burakov
  2024-07-29 13:14     ` Bruce Richardson
@ 2024-07-29 14:30     ` Bruce Richardson
  2024-07-29 16:16       ` Burakov, Anatoly
  1 sibling, 1 reply; 21+ messages in thread
From: Bruce Richardson @ 2024-07-29 14:30 UTC (permalink / raw)
  To: Anatoly Burakov; +Cc: dev, john.mcnamara

On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
> A lot of developers use Visual Studio Code as their primary IDE. This
> script generates a configuration file for VSCode that sets up basic build
> tasks, launch tasks, as well as C/C++ code analysis settings that will
> take into account compile_commands.json that is automatically generated
> by meson.
> 
> Files generated by script:
>  - .vscode/settings.json: stores variables needed by other files
>  - .vscode/tasks.json: defines build tasks
>  - .vscode/launch.json: defines launch tasks
>  - .vscode/c_cpp_properties.json: defines code analysis settings
> 
> The script uses a combination of globbing and meson file parsing to
> discover available apps, examples, and drivers, and generates a
> project-wide settings file, so that the user can later switch between
> debug/release/etc. configurations while keeping their desired apps,
> examples, and drivers, built by meson, and ensuring launch configurations
> still work correctly whatever the configuration selected.
> 
> This script uses whiptail as TUI, which is expected to be universally
> available as it is shipped by default on most major distributions.
> However, the script is also designed to be scriptable and can be run
> without user interaction, and have its configuration supplied from
> command-line arguments.
> 
> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
> ---
> 
Just was trying this out, nice script, thanks.

Initial thoughts concerning the build directory:
- the script doesn't actually create the build directory, so there is no
  guarantee that the build directory created will have the same parameters
  as that specified in the script run. I'd suggest in the case where the
  user runs the script and specifies build settings, that the build
  directory is then configured using those settings.

- On the other hand, when the build directory already exists - I think the
  script should pull all settings from there, rather than prompting the
  user.

- I'm not sure I like the idea for reconfiguring of just removing the build
  directory and doing a whole meson setup command all over again. This
  seems excessive and also removes the possibility of the user having made
  changes in config to the build dir without re-running the whole config
  script. For example, having tweaked the LTO setting, or the
  instruction_set_isa_setting. Rather than deleting it and running meson
  setup, it would be better to use "meson configure" to adjust the one
  required setting and let ninja figure out how to propagate that change.
  That saves the script from having to track all meson parameters itself.

- Finally, and semi-related, this script assumes that the user does
  everything in a single build directory. Just something to consider, but
  my own workflow till now has tended to keep multiple build directories
  around, generally a "build" directory, which is either release or
  debugoptimized type, and a separate "build-debug" directory + occasionally
  others for build testing. When doing incremental builds, the time taken to
  do two builds following a change is a lot less noticeable than the time taken
  for periodic switches of a single build directory between debug and release
  mode.

Final thoughts on usability:

- Please don't write gdbsudo to /usr/local/bin without asking the user
  first. Instead I think it should default to $HOME/.local/bin, but with a
  prompt for the user to specify a path.

- While I realise your primary concern here is an interactive script, I'd
  tend towards requiring a cmdline arg to run in interactive mode and
  instead printing the help usage when run without parameters. Just a
  personal preference on my part though.

/Bruce

^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-29 14:30     ` Bruce Richardson
@ 2024-07-29 16:16       ` Burakov, Anatoly
  2024-07-29 16:41         ` Bruce Richardson
  0 siblings, 1 reply; 21+ messages in thread
From: Burakov, Anatoly @ 2024-07-29 16:16 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev, john.mcnamara

On 7/29/2024 4:30 PM, Bruce Richardson wrote:
> On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
>> A lot of developers use Visual Studio Code as their primary IDE. This
>> script generates a configuration file for VSCode that sets up basic build
>> tasks, launch tasks, as well as C/C++ code analysis settings that will
>> take into account compile_commands.json that is automatically generated
>> by meson.
>>
>> Files generated by script:
>>   - .vscode/settings.json: stores variables needed by other files
>>   - .vscode/tasks.json: defines build tasks
>>   - .vscode/launch.json: defines launch tasks
>>   - .vscode/c_cpp_properties.json: defines code analysis settings
>>
>> The script uses a combination of globbing and meson file parsing to
>> discover available apps, examples, and drivers, and generates a
>> project-wide settings file, so that the user can later switch between
>> debug/release/etc. configurations while keeping their desired apps,
>> examples, and drivers, built by meson, and ensuring launch configurations
>> still work correctly whatever the configuration selected.
>>
>> This script uses whiptail as TUI, which is expected to be universally
>> available as it is shipped by default on most major distributions.
>> However, the script is also designed to be scriptable and can be run
>> without user interaction, and have its configuration supplied from
>> command-line arguments.
>>
>> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
>> ---
>>
> Just was trying this out, nice script, thanks.

Thanks for the feedback! Comments below.

> 
> Initial thoughts concerning the build directory:
> - the script doesn't actually create the build directory, so there is no
>    guarantee that the build directory created will have the same parameters
>    as that specified in the script run. I'd suggest in the case where the
>    user runs the script and specifies build settings, that the build
>    directory is then configured using those settings.

I'm not sure I follow.

The script creates a command for VSCode to create a build directory 
using configuration the user has supplied at script's run time. The 
directory is not created by the script, that is the job of meson build 
system. This script is merely codifying commands for meson to do that, 
with the expectation that the user is familiar with VSCode workflow and 
will go straight to build commands anyway, and will pick one of them. 
Are you suggesting running `meson setup` right after?

Assuming we do that, it would actually then be possible to adjust launch 
tasks to only include *actual* built apps/examples (as well as infer 
things like platform, compiler etc.), as that's one weakness of my 
current "flying blind" approach, so I wouldn't be opposed to adding an 
extra step here, just want to make sure I understand what you're saying 
correctly.
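
For reference, the kind of build task the script codifies boils down to
roughly the following (a sketch in Python with illustrative field values
and a simplified BUILDDIR indirection - not the script's literal output):

# Rough sketch of a generated tasks.json entry; values are illustrative.
import json

def make_build_task(label: str, meson_args: str) -> dict:
    return {
        "label": f"Configure and build ({label})",
        "type": "shell",
        # wipe-and-setup workflow: meson creates the directory, not the script
        "command": f"rm -rf ${{config:BUILDDIR}} && "
                   f"meson setup ${{config:BUILDDIR}} {meson_args} && "
                   f"ninja -C ${{config:BUILDDIR}}",
        "problemMatcher": [],
    }

print(json.dumps(make_build_task("debug", "--buildtype=debug"), indent=4))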

> 
> - On the other hand, when the build directory already exists - I think the
>    script should pull all settings from there, rather than prompting the
>    user.
> 

That can be done, however, my own workflow has been that I do not ever 
keep build directories inside my source directory, so it would not be 
possible to pick up directories anywhere but the source directory.

I also think from the point of view of the script it would be easier to 
start from known quantities rather than guess what user was trying to do 
from current configuration, but I guess a few common-sense heuristics 
should suffice for most use cases, such as e.g. inferring debug builds.

> - I'm not sure I like the idea for reconfiguring of just removing the build
>    directory and doing a whole meson setup command all over again. This
>    seems excessive and also removes the possibility of the user having made
>    changes in config to the build dir without re-running the whole config
>    script. For example, having tweaked the LTO setting, or the
>    instruction_set_isa_setting. Rather than deleting it and running meson
>    setup, it would be better to use "meson configure" to adjust the one
>    required setting and let ninja figure out how to propagate that change.
>    That saves the script from having to track all meson parameters itself.

Last I checked, meson doesn't have a command that would "setup or 
configure existing" a directory, it's either "set up new one" or 
"configure existing one". I guess we could set up a fallback of 
"configure || setup".

> 
> - Finally, and semi-related, this script assumes that the user does
>    everything in a single build directory. Just something to consider, but
>    my own workflow till now has tended to keep multiple build directories
>    around, generally a "build" directory, which is either release or
>    debugoptimized type, and a separate "build-debug" directory + occasionally
>    others for build testing. When doing incremental builds, the time taken to
>    do two builds following a change is a lot less noticeable than the time taken
>    for periodic switches of a single build directory between debug and release
>    mode.

The problem with that approach is the launch tasks, because a launch 
task can only ever point to one executable, so if you have multiple 
build directories, you'll have to have multiple launch tasks per 
app/example. I guess we can either tag them (e.g. "Launch dpdk-testpmd 
[debug]", "Launch dpdk-testpmd [asan]" etc.), or use some kind of 
indirection to "select active build configuration" (e.g. have one launch 
task but overwrite ${config:BUILDDIR} after request for configuration, 
so that launch tasks would pick up actual executable path at run time 
from settings). I would prefer the latter to be honest, as it's much 
easier to drop a script into ./vscode and run it together with 
"configure" command to switch between different build/launch 
configurations. What do you think?
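
To illustrate the latter option, a single launch entry could look roughly
like this (sketch only; it assumes the C/C++ extension's "cppdbg" debugger
type and a BUILDDIR key in settings.json):

# Sketch of one launch.json entry resolving the executable through
# ${config:BUILDDIR}; field values are illustrative.
import json

def make_launch_entry(app: str, gdb_path: str) -> dict:
    return {
        "name": f"Run dpdk-{app}",
        "type": "cppdbg",
        "request": "launch",
        # picked up at run time from whatever BUILDDIR currently points to
        "program": f"${{config:BUILDDIR}}/app/dpdk-{app}",
        "cwd": "${workspaceFolder}",
        "MIMode": "gdb",
        "miDebuggerPath": gdb_path,
    }

print(json.dumps(make_launch_entry("testpmd", "/usr/local/bin/gdbsudo"), indent=4))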

> 
> Final thoughts on usability:
> 
> - Please don't write gdbsudo to /usr/local/bin without asking the user
>    first. Instead I think it should default to $HOME/.local/bin, but with a
>    prompt for the user to specify a path.

It's not creating anything, it's just printing out a snippet, which, if 
run by user, would do that - the implication is obviously that the user 
may correct it if necessary. The script actually picks up path to 
`gdbsudo` from `which` command, so if the user puts their command to 
$HOME/.local/bin or something, it would get picked up if it's in PATH. I 
see your point about maybe suggesting using a home directory path 
instead of a system wide path, I can change that.

> 
> - While I realise your primary concern here is an interactive script, I'd
>    tend towards requiring a cmdline arg to run in interactive mode and
>    instead printing the help usage when run without parameters. Just a
>    personal preference on my part though.

I found it to be much faster to pick my targets, apps etc. using a 
couple of interactive windows than to type out parameters I probably 
don't even remember ahead of time (especially build configurations!), 
and I believe it's more newbie-friendly that way, as I imagine very few 
people will want to learn arguments for yet-another-script just to start 
using VSCode. It would be my personal preference to leave it as 
default-to-TUI, but maybe recognizing a widely used `-i` parameter would 
be a good compromise for instant familiarity.

> 
> /Bruce

-- 
Thanks,
Anatoly


^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-29 16:16       ` Burakov, Anatoly
@ 2024-07-29 16:41         ` Bruce Richardson
  2024-07-30  9:21           ` Burakov, Anatoly
  0 siblings, 1 reply; 21+ messages in thread
From: Bruce Richardson @ 2024-07-29 16:41 UTC (permalink / raw)
  To: Burakov, Anatoly; +Cc: dev, john.mcnamara

On Mon, Jul 29, 2024 at 06:16:48PM +0200, Burakov, Anatoly wrote:
> On 7/29/2024 4:30 PM, Bruce Richardson wrote:
> > On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
> > > A lot of developers use Visual Studio Code as their primary IDE. This
> > > script generates a configuration file for VSCode that sets up basic build
> > > tasks, launch tasks, as well as C/C++ code analysis settings that will
> > > take into account compile_commands.json that is automatically generated
> > > by meson.
> > > 
> > > Files generated by script:
> > >   - .vscode/settings.json: stores variables needed by other files
> > >   - .vscode/tasks.json: defines build tasks
> > >   - .vscode/launch.json: defines launch tasks
> > >   - .vscode/c_cpp_properties.json: defines code analysis settings
> > > 
> > > The script uses a combination of globbing and meson file parsing to
> > > discover available apps, examples, and drivers, and generates a
> > > project-wide settings file, so that the user can later switch between
> > > debug/release/etc. configurations while keeping their desired apps,
> > > examples, and drivers, built by meson, and ensuring launch configurations
> > > still work correctly whatever the configuration selected.
> > > 
> > > This script uses whiptail as TUI, which is expected to be universally
> > > available as it is shipped by default on most major distributions.
> > > However, the script is also designed to be scriptable and can be run
> > > without user interaction, and have its configuration supplied from
> > > command-line arguments.
> > > 
> > > Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
> > > ---
> > > 
> > Just was trying this out, nice script, thanks.
> 
> Thanks for the feedback! Comments below.
> 
> > 
> > Initial thoughts concerning the build directory:
> > - the script doesn't actually create the build directory, so there is no
> >    guarantee that the build directory created will have the same parameters
> >    as that specified in the script run. I'd suggest in the case where the
> >    user runs the script and specifies build settings, that the build
> >    directory is then configured using those settings.
> 
> I'm not sure I follow.
> 
> The script creates a command for VSCode to create a build directory using
> configuration the user has supplied at script's run time. The directory is
> not created by the script, that is the job of meson build system. This
> script is merely codifying commands for meson to do that, with the
> expectation that the user is familiar with VSCode workflow and will go
> straight to build commands anyway, and will pick one of them. Are you
> suggesting running `meson setup` right after?
> 

Yes, that is what I was thinking, based on the assumption that running
"meson setup" should be a once-only task. I suppose overall it's a
different workflow to what you have - where you run meson setup repeatedly
each time you change a build type. My thinking for the approach to take
here is that your script firstly asks for a build directory, and then:
* if build-dir exists, pull settings you need from there, such as build
  type and the apps being built.
* if not existing, ask for config settings as you do now, and then run
  meson setup to create the build dir.

Thereafter, the source for all build settings is not in vscode, but in
meson, and you just use "meson configure" from vscode to tweak whatever
needs tweaking, without affecting any other settings. Since you don't
affect any other settings there is no need to record what they should be.
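
As a rough illustration of what I mean by pulling settings from an existing
build dir (a sketch only - it assumes "meson introspect --buildoptions" is
available and picks out just a few of DPDK's option names):

# Sketch: read current options from an existing build directory via meson
# introspect, instead of prompting the user for them again.
import json
import subprocess

def read_build_options(build_dir: str) -> dict:
    out = subprocess.run(
        ["meson", "introspect", "--buildoptions", build_dir],
        capture_output=True, text=True, check=True).stdout
    opts = {o["name"]: o["value"] for o in json.loads(out)}
    return {k: opts.get(k)
            for k in ("buildtype", "enable_apps", "enable_drivers", "examples")}

# e.g. read_build_options("build") -> {'buildtype': 'debugoptimized', ...}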


> Assuming we do that, it would actually then be possible to adjust launch
> tasks to only include *actual* built apps/examples (as well as infer things
> like platform, compiler etc.), as that's one weakness of my current "flying
> blind" approach, so I wouldn't be opposed to adding an extra step here, just
> want to make sure I understand what you're saying correctly.
> 
> > 
> > - On the other hand, when the build directory already exists - I think the
> >    script should pull all settings from there, rather than prompting the
> >    user.
> > 
> 
> That can be done, however, my own workflow has been that I do not ever keep
> build directories inside my source directory, so it would not be possible to
> pick up directories anywhere but the source directory.
> 

Why is that, and how does it work now, for e.g. getting the
compile_commands.json file from your build folder?

> I also think from the point of view of the script it would be easier to
> start from known quantities rather than guess what user was trying to do
> from current configuration, but I guess a few common-sense heuristics should
> suffice for most use cases, such as e.g. inferring debug builds.
> 

What you need depends on whether you want to keep running "meson setup" -
which means you need to track all settings - or want to use "meson
configure" where you don't really need to track much at all.

> > - I'm not sure I like the idea for reconfiguring of just removing the build
> >    directory and doing a whole meson setup command all over again. This
> >    seems excessive and also removes the possibility of the user having made
> >    changes in config to the build dir without re-running the whole config
> >    script. For example, having tweaked the LTO setting, or the
> >    instruction_set_isa_setting. Rather than deleting it and running meson
> >    setup, it would be better to use "meson configure" to adjust the one
> >    required setting and let ninja figure out how to propagate that change.
> >    That saves the script from having to track all meson parameters itself.
> 
> Last I checked, meson doesn't have a command that would "setup or configure
> existing" a directory, it's either "set up new one" or "configure existing
> one". I guess we could set up a fallback of "configure || setup".
> 

This goes back to the whole "create build directory after running the
script" option. If the script creates the build dir, the vscode commands
never need to use meson setup, only ever meson configure.

> > 
> > - Finally, and semi-related, this script assumes that the user does
> >    everything in a single build directory. Just something to consider, but
> >    my own workflow till now has tended to keep multiple build directories
> >    around, generally a "build" directory, which is either release or
> >    debugoptimized type, and a separate "build-debug" directory + occasionally
> >    others for build testing. When doing incremental builds, the time taken to
> >    do two builds following a change is a lot less noticeable than the time taken
> >    for periodic switches of a single build directory between debug and release
> >    mode.
> 
> The problem with that approach is the launch tasks, because a launch task
> can only ever point to one executable, so if you have multiple build
> directories, you'll have to have multiple launch tasks per app/example. I
> guess we can either tag them (e.g. "Launch dpdk-testpmd [debug]", "Launch
> dpdk-testpmd [asan]" etc.), or use some kind of indirection to "select
> active build configuration" (e.g. have one launch task but overwrite
> ${config:BUILDDIR} after request for configuration, so that launch tasks
> would pick up actual executable path at run time from settings). I would
> prefer the latter to be honest, as it's much easier to drop a script into
> ./vscode and run it together with "configure" command to switch between
> different build/launch configurations. What do you think?
> 

I think I'd prefer the former actually - to have explicit tasks always
listed for debug and release builds.
Not a big deal for me either way, I'll just hack in the extra tasks as I
need them, so it's a low-priority support item for me.

> > 
> > Final thoughts on usability:
> > 
> > - Please don't write gdbsudo to /usr/local/bin without asking the user
> >    first. Instead I think it should default to $HOME/.local/bin, but with a
> >    prompt for the user to specify a path.
> 
> It's not creating anything, it's just printing out a snippet, which, if run
> by user, would do that - the implication is obviously that the user may
> correct it if necessary. The script actually picks up path to `gdbsudo` from
> `which` command, so if the user puts their command to $HOME/.local/bin or
> something, it would get picked up if it's in PATH. I see your point about
> maybe suggesting using a home directory path instead of a system wide path,
> I can change that.

Yep, thanks, and thanks for the explanation.
BTW: even if the user is running as non-root user, they don't always need
to use sudo (I set up my system to not need it for running DPDK). I see
there is a cmdline option for "no-gdbsudo" but I think you should make that
accessible via TUI also if non-root.

And for the cmdline parameters, how about shortening them to "--sudo" and
"--nosudo"? For debug vs release builds you may want to have the latter run
without gdb at all, just with or without sudo.

> 
> > 
> > - While I realise your primary concern here is an interactive script, I'd
> >    tend towards requiring a cmdline arg to run in interactive mode and
> >    instead printing the help usage when run without parameters. Just a
> >    personal preference on my part though.
> 
> I found it to be much faster to pick my targets, apps etc. using a couple of
> interactive windows than to type out parameters I probably don't even
> remember ahead of time (especially build configurations!), and I believe
> it's more newbie-friendly that way, as I imagine very few people will want
> to learn arguments for yet-another-script just to start using VSCode. It
> would be my personal preference to leave it as default-to-TUI, but maybe
> recognizing a widely used `-i` parameter would be a good compromise for
> instant familiarity.
> 

Ok. There's always a -h option for me to get the cmdline parameters.

I also think if the script is ok with working off an existing build
directory (or directories!), and only prompting for that, it would remove
for me the real necessity of asking for a more cmdline-friendly version.


^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-29 16:41         ` Bruce Richardson
@ 2024-07-30  9:21           ` Burakov, Anatoly
  2024-07-30 10:31             ` Bruce Richardson
  0 siblings, 1 reply; 21+ messages in thread
From: Burakov, Anatoly @ 2024-07-30  9:21 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev, john.mcnamara

On 7/29/2024 6:41 PM, Bruce Richardson wrote:
> On Mon, Jul 29, 2024 at 06:16:48PM +0200, Burakov, Anatoly wrote:
>> On 7/29/2024 4:30 PM, Bruce Richardson wrote:
>>> On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
>>>> A lot of developers use Visual Studio Code as their primary IDE. This
>>>> script generates a configuration file for VSCode that sets up basic build
>>>> tasks, launch tasks, as well as C/C++ code analysis settings that will
>>>> take into account compile_commands.json that is automatically generated
>>>> by meson.
>>>>
>>>> Files generated by script:
>>>>    - .vscode/settings.json: stores variables needed by other files
>>>>    - .vscode/tasks.json: defines build tasks
>>>>    - .vscode/launch.json: defines launch tasks
>>>>    - .vscode/c_cpp_properties.json: defines code analysis settings
>>>>
>>>> The script uses a combination of globbing and meson file parsing to
>>>> discover available apps, examples, and drivers, and generates a
>>>> project-wide settings file, so that the user can later switch between
>>>> debug/release/etc. configurations while keeping their desired apps,
>>>> examples, and drivers, built by meson, and ensuring launch configurations
>>>> still work correctly whatever the configuration selected.
>>>>
>>>> This script uses whiptail as TUI, which is expected to be universally
>>>> available as it is shipped by default on most major distributions.
>>>> However, the script is also designed to be scriptable and can be run
>>>> without user interaction, and have its configuration supplied from
>>>> command-line arguments.
>>>>
>>>> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
>>>> ---
>>>>
>>> Just was trying this out, nice script, thanks.
>>
>> Thanks for the feedback! Comments below.
>>
>>>
>>> Initial thoughts concerning the build directory:
>>> - the script doesn't actually create the build directory, so there is no
>>>     guarantee that the build directory created will have the same parameters
>>>     as that specified in the script run. I'd suggest in the case where the
>>>     user runs the script and specifies build settings, that the build
>>>     directory is then configured using those settings.
>>
>> I'm not sure I follow.
>>
>> The script creates a command for VSCode to create a build directory using
>> configuration the user has supplied at script's run time. The directory is
>> not created by the script, that is the job of meson build system. This
>> script is merely codifying commands for meson to do that, with the
>> expectation that the user is familiar with VSCode workflow and will go
>> straight to build commands anyway, and will pick one of them. Are you
>> suggesting running `meson setup` right after?
>>
> 
> Yes, that is what I was thinking, based on the assumption that running
> "meson setup" should be a once-only task. I suppose overall it's a
> different workflow to what you have - where you run meson setup repeatedly
> each time you change a build type. My thinking for the approach to take
> here is that your script firstly asks for a build directory, and then:
> * if build-dir exists, pull settings you need from there, such as build
>    type and the apps being built.
> * if not existing, ask for config settings as you do now, and then run
>    meson setup to create the build dir.

I guess the disconnect comes from the fact that I treat meson build 
directories as transient and disposable and never hesitate to wipe them 
and start over, whereas you seem to favor persistence. Since 99% of the 
time I'm using heavily reduced builds anyway (e.g. one app, one driver), 
repeatedly doing meson setup doesn't really hinder me in any way, but I 
suppose it would if I had more meaty builds.

I think we also have a slightly different view of what this script 
should be - I envisioned a "one-stop-shop" for "open freshly cloned DPDK 
directory, run one script and off you go" (which stems from the fact 
that I heavily rely on Git Worktrees [1] in my workflow, so having an 
untouched source directory without any configuration in it is something 
I am constantly faced with), while you seem to favor picking up existing 
meson build and building a VSCode configuration around it.

I can see the point in this, and I guess I actually unwittingly rolled two
scripts into one - a TUI meson frontend, and a VSCode configuration 
generator. Perhaps it should very well be two scripts, not one? Because 
honestly, having something like what I built for TUI (the meson 
configuration frontend) is something I missed a lot and something I 
always wished our long-gone `setup.sh` script had, but didn't, and now 
with meson it's much simpler to do but we still don't have anything like 
that. Maybe this is our opportunity to provide a quicker "quick start" 
script, one that I could run and configure meson build without having to 
type anything. WDYT?

> 
> Thereafter, the source for all build settings is not in vscode, but in
> meson, and you just use "meson configure" from vscode to tweak whatever
> needs tweaking, without affecting any other settings. Since you don't
> affect any other settings there is no need to record what they should be.

Why would I be using meson configure from vscode then? I mean, the whole 
notion of having tasks in VSCode is so that I define a few common 
configurations and never touch them until I'm working on something else. 
If I have to type in configure commands anyway (because I would have to, 
to adjust configuration in meson), I might as well do so using terminal, 
and avoid dealing with meson from VSCode entirely?

(technically, there's a way to read arguments from a popup window - I 
suppose we could have this step recorded as a VSCode task)

> 
> 
>> Assuming we do that, it would actually then be possible to adjust launch
>> tasks to only include *actual* built apps/examples (as well as infer things
>> like platform, compiler etc.), as that's one weakness of my current "flying
>> blind" approach, so I wouldn't be opposed to adding an extra step here, just
>> want to make sure I understand what you're saying correctly.
>>
>>>
>>> - On the other hand, when the build directory already exists - I think the
>>>     script should pull all settings from there, rather than prompting the
>>>     user.
>>>
>>
>> That can be done, however, my own workflow has been that I do not ever keep
>> build directories inside my source directory, so it would not be possible to
>> pick up directories anywhere but the source directory.
>>
> 
> Why is that, and how does it work now, for e.g. getting the
> compile_commands.json file from your build folder?

Right now, at configuration time I store the build directory in 
settings.json, and all of the other tasks (build, launch, C++ analysis) 
pick it up via a ${config:...} variable. This is why I suggested having 
a notion of "active configuration" - if we just rewrite that variable, 
both ninja and launch tasks can be switched to a different build 
directory without actually having to create new tasks. More on that 
below, as it raises a few questions.
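
To make the indirection concrete, a rough sketch (shown as the Python
dicts that end up serialized into the respective JSON files; the
BUILDDIR variable name and the testpmd path are illustrative):

    settings = {                                  # .vscode/settings.json
        "BUILDDIR": "/path/to/build",             # the "active" build dir
    }
    build_task = {                                # .vscode/tasks.json
        "label": "Compile",
        "type": "shell",
        "command": "ninja",
        "options": {"cwd": "${config:BUILDDIR}"},
    }
    launch_config = {                             # .vscode/launch.json
        "name": "Launch dpdk-testpmd",
        "type": "cppdbg",
        "request": "launch",
        "program": "${config:BUILDDIR}/app/dpdk-testpmd",
        "cwd": "${workspaceFolder}",
    }
    # switching the "active configuration" then amounts to rewriting
    # BUILDDIR in settings.json - build and launch tasks pick up the new
    # path the next time they run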

As for why, it's a personal preference - it's annoying to have the 
build/ directory in my file view (it's possible to hide it, I guess I 
could automatically add a setting to do that in settings.json), and it 
interferes with e.g. source-tree-wide searches and stuff like that. This 
isn't a hard requirement though, I can switch to in-tree builds and 
automatic directory hiding if it means more automation :)
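
(For reference, hiding an in-tree build directory would be a one-entry
addition to settings.json - a sketch, with the directory name just an
example:

    settings = {
        "files.exclude": {
            "build/": True,   # serialized as `true` in .vscode/settings.json
        },
    }

so it's easy enough to emit automatically.)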

> 
>> I also think from the point of view of the script it would be easier to
>> start from known quantities rather than guess what user was trying to do
>> from current configuration, but I guess a few common-sense heuristics should
>> suffice for most use cases, such as e.g. inferring debug builds.
>>
> 
> What you need depends on whether you want to keep running "meson setup" -
> which means you need to track all settings - or want to use "meson
> configure" where you don't really need to track much at all.

I guess I was aiming for the former, but doing only the latter makes 
sense if we separate setup from VSCode config generation (which seems 
to be where we're heading).

> 
>>> - I'm not sure I like the idea for reconfiguring of just removing the build
>>>     directory and doing a whole meson setup command all over again. This
>>>     seems excessive and also removes the possibility of the user having made
>>>     changes in config to the build dir without re-running the whole config
>>>     script. For example, having tweaked the LTO setting, or the
>>>     instruction_set_isa_setting. Rather than deleting it and running meson
>>>     setup, it would be better to use "meson configure" to adjust the one
>>>     required setting and let ninja figure out how to propagate that change.
>>>     That saves the script from having to track all meson parameters itself.
>>
>> Last I checked, meson doesn't have a command that would "setup or configure
>> existing" a directory, it's either "set up new one" or "configure existing
>> one". I guess we could set up a fallback of "configure || setup".
>>
> 
> This goes back to the whole "create build directory after running the
> script" option. If the script creates the build dir, the vscode commands
> never need to use meson setup, only ever meson configure.

Sure, and like I suggested above, it doesn't even have to be *this* 
script, it can be another, a Python-based TUI reimplementation of 
setup.sh :)

> 
>>>
>>> - Finally, and semi-related, this script assumes that the user does
>>>     everything in a single build directory. Just something to consider, but
>>>     my own workflow till now has tended to keep multiple build directories
>>>     around, generally a "build" directory, which is either release or
>>>     debugoptimized type, and a separate "build-debug" directory + occasionally
>>>     others for build testing. When doing incremental builds, the time taken to
>>>     do two builds following a change is a lot less noticable than the time taken
>>>     for periodic switches of a single build directory between debug and release
>>>     mode.
>>
>> The problem with that approach is the launch tasks, because a launch task
>> can only ever point to one executable, so if you have multiple build
>> directories, you'll have to have multiple launch tasks per app/example. I
>> guess we can either tag them (e.g. "Launch dpdk-testpmd [debug]", "Launch
>> dpdk-testpmd [asan]" etc.), or use some kind of indirection to "select
>> active build configuration" (e.g. have one launch task but overwrite
>> ${config:BUILDDIR} after request for configuration, so that launch tasks
>> would pick up actual executable path at run time from settings). I would
>> prefer the latter to be honest, as it's much easier to drop a script into
>> ./vscode and run it together with "configure" command to switch between
>> different build/launch configurations. What do you think?
>>
> 
> I think I'd prefer the former actually - to have explicit tasks always
> listed for debug and release builds.
> Not a big deal for me either way, I'll just hack in the extra tasks as I
> need them, so it's a low-priority support item for me.

There's another issue though: code analysis. If you have multiple build 
directories, your C++ analysis settings (the 
.vscode/c_cpp_properties.json file) can only ever use one specific 
compile_commands.json from a specific build directory. I think if we 
were to support having multiple build dirs, we would *have to* implement 
something like "switch active configuration" thing anyway.

Whether we add tagged launch tasks or not, there's honestly a problem to 
be solved here: on the one hand we want them to be generated 
automatically (ideally after the configure step), and on the other we 
don't want to interfere with any custom configuration the user has 
already added (such as command-line arguments to existing launch tasks). 
We'll probably have to parse and update the existing config files, 
rather than regenerate them each time. Python's JSON parser also doesn't 
support comments, so any comments that were added to the configurations 
would either be lost, or we'd have to find a way to reinsert them 
post-generation.
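
If we do accept losing the comments, the loader side is simple enough -
a naive sketch (assumed helper, not part of the current patch; it only
strips whole-line // comments and would need to be more robust in
practice):

    import json
    import re

    def load_vscode_json(path):
        """Load a VSCode JSON file, dropping //-style line comments."""
        with open(path, encoding='utf-8') as f:
            text = f.read()
        # naive: whole-line comments only, and they are lost on re-write
        text = re.sub(r'^\s*//.*$', '', text, flags=re.MULTILINE)
        return json.loads(text)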

> 
>>>
>>> Final thoughts on usability:
>>>
>>> - Please don't write gdbsudo to /usr/local/bin without asking the user
>>>     first. Instead I think it should default to $HOME/.local/bin, but with a
>>>     prompt for the user to specify a path.
>>
>> It's not creating anything, it's just printing out a snippet, which, if run
>> by user, would do that - the implication is obviously that the user may
>> correct it if necessary. The script actually picks up path to `gdbsudo` from
>> `which` command, so if the user puts their command to $HOME/.local/bin or
>> something, it would get picked up if it's in PATH. I see your point about
>> maybe suggesting using a home directory path instead of a system wide path,
>> I can change that.
> 
> Yep, thanks, and thanks for the explanation.
> BTW: even if the user is running as non-root user, they don't always need
> to use sudo (I set up my system to not need it for running DPDK). I see
> there is a cmdline option for "no-gdbsudo" but I think you should make that
> accessible via TUI also if non-root.

Sure, all of that can be done.

> 
> And for the cmdline parameters, how about shortening them to "--sudo" and
> "--nosudo". For debug vs release builds you may want to have the latter run
> without gdb at all, just with or without sudo.

I honestly never run anything outside gdb, but that's a consequence of 
me not really working on things that routinely require it (e.g. 
performance code). We can add that, no problem though.

> 
>>
>>>
>>> - While I realise your primary concern here is an interactive script, I'd
>>>     tend towards requiring a cmdline arg to run in interactive mode and
>>>     instead printing the help usage when run without parameters. Just a
>>>     personal preference on my part though.
>>
>> I found it to be much faster to pick my targets, apps etc. using a couple of
>> interactive windows than to type out parameters I probably don't even
>> remember ahead of time (especially build configurations!), and I believe
>> it's more newbie-friendly that way, as I imagine very few people will want
>> to learn arguments for yet-another-script just to start using VSCode. It
>> would be my personal preference to leave it as default-to-TUI, but maybe
>> recognizing a widely used `-i` parameter would be a good compromise for
>> instant familiarity.
>>
> 
> Ok. There's always a -h option for me to get the cmdline parameters.
> 
> I also think if the script is ok with working off an existing build
> directory (or directories!), and only prompting for that, it would remove
> for me the real necessity of asking for a more cmdline-friendly version.

It sounds like this would really be something that a setup script would 
do better than a VSCode config generator.

So, assuming we want to move setup steps to another script and 
concentrate on VSCode configuration exclusively, my thinking of how it 
would work is as follows:

1) Assume we want multiple build directories: suggest automatically 
picking them up from the source directory, but support specifying one or 
more via command line arguments (or TUI, although I suspect that if 
setup moves to a separate script, there's very little need for a TUI in 
the VSCode config generator - it should be a mostly mechanical process 
at that point)

2) Now that we track multiple build directories, we can store them in a 
YAML file or something (e.g. .vscode/.dpdk_builds.yaml), and create 
tasks to switch between them as "active" to support e.g. different code 
analysis settings and stuff like that

3) All build tasks can work off the "active configuration", which means 
we don't need multiple compile tasks, but for launch tasks we may need 
separate ones, because different build dirs may build different apps

Let's assume the user just ran `meson configure` and changed something 
about one of their configurations. What do you think should happen next? 
I mean, if they added/removed an app/example to/from the build, we could 
detect that and auto-generate (or remove) the corresponding launch task. 
Or do you think it should be a manual step, i.e. the user should 
explicitly request regenerating/updating launch tasks? Maybe there 
should be an --update flag, to indicate that we're not creating new 
configurations but merely refreshing existing ones?

[1] https://git-scm.com/docs/git-worktree

-- 
Thanks,
Anatoly


^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-30  9:21           ` Burakov, Anatoly
@ 2024-07-30 10:31             ` Bruce Richardson
  2024-07-30 10:50               ` Burakov, Anatoly
  0 siblings, 1 reply; 21+ messages in thread
From: Bruce Richardson @ 2024-07-30 10:31 UTC (permalink / raw)
  To: Burakov, Anatoly; +Cc: dev, john.mcnamara

On Tue, Jul 30, 2024 at 11:21:25AM +0200, Burakov, Anatoly wrote:
> On 7/29/2024 6:41 PM, Bruce Richardson wrote:
> > On Mon, Jul 29, 2024 at 06:16:48PM +0200, Burakov, Anatoly wrote:
> > > On 7/29/2024 4:30 PM, Bruce Richardson wrote:
> > > > On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
> > > > > A lot of developers use Visual Studio Code as their primary IDE. This
> > > > > script generates a configuration file for VSCode that sets up basic build
> > > > > tasks, launch tasks, as well as C/C++ code analysis settings that will
> > > > > take into account compile_commands.json that is automatically generated
> > > > > by meson.
> > > > > 
> > > > > Files generated by script:
> > > > >    - .vscode/settings.json: stores variables needed by other files
> > > > >    - .vscode/tasks.json: defines build tasks
> > > > >    - .vscode/launch.json: defines launch tasks
> > > > >    - .vscode/c_cpp_properties.json: defines code analysis settings
> > > > > 
> > > > > The script uses a combination of globbing and meson file parsing to
> > > > > discover available apps, examples, and drivers, and generates a
> > > > > project-wide settings file, so that the user can later switch between
> > > > > debug/release/etc. configurations while keeping their desired apps,
> > > > > examples, and drivers, built by meson, and ensuring launch configurations
> > > > > still work correctly whatever the configuration selected.
> > > > > 
> > > > > This script uses whiptail as TUI, which is expected to be universally
> > > > > available as it is shipped by default on most major distributions.
> > > > > However, the script is also designed to be scriptable and can be run
> > > > > without user interaction, and have its configuration supplied from
> > > > > command-line arguments.
> > > > > 
> > > > > Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
> > > > > ---
> > > > > 
> > > > Just was trying this out, nice script, thanks.
> > > 
> > > Thanks for the feedback! Comments below.
> > > 

More comments inline below, but summarising after the fact here.

Still not entirely sure what way is best for all this, so please take all
current and previous suggestions with a pinch of salt. Based on what you
suggest and the ongoing discussion, my current thinking is:

* +1 to split the vscode config generation from the TUI. Both are also
  targeting different end-users - the TUI is for everyone looking to build
  DPDK, both devs and users, while the vscode config is for developers only.
* Let's ignore the multi-build-directory setup for now, if it makes it more
  complex for the simple cases of one build-dir.
* I think we should investigate having the vscode config generated from
  meson rather than the other way around.

See also below.

/Bruce

> > > > 
> > > > Initial thoughts concerning the build directory:
> > > > - the script doesn't actually create the build directory, so there is no
> > > >     guarantee that the build directory created will have the same parameters
> > > >     as that specified in the script run. I'd suggest in the case where the
> > > >     user runs the script and specifies build settings, that the build
> > > >     directory is then configured using those settings.
> > > 
> > > I'm not sure I follow.
> > > 
> > > The script creates a command for VSCode to create a build directory using
> > > configuration the user has supplied at script's run time. The directory is
> > > not created by the script, that is the job of meson build system. This
> > > script is merely codifying commands for meson to do that, with the
> > > expectation that the user is familiar with VSCode workflow and will go
> > > straight to build commands anyway, and will pick one of them. Are you
> > > suggesting running `meson setup` right after?
> > > 
> > 
> > Yes, that is what I was thinking, based on the assumption that running
> > "meson setup" should be a once-only task. I suppose overall it's a
> > different workflow to what you have - where you run meson setup repeatedly
> > each time you change a build type. My thinking for the approach to take
> > here is that your script firstly asks for a build directory, and then:
> > * if build-dir exists, pull settings you need from there, such as build
> >    type and the apps being built.
> > * if not existing, ask for config settings as you do now, and then run
> >    meson setup to create the build dir.
> 
> I guess the disconnect comes from the fact that I treat meson build
> directories as transient and disposable and never hesitate to wipe them and
> start over, whereas you seem to favor persistence. Since 99% of the time I'm
> using heavily reduced builds anyway (e.g. one app, one driver), repeatedly
> doing meson setup doesn't really hinder me in any way, but I suppose it
> would if I had more meaty builds.
> 

It mainly just seems inefficient to me. Ninja does a lot of processing to
minimise the work done whenever you make a configuration change, and you
bypass all that by nuking the directory from orbit (only way to be sure!)
and then creating a new one!

The other main downside of it (to my mind) is that the tracking of
settings for the build needs to be in vscode. I'd prefer meson itself to
be the "one source of truth" and vscode to be tweaking that, rather than
tracking everything itself.

> I think we also have a slightly different view of what this script should be
> - I envisioned a "one-stop-shop" for "open freshly cloned DPDK directory,
> run one script and off you go" (which stems from the fact that I heavily
> rely on Git Worktrees [1] in my workflow, so having an untouched source
> directory without any configuration in it is something I am constantly faced
> with), while you seem to favor picking up existing meson build and building
> a VSCode configuration around it.
> 
> I can see point in this, and I guess I actually unwittingly rolled two
> scripts into one - a TUI meson frontend, and a VSCode configuration
> generator. Perhaps it should very well be two scripts, not one? Because
> honestly, having something like what I built for TUI (the meson
> configuration frontend) is something I missed a lot and something I always
> wished our long-gone `setup.sh` script had, but didn't, and now with meson
> it's much simpler to do but we still don't have anything like that. Maybe
> this is our opportunity to provide a quicker "quick start" script, one that
> I could run and configure meson build without having to type anything. WDYT?
> 

Splitting it into two makes sense to me, yes, since as you point out there
are really two separate jobs involved here that one may want to roll with
separately.

> > 
> > Thereafter, the source for all build settings is not in vscode, but in
> > meson, and you just use "meson configure" from vscode to tweak whatever
> > needs tweaking, without affecting any other settings. Since you don't
> > affect any other settings there is no need to record what they should be.
> 
> Why would I be using meson configure from vscode then? I mean, the whole
> notion of having tasks in VSCode is so that I define a few common
> configurations and never touch them until I'm working on something else. If
> I have to type in configure commands anyway (because I would have to, to
> adjust configuration in meson), I might as well do so using terminal, and
> avoid dealing with meson from VSCode entirely?
> 
> (technically, there's a way to read arguments from a popup window - I
> suppose we could have this step recorded as a VSCode task)
> 

The reason I was thinking about this is that you don't want to expose
dozens of tasks in vscode for tweaking every possible meson setting. Sure,
have build and run tasks for the common options, but for "advanced use"
where the user wants to tweak a build setting, let them just do it from
the command line without having to re-run a config script to adjust the
settings.

<Snip>

> > > > 
> > > > - Finally, and semi-related, this script assumes that the user does
> > > >     everything in a single build directory. Just something to consider, but
> > > >     my own workflow till now has tended to keep multiple build directories
> > > >     around, generally a "build" directory, which is either release or
> > > >     debugoptimized type, and a separate "build-debug" directory + occasionally
> > > >     others for build testing. When doing incremental builds, the time taken to
> > > >     do two builds following a change is a lot less noticable than the time taken
> > > >     for periodic switches of a single build directory between debug and release
> > > >     mode.
> > > 
> > > The problem with that approach is the launch tasks, because a launch task
> > > can only ever point to one executable, so if you have multiple build
> > > directories, you'll have to have multiple launch tasks per app/example. I
> > > guess we can either tag them (e.g. "Launch dpdk-testpmd [debug]", "Launch
> > > dpdk-testpmd [asan]" etc.), or use some kind of indirection to "select
> > > active build configuration" (e.g. have one launch task but overwrite
> > > ${config:BUILDDIR} after request for configuration, so that launch tasks
> > > would pick up actual executable path at run time from settings). I would
> > > prefer the latter to be honest, as it's much easier to drop a script into
> > > ./vscode and run it together with "configure" command to switch between
> > > different build/launch configurations. What do you think?
> > > 
> > 
> > I think I'd prefer the former actually - to have explicit tasks always
> > listed for debug and release builds.
> > Not a big deal for me either way, I'll just hack in the extra tasks as I
> > need them, so it's a low-priority support item for me.
> 
> There's another issue though: code analysis. If you have multiple build
> directories, your C++ analysis settings (the .vscode/c_cpp_properties.json
> file) can only ever use one specific compile_commands.json from a specific
> build directory. I think if we were to support having multiple build dirs,
> we would *have to* implement something like "switch active configuration"
> thing anyway.
> 

Strictly speaking, yes. However, in my experience using eclipse as an IDE
in the past it doesn't matter that much which or how many build directories
are analysed. However, vscode may well be different in this regard.

Since I don't ever envisage myself doing everything always through vscode,
I'm happy enough with vscode managing a single build directory, and I can
manually worry about a second build directory myself.  Maybe let's park the
multi-build-dir stuff for now, unless others feel that it's something they
need.

> Whether we add tagged launch tasks is honestly a problem to be solved,
> because on the one hand we want them to be generated automatically (ideally
> after configure step), and on the other we also want to not interfere with
> any custom configuration the user has already added (such as e.g. command
> line arguments to existing launch tasks). We'll probably have to do config
> file parsing and updating the configuration, rather than regenerating it
> each time. Python's JSON also doesn't support comments, so for any comments
> that were added to configurations, we'd either lose them, or find a way to
> reinsert them post-generation.
> 

Have you considered generating the launch tasks from a script launched from
meson itself? Any time the configuration is changed, meson will re-run at
the next build and that can trigger re-generation of whatever vscode config
you need, including launch tasks for all the binaries. This would be
another advantage of splitting the script into two - one should look to make
the vscode-settings generation script usable from meson.

<snip>

> It sounds like this would really be something that a setup script would do
> better than a VSCode config generator.
> 
> So, assuming we want to move setup steps to another script and concentrate
> on VSCode configuration exclusively, my thinking of how it would work is as
> follows:
> 
> 1) Assume we want multiple build directories, suggest automatically picking
> them up from source directory but support specifying one or more from
> command line arguments (or TUI, although I suspect if setup moves to a
> separate script, there's very little need for TUI in VSCode config generator
> - it should be a mostly mechanical process at that point)
> 

I'm probably an outlier, so let's not over-design things for the
multi-build-directory case.

If we think about generating the vscode config from a meson run (via a
"generate-vscode-config" setting or something), that may switch the way in
which things are actually being done. In that case, a build directory would
register itself with the vscode config - creating a new one if not already
present.

> 2) Now that we track multiple build directories, we can store them in a YAML
> file or something (e.g. .vscode/.dpdk_builds.yaml), and create tasks to
> switch between them as "active" to support e.g. different code analysis
> settings and stuff like that
> 

Initially, maybe don't add this. For a first draft supporting
multiple directories, I'd start by adding prefixed duplicate tasks for
each directory registered.

> 3) All build tasks can work off "active configuration" which means we don't
> need multiple compile tasks, but for launch tasks we may need different ones
> because different build dirs may have different launch tasks
> 

Again, I'd just add tasks rather than bothering with active configs.

> Let's assume user just ran `meson configure` and changed something about one
> of their configurations. What do you think should happen next? I mean, if
> they added/removed an app/example to the build, we can detect that and
> auto-generate (or remove) a launch task, for example? Or do you think it
> should be a manual step, e.g. user should explicitly request
> regenerating/updating launch tasks? Maybe there should be an --update flag,
> to indicate that we're not creating new configurations but merely refreshing
> existing ones?

See above, this is a case where having the vscode config script callable
from meson would be perfect.

> 
> [1] https://git-scm.com/docs/git-worktree
>
Thanks for the link - seems to be automating a setup which I've been
approximately doing manually for some years now! :-) 

^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 1/1] devtools: add vscode configuration generator
  2024-07-30 10:31             ` Bruce Richardson
@ 2024-07-30 10:50               ` Burakov, Anatoly
  0 siblings, 0 replies; 21+ messages in thread
From: Burakov, Anatoly @ 2024-07-30 10:50 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev, john.mcnamara

On 7/30/2024 12:31 PM, Bruce Richardson wrote:
> On Tue, Jul 30, 2024 at 11:21:25AM +0200, Burakov, Anatoly wrote:
>> On 7/29/2024 6:41 PM, Bruce Richardson wrote:
>>> On Mon, Jul 29, 2024 at 06:16:48PM +0200, Burakov, Anatoly wrote:
>>>> On 7/29/2024 4:30 PM, Bruce Richardson wrote:
>>>>> On Mon, Jul 29, 2024 at 02:05:52PM +0100, Anatoly Burakov wrote:
>>>>>> A lot of developers use Visual Studio Code as their primary IDE. This
>>>>>> script generates a configuration file for VSCode that sets up basic build
>>>>>> tasks, launch tasks, as well as C/C++ code analysis settings that will
>>>>>> take into account compile_commands.json that is automatically generated
>>>>>> by meson.
>>>>>>
>>>>>> Files generated by script:
>>>>>>     - .vscode/settings.json: stores variables needed by other files
>>>>>>     - .vscode/tasks.json: defines build tasks
>>>>>>     - .vscode/launch.json: defines launch tasks
>>>>>>     - .vscode/c_cpp_properties.json: defines code analysis settings
>>>>>>
>>>>>> The script uses a combination of globbing and meson file parsing to
>>>>>> discover available apps, examples, and drivers, and generates a
>>>>>> project-wide settings file, so that the user can later switch between
>>>>>> debug/release/etc. configurations while keeping their desired apps,
>>>>>> examples, and drivers, built by meson, and ensuring launch configurations
>>>>>> still work correctly whatever the configuration selected.
>>>>>>
>>>>>> This script uses whiptail as TUI, which is expected to be universally
>>>>>> available as it is shipped by default on most major distributions.
>>>>>> However, the script is also designed to be scriptable and can be run
>>>>>> without user interaction, and have its configuration supplied from
>>>>>> command-line arguments.
>>>>>>
>>>>>> Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
>>>>>> ---
>>>>>>
>>>>> Just was trying this out, nice script, thanks.
>>>>
>>>> Thanks for the feedback! Comments below.
>>>>
> 
> More comments inline below, but summarising after the fact here.
> 
>   Still not entirely sure what way is best for all this so please take all
> current and previous suggestions with a pinch of salt. Based off what you
> suggest and the ongoing discuss my current thinking is:
> 
> * +1 to split the vscode config generation from the TUI. Both are also
>    targetting different end-users - the TUI is for everyone looking to build
>    DPDK, both devs and users, while the vscode config is for developers only.
> * Let's ignore the multi-build-directory setup for now, if it makes it more
>    complex for the simple cases of one build-dir.

Not really *that* much more complex, IMO. The only real issue is 
possible differences in code analysis behavior stemming from having the 
"wrong" build directory set up as the source of compile_commands.json. 
If you're OK with adding multiple tasks for multiple build directories, 
the rest becomes trivial: if launch tasks are per-build, they can 
reference per-build compile tasks and work seamlessly as "duplicate" 
commands.

And even then, for the first version we can probably drop the code 
analysis angle (just use the first detected config as the source), in 
which case we can pretty much support multiple build dirs for free, as 
we'd have to build all the infrastructure (e.g. config updates etc.) 
anyway if we want this process to be seamless.
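
To make the "duplicate" per-build tasks idea concrete, a quick sketch
(function name, labels and the testpmd path are illustrative):

    # each registered build dir gets its own tagged compile task and
    # launch config, so launch configs always point at the right binaries
    def tasks_for(label, builddir_var):
        build_task = {
            "label": f"[{label}] Compile",
            "type": "shell",
            "command": "meson compile",
            "options": {"cwd": f"${{config:{builddir_var}}}"},
        }
        launch_config = {
            "name": f"[{label}] Launch dpdk-testpmd",
            "type": "cppdbg",
            "request": "launch",
            "program": f"${{config:{builddir_var}}}/app/dpdk-testpmd",
        }
        return build_task, launch_config

    # e.g. tasks_for("build-debug", "build-debug.builddir")
    #      tasks_for("build-asan", "build-asan.builddir")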

> * I think we should investigate having the vscode config generated from
>    meson rather than the other way around.

It didn't occur to me that this was possible - it sounds like that's 
really the way to go!

<snip>


> Strictly speaking, yes. However, in my experience using eclipse as an IDE
> in the past it doesn't matter that much which or how many build directories
> are analysed. However, vscode may well be different in this regard.

Unless the user does *wildly* different things in their build 
directories (i.e. two dirs, one of which is used for cross-builds or 
something), I expect things to work without any additional effort, so 
you're right that for most practical purposes the result wouldn't really 
be different from having "proper" C++ analysis configurations.

> Since I don't ever envisage myself doing everything always through vscode,
> I'm happy enough with vscode managing a single build directory, and I can
> manually worry about a second build directory myself.  Maybe let's park the
> multi-build-dir stuff for now, unless others feel that it's something they
> need.

Well, I do strive to do most things with VSCode (that's why I had 
multiple configurations to begin with!), so it would benefit *my* 
workflow to support that :)

-- 
Thanks,
Anatoly


^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 0/1] Add Visual Studio Code configuration script
  2024-07-29 13:05 ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Anatoly Burakov
  2024-07-29 13:05   ` [RFC PATCH v2 1/1] devtools: add vscode configuration generator Anatoly Burakov
@ 2024-07-30 15:01   ` Bruce Richardson
  2024-07-30 15:14     ` Burakov, Anatoly
  1 sibling, 1 reply; 21+ messages in thread
From: Bruce Richardson @ 2024-07-30 15:01 UTC (permalink / raw)
  To: Anatoly Burakov; +Cc: dev, john.mcnamara

On Mon, Jul 29, 2024 at 02:05:51PM +0100, Anatoly Burakov wrote:
> Lots of developers (myself included) uses Visual Studio Code as their primary
> IDE for DPDK development. I have been successfully using various incarnations of
> this script internally to quickly set up my development trees whenever I need a
> new configuration, so this script is being shared in hopes that it will be
> useful both to new developers starting with DPDK, and to seasoned DPDK
> developers who are already using Visual Studio Code. It makes starting working
> on DPDK in Visual Studio Code so much easier!
> 
> ** NOTE: Currently, only x86 configuration is generated as I have no way to test
>    the code analysis configuration on any other platforms.
> 
> ** NOTE 2: this is not for *Visual Studio* the Windows IDE, this is for *Visual
>    Studio Code* the cross-platform code editor. Specifically, main target
>    audience for this script is people who either run DPDK directly on their
>    Linux machine, or who use Remote SSH functionality to connect to a remote
>    Linux machine and set up VSCode build there. No other OS's are currently
>    supported by the script.
> 
> (if you're unaware of what is Remote SSH, I highly suggest checking it out [1])
> 
> Philosophy behind this script is as follows:
> 
> - The assumption is made that a developer will not be using wildly different
>   configurations from build to build - usually, they build the same things, work
>   with the same set of apps/drivers for a while, then switch to something else,
>   at which point a new configuration is needed
> 
> - Some configurations I consider to be "common" are included: debug build, debug
>   optimized build, release build with docs, and ASan build (feel free to make
>   suggestions here!)
> 
> - By default, the script will not add any meson flags unless user requested it,
>   however it will create launch configurations for all apps because not
>   specifying any flags leads to all apps being enabled
> 
> - All parameters that can be adjusted by TUI are also available as command line
>   arguments, so while user interaction is the default (using whiptail), it's
>   actually not required and can be bypassed
> 

The management of dependencies of components to be built is obviously a
tricky area here, when specifying e.g. enable_drivers flags. It may be
possible to improve the situation in meson itself, but that probably
requires massive rework of the lib/meson.build, drivers/meson.build and
app/meson.build files to process the subdirs and save the results for later
use (effectively processing them twice, within the restriction that meson
only allows calling subdir() once).

In the meantime, as a better-than-nothing improvement, I've pushed a draft
patch to have meson produce a dependencies file as part of its processing[1].
That may be of use to you in doing new versions of the TUI - i.e. in the
background you could run a dummy meson config to /tmp and then process the
resulting deps file from it, to allow you to recursively enable
dependencies of the user-selected components.

Regards,
/Bruce

[1] https://patches.dpdk.org/project/dpdk/patch/20240730145508.551075-1-bruce.richardson@intel.com/

^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 0/1] Add Visual Studio Code configuration script
  2024-07-30 15:01   ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Bruce Richardson
@ 2024-07-30 15:14     ` Burakov, Anatoly
  2024-07-30 15:19       ` Bruce Richardson
  0 siblings, 1 reply; 21+ messages in thread
From: Burakov, Anatoly @ 2024-07-30 15:14 UTC (permalink / raw)
  To: Bruce Richardson; +Cc: dev, john.mcnamara

On 7/30/2024 5:01 PM, Bruce Richardson wrote:
> On Mon, Jul 29, 2024 at 02:05:51PM +0100, Anatoly Burakov wrote:
>> Lots of developers (myself included) uses Visual Studio Code as their primary
>> IDE for DPDK development. I have been successfully using various incarnations of
>> this script internally to quickly set up my development trees whenever I need a
>> new configuration, so this script is being shared in hopes that it will be
>> useful both to new developers starting with DPDK, and to seasoned DPDK
>> developers who are already using Visual Studio Code. It makes starting working
>> on DPDK in Visual Studio Code so much easier!
>>
>> ** NOTE: Currently, only x86 configuration is generated as I have no way to test
>>     the code analysis configuration on any other platforms.
>>
>> ** NOTE 2: this is not for *Visual Studio* the Windows IDE, this is for *Visual
>>     Studio Code* the cross-platform code editor. Specifically, main target
>>     audience for this script is people who either run DPDK directly on their
>>     Linux machine, or who use Remote SSH functionality to connect to a remote
>>     Linux machine and set up VSCode build there. No other OS's are currently
>>     supported by the script.
>>
>> (if you're unaware of what is Remote SSH, I highly suggest checking it out [1])
>>
>> Philosophy behind this script is as follows:
>>
>> - The assumption is made that a developer will not be using wildly different
>>    configurations from build to build - usually, they build the same things, work
>>    with the same set of apps/drivers for a while, then switch to something else,
>>    at which point a new configuration is needed
>>
>> - Some configurations I consider to be "common" are included: debug build, debug
>>    optimized build, release build with docs, and ASan build (feel free to make
>>    suggestions here!)
>>
>> - By default, the script will not add any meson flags unless user requested it,
>>    however it will create launch configurations for all apps because not
>>    specifying any flags leads to all apps being enabled
>>
>> - All parameters that can be adjusted by TUI are also available as command line
>>    arguments, so while user interaction is the default (using whiptail), it's
>>    actually not required and can be bypassed
>>
> 
> The management of dependencies of components to be built is obviously a
> tricky area here, when specifying e.g. enable_drivers flags. It may be
> possible to improve the situation in meson itself, but that probably
> requires massive rework of the lib/meson.build, drivers/meson.build and
> app/meson.build files to process the subdirs and save the results for later
> use (effectively process them twice within the restrictions of meson only
> allowing subdir once).
> 
> In the meantime, as a better-than-nothing improvement, I've pushed a draft
> patch to have meson produce a dependencies file as part of its processing[1].
> That may be of use to you in doing new versions of the TUI - i.e. in the
> background you could run a dummy meson config to /tmp and then process the
> resulting deps file from it, to allow you to recursively enable
> dependencies of the user-selected components..

Thanks, this looks very interesting! It's a shame it can't be done 
without creating a build directory at all (e.g. by using meson dummy 
runs or something), but like you said, better than nothing!
> 
> Regards,
> /Bruce
> 
> [1] https://patches.dpdk.org/project/dpdk/patch/20240730145508.551075-1-bruce.richardson@intel.com/

-- 
Thanks,
Anatoly


^ permalink raw reply	[flat|nested] 21+ messages in thread

* Re: [RFC PATCH v2 0/1] Add Visual Studio Code configuration script
  2024-07-30 15:14     ` Burakov, Anatoly
@ 2024-07-30 15:19       ` Bruce Richardson
  0 siblings, 0 replies; 21+ messages in thread
From: Bruce Richardson @ 2024-07-30 15:19 UTC (permalink / raw)
  To: Burakov, Anatoly; +Cc: dev, john.mcnamara

On Tue, Jul 30, 2024 at 05:14:29PM +0200, Burakov, Anatoly wrote:
> On 7/30/2024 5:01 PM, Bruce Richardson wrote:
> > On Mon, Jul 29, 2024 at 02:05:51PM +0100, Anatoly Burakov wrote:
> > > Lots of developers (myself included) uses Visual Studio Code as their primary
> > > IDE for DPDK development. I have been successfully using various incarnations of
> > > this script internally to quickly set up my development trees whenever I need a
> > > new configuration, so this script is being shared in hopes that it will be
> > > useful both to new developers starting with DPDK, and to seasoned DPDK
> > > developers who are already using Visual Studio Code. It makes starting working
> > > on DPDK in Visual Studio Code so much easier!
> > > 
> > > ** NOTE: Currently, only x86 configuration is generated as I have no way to test
> > >     the code analysis configuration on any other platforms.
> > > 
> > > ** NOTE 2: this is not for *Visual Studio* the Windows IDE, this is for *Visual
> > >     Studio Code* the cross-platform code editor. Specifically, main target
> > >     audience for this script is people who either run DPDK directly on their
> > >     Linux machine, or who use Remote SSH functionality to connect to a remote
> > >     Linux machine and set up VSCode build there. No other OS's are currently
> > >     supported by the script.
> > > 
> > > (if you're unaware of what is Remote SSH, I highly suggest checking it out [1])
> > > 
> > > Philosophy behind this script is as follows:
> > > 
> > > - The assumption is made that a developer will not be using wildly different
> > >    configurations from build to build - usually, they build the same things, work
> > >    with the same set of apps/drivers for a while, then switch to something else,
> > >    at which point a new configuration is needed
> > > 
> > > - Some configurations I consider to be "common" are included: debug build, debug
> > >    optimized build, release build with docs, and ASan build (feel free to make
> > >    suggestions here!)
> > > 
> > > - By default, the script will not add any meson flags unless user requested it,
> > >    however it will create launch configurations for all apps because not
> > >    specifying any flags leads to all apps being enabled
> > > 
> > > - All parameters that can be adjusted by TUI are also available as command line
> > >    arguments, so while user interaction is the default (using whiptail), it's
> > >    actually not required and can be bypassed
> > > 
> > 
> > The management of dependencies of components to be built is obviously a
> > tricky area here, when specifying e.g. enable_drivers flags. It may be
> > possible to improve the situation in meson itself, but that probably
> > requires massive rework of the lib/meson.build, drivers/meson.build and
> > app/meson.build files to process the subdirs and save the results for later
> > use (effectively process them twice within the restrictions of meson only
> > allowing subdir once).
> > 
> > In the meantime, as a better-than-nothing improvement, I've pushed a draft
> > patch to have meson produce a dependencies file as part of its processing[1].
> > That may be of use to you in doing new versions of the TUI - i.e. in the
> > background you could run a dummy meson config to /tmp and then process the
> > resulting deps file from it, to allow you to recursively enable
> > dependencies of the user-selected components..
> 
> Thanks, this looks very interesting! It's a shame it can't be done without
> creating a build directory at all (e.g. by using meson dummy runs or
> something), but like you said, better than nothing!

Yes. I was wracking my brains to find a better way to do this, but haven't
come up with one yet.

^ permalink raw reply	[flat|nested] 21+ messages in thread

* [RFC PATCH v3 0/1] Add Visual Studio Code configuration script
  2024-07-26 12:42 [RFC PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
  2024-07-26 12:42 ` [RFC PATCH v1 1/1] devtools: add vscode configuration generator Anatoly Burakov
  2024-07-29 13:05 ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Anatoly Burakov
@ 2024-07-31 13:33 ` Anatoly Burakov
  2024-07-31 13:33   ` [RFC PATCH v3 1/1] buildtools: add vscode configuration generator Anatoly Burakov
  2024-09-02 12:17 ` [PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
  3 siblings, 1 reply; 21+ messages in thread
From: Anatoly Burakov @ 2024-07-31 13:33 UTC (permalink / raw)
  To: dev; +Cc: bruce.richardson, john.mcnamara

Lots of developers (myself included) use Visual Studio Code as their primary
IDE for DPDK development. I have been successfully using various incarnations of
this script internally to quickly set up my development trees whenever I need a
new configuration, so this script is being shared in hopes that it will be
useful both to new developers starting with DPDK, and to seasoned DPDK
developers who are already using Visual Studio Code. It makes starting working
on DPDK in Visual Studio Code so much easier!

** NOTE: Currently, only x86 configuration is generated as I have no way to test
   the code analysis configuration on any other platforms.

** NOTE 2: this is not for *Visual Studio* the Windows IDE, this is for *Visual
   Studio Code* the cross-platform code editor. Specifically, the main target
   audience for this script is people who either run DPDK directly on their
   Linux machine, or who use Remote SSH functionality to connect to a remote
   Linux machine and set up VSCode build there. No other OS's are currently
   supported by the script.

(if you're unaware of what Remote SSH is, I highly suggest checking it out [1])

Philosophy behind this script is as follows:

- Any build directory created will automatically add itself to the VSCode
  configuration (an ignore mechanism for e.g. test-meson-build.sh is WIP)

- Launch configuration is created using `which gdb`, so by default non-root
  users will have to do additional system configuration for things to work

- All of the interactive stuff has now been taken out and is planned to be
  included in a separate set of scripts, so this script now concerns itself only
  with adding build/launch targets to the user's configuration and not much else

Please feel free to make any suggestions!

[1] https://code.visualstudio.com/docs/remote/ssh

Anatoly Burakov (1):
  buildtools: add vscode configuration generator

 app/meson.build               |  12 +-
 buildtools/gen-vscode-conf.py | 442 ++++++++++++++++++++++++++++++++++
 buildtools/meson.build        |   5 +
 examples/meson.build          |  13 +-
 meson.build                   |  11 +
 5 files changed, 481 insertions(+), 2 deletions(-)
 create mode 100755 buildtools/gen-vscode-conf.py

-- 
2.43.5


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [RFC PATCH v3 1/1] buildtools: add vscode configuration generator
  2024-07-31 13:33 ` [RFC PATCH v3 " Anatoly Burakov
@ 2024-07-31 13:33   ` Anatoly Burakov
  0 siblings, 0 replies; 21+ messages in thread
From: Anatoly Burakov @ 2024-07-31 13:33 UTC (permalink / raw)
  To: dev, Bruce Richardson; +Cc: john.mcnamara

A lot of developers use Visual Studio Code as their primary IDE. This
script will be called from within the meson build process, and will generate
a configuration file for VSCode that sets up basic build tasks, launch
tasks, as well as C/C++ code analysis settings that will take into
account compile_commands.json that is automatically generated by meson.

Files generated by script:
 - .vscode/settings.json: stores variables needed by other files
 - .vscode/tasks.json: defines build tasks
 - .vscode/launch.json: defines launch tasks
 - .vscode/c_cpp_properties.json: defines code analysis settings

Multiple, as well as out-of-source-tree, build directories are supported,
and the script will generate separate configuration items for each build
directory created by the user, tagging them for convenience.

Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
---

Notes:
    RFCv2 -> RFCv3:
    - Following feedback from Bruce, reworked to be minimal script run from meson
    - Moved to buildtools
    - Support for multiple build directories is now the default
    - All targets are automatically added to all configuration files
    
    RFCv1 -> RFCv2:
    
    - No longer disable apps and drivers if nothing was specified via command line
      or TUI, and warn user about things being built by default
    - Generate app launch configuration by default for when no apps are selected
    - Added parameters:
      - --force to avoid overwriting existing config
      - --common-conf to specify global meson flags applicable to all configs
      - --gdbsudo/--no-gdbsudo to specify gdbsudo behavior
    - Autodetect gdbsudo/gdb from UID
    - Updated comments, error messages, fixed issues with user interaction
    - Improved handling of wildcards and driver dependencies
    - Fixed a few bugs in dependency detection due to incorrect parsing
    - [Stephen] flake8 is happy

 app/meson.build               |  12 +-
 buildtools/gen-vscode-conf.py | 442 ++++++++++++++++++++++++++++++++++
 buildtools/meson.build        |   5 +
 examples/meson.build          |  13 +-
 meson.build                   |  11 +
 5 files changed, 481 insertions(+), 2 deletions(-)
 create mode 100755 buildtools/gen-vscode-conf.py

diff --git a/app/meson.build b/app/meson.build
index 5b2c80c7a1..cf0eda3d5f 100644
--- a/app/meson.build
+++ b/app/meson.build
@@ -114,7 +114,17 @@ foreach app:apps
         link_libs = dpdk_static_libraries + dpdk_drivers
     endif
 
-    exec = executable('dpdk-' + name,
+    # add to Visual Studio Code launch configuration
+    exe_name = 'dpdk-' + name
+    launch_path = join_paths(meson.current_build_dir(), exe_name)
+    # we don't want to block the build if this command fails
+    result = run_command(vscode_conf_gen_cmd + ['--launch', launch_path], check: false)
+    if result.returncode() != 0
+        warning('Failed to generate Visual Studio Code launch configuration for "' + name + '"')
+        message(result.stderr())
+    endif
+
+    exec = executable(exe_name,
             sources,
             c_args: cflags,
             link_args: ldflags,
diff --git a/buildtools/gen-vscode-conf.py b/buildtools/gen-vscode-conf.py
new file mode 100755
index 0000000000..fcc6469065
--- /dev/null
+++ b/buildtools/gen-vscode-conf.py
@@ -0,0 +1,442 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 Intel Corporation
+#
+
+"""Visual Studio Code configuration generator script."""
+
+# This script is meant to be run by meson build system to generate build and
+# launch commands for a specific build directory for Visual Studio Code IDE.
+#
+# Even though this script will generate settings/tasks/launch/code analysis
+# configuration for VSCode, we can't actually just regenerate the files,
+# because we want to support multiple build directories, as well as not
+# destroy any configuration user has created between runs of this script.
+# Therefore, we need some config file handling infrastructure. Luckily, VSCode
+# configs are all JSON, so we can just use json module to handle them. Of
+# course, we will lose any user comments in the files, but that's a small price
+# to pay for this sort of automation.
+#
+# Since this script will be run by meson, we can forego any parsing or anything
+# to do with the build system, and just rely on the fact that we get all of our
+# configuration from command-line.
+
+import argparse
+import ast
+import json
+import os
+import shutil
+from collections import OrderedDict
+from sys import stderr, exit as _exit
+from typing import List, Dict, Any
+
+
+class ConfigCtx:
+    """POD class to keep data associated with config."""
+    def __init__(self, build_dir: str, source_dir: str, launch: List[str]):
+        self.build_dir = build_dir
+        self.source_dir = source_dir
+        self.config_dir = os.path.join(source_dir, '.vscode')
+        # we don't have any mechanism to label things, so we're just going to
+        # use build dir basename as the label, and hope user doesn't create
+        # different build directories with the same name
+        self.label = os.path.basename(build_dir)
+        self.builddir_var = f'{self.label}.builddir'
+        self.launch = launch
+
+        settings_fname = 'settings.json'
+        tasks_fname = 'tasks.json'
+        launch_fname = 'launch.json'
+        analysis_fname = 'c_cpp_properties.json'
+        settings_tmp_fname = f'.{settings_fname}.{self.label}.tmp'
+        tasks_tmp_fname = f'.{tasks_fname}.{self.label}.tmp'
+        launch_tmp_fname = f'.{launch_fname}.{self.label}.tmp'
+        analysis_tmp_fname = f'.{analysis_fname}.{self.label}.tmp'
+
+        self.settings_path = os.path.join(self.config_dir, settings_fname)
+        self.tasks_path = os.path.join(self.config_dir, tasks_fname)
+        self.launch_path = os.path.join(self.config_dir, launch_fname)
+        self.analysis_path = os.path.join(self.config_dir, analysis_fname)
+
+        # we want to write into temporary files at first
+        self.settings_tmp = os.path.join(self.config_dir, settings_tmp_fname)
+        self.tasks_tmp = os.path.join(self.config_dir, tasks_tmp_fname)
+        self.launch_tmp = os.path.join(self.config_dir, launch_tmp_fname)
+        self.analysis_tmp = os.path.join(self.config_dir, analysis_tmp_fname)
+
+        # we don't want to mess with files if we didn't change anything
+        self.settings_changed = False
+        self.tasks_changed = False
+        self.launch_changed = False
+        self.analysis_changed = False
+
+
+class Boolifier(ast.NodeTransformer):
+    """Replace JSON "true" with Python "True"."""
+    def visit_Name(self, node: ast.Name) -> ast.Constant:
+        """Visitor for Name nodes."""
+        if node.id == 'true':
+            return ast.Constant(value=True)
+        elif node.id == 'false':
+            return ast.Constant(value=False)
+        return node
+
+
+def _parse_eval(data: str) -> Dict[str, Any]:
+    """Use AST and literal_eval to parse JSON."""
+    # JSON syntax is, for the most part, valid Python dictionary literal, aside
+    # from a small issue of capitalized booleans. so, we will try to parse
+    # JSON into an AST, replace "true"/"false" with "True"/"False", and then
+    # reparse the AST into a Python object
+    parsed = ast.parse(data)
+    unparsed = ast.unparse(Boolifier().visit(parsed))
+    # we parsed AST, now walk it and replace ast.Name nodes with booleans for
+    # actual AST boolean literals of type ast.Boolean
+    ast_data = ast.literal_eval(unparsed)
+    return ast_data
+
+
+def _load_json(file: str) -> Dict[str, Any]:
+    """Load JSON file."""
+    with open(file, 'r', encoding='utf-8') as f:
+        data = f.read()
+        try:
+            return json.loads(data)
+        except json.JSONDecodeError:
+            # Python's JSON parser doesn't like trailing commas but VSCode's
+            # JSON parser does not consider them to be syntax errors, so they
+            # may be present in user's configuration files. we can try to parse
+            # JSON as Python dictionary literal, and see if it works. if it
+            # doesn't, there's probably a syntax error anyway, so re-raise.
+            try:
+                return _parse_eval(data)
+            except (ValueError, TypeError, SyntaxError,
+                    MemoryError, RecursionError):
+                pass
+            raise
+
+
+def _dump_json(file: str, obj: Dict[str, Any]) -> None:
+    """Write JSON file."""
+    with open(file, 'w') as f:
+        json.dump(obj, f, indent=4)
+
+
+def _overwrite(src: str, dst: str) -> None:
+    """Overwrite dst file with src file."""
+    shutil.copyfile(src, dst)
+    # unlink src
+    os.unlink(src)
+
+
+def _gen_sorter(order: List[str]) -> Any:
+    """Sort dictionary by order."""
+
+    # JSON doesn't have sort order, but we want to be user friendly and display
+    # certain properties above others as they're more important. This function
+    # will return a closure that can be used to re-sort a specific object using
+    # OrderedDict and an ordered list of properties.
+    def _sorter(obj: Dict[str, Any]) -> OrderedDict[str, Any]:
+        d = OrderedDict()
+        # step 1: go through all properties in order and re-add them
+        for prop in order:
+            if prop in obj:
+                d[prop] = obj[prop]
+        # step 2: get all properties of the object, remove those that we have
+        #         already added, and sort them alphabetically
+        for prop in sorted(set(obj.keys()) - set(order)):
+            d[prop] = obj[prop]
+        # we're done: now all objects will have vaguely constant sort order
+        return d
+    return _sorter
+
+
+def _add_to_obj_list(obj_list: List[Dict[str, Any]],
+                     key: str, obj: Dict[str, Any]) -> bool:
+    """Add object to list if it doesn't already exist."""
+    for o in obj_list:
+        if o[key] == obj[key]:
+            return False
+    obj_list.append(obj)
+    return True
+
+
+def _process_settings(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update settings.json."""
+    try:
+        settings_obj = _load_json(ctx.settings_path)
+    except FileNotFoundError:
+        settings_obj = {}
+
+    # add build to settings if it doesn't exist
+    if ctx.builddir_var not in settings_obj:
+        ctx.settings_changed = True
+    settings_obj.setdefault(ctx.builddir_var, ctx.build_dir)
+
+    # add path ignore setting if it's inside the source dir
+    cpath = os.path.commonpath([ctx.source_dir, ctx.build_dir])
+    if cpath == ctx.source_dir:
+        # find path within source tree
+        relpath = os.path.relpath(ctx.build_dir, ctx.source_dir) + os.sep
+
+        # note if we need to change anything
+        if 'files.exclude' not in settings_obj:
+            ctx.settings_changed = True
+        elif relpath not in settings_obj['files.exclude']:
+            ctx.settings_changed = True
+
+        exclude = settings_obj.setdefault('files.exclude', {})
+        exclude.setdefault(relpath, True)
+        settings_obj['files.exclude'] = exclude
+
+    return settings_obj
+
+
+def _process_tasks(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update tasks.json."""
+    try:
+        outer_tasks_obj = _load_json(ctx.tasks_path)
+    except FileNotFoundError:
+        outer_tasks_obj = {
+            "version": "2.0.0",
+            "tasks": [],
+            "inputs": []
+        }
+    inner_tasks_obj = outer_tasks_obj.setdefault('tasks', [])
+    inputs_obj = outer_tasks_obj.setdefault('inputs', [])
+
+    # generate task object sorter
+    _sort_task = _gen_sorter(['label', 'detail', 'type', 'command', 'args',
+                              'options', 'problemMatcher', 'group'])
+
+    # generate our would-be configuration
+
+    # first, we need a build task
+    build_task = {
+        "label": f"[{ctx.label}] Compile",
+        "detail": f"Run `ninja` command for {ctx.label}",
+        "type": "shell",
+        "command": "meson compile",
+        "options": {
+            "cwd": f'${{config:{ctx.builddir_var}}}'
+        },
+        "problemMatcher": {
+            "base": "$gcc",
+            "fileLocation": ["relative", f"${{config:{ctx.builddir_var}}}"]
+        },
+        "group": "build"
+    }
+    # we also need a meson configure task with input
+    configure_task = {
+        "label": f"[{ctx.label}] Configure",
+        "detail": f"Run `meson configure` command for {ctx.label}",
+        "type": "shell",
+        "command": "meson configure ${input:mesonConfigureArg}",
+        "options": {
+            "cwd": f'${{config:{ctx.builddir_var}}}'
+        },
+        "problemMatcher": [],
+        "group": "build"
+    }
+    # finally, add input object
+    input_arg = {
+        "id": "mesonConfigureArg",
+        "type": "promptString",
+        "description": "Enter meson configure arguments",
+        "default": ""
+    }
+
+    # sort our tasks
+    build_task = _sort_task(build_task)
+    configure_task = _sort_task(configure_task)
+
+    # add only if task doesn't already exist
+    ctx.tasks_changed |= _add_to_obj_list(inner_tasks_obj, 'label',
+                                          build_task)
+    ctx.tasks_changed |= _add_to_obj_list(inner_tasks_obj, 'label',
+                                          configure_task)
+    ctx.tasks_changed |= _add_to_obj_list(inputs_obj, 'id', input_arg)
+
+    # replace nodes
+    outer_tasks_obj['tasks'] = inner_tasks_obj
+    outer_tasks_obj['inputs'] = inputs_obj
+
+    # we're ready
+    return outer_tasks_obj
+
+
+def _process_launch(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update launch.json."""
+    try:
+        launch_obj = _load_json(ctx.launch_path)
+    except FileNotFoundError:
+        launch_obj = {
+            "version": "0.2.0",
+            "configurations": []
+        }
+    configurations_obj = launch_obj.setdefault('configurations', [])
+
+    # generate launch task sorter
+    _sort_launch = _gen_sorter(['name', 'type', 'request', 'program', 'cwd',
+                                'preLaunchTask', 'environment', 'args',
+                                'MIMode', 'miDebuggerPath', 'setupCommands'])
+
+    gdb_path = shutil.which('gdb')
+    for target in ctx.launch:
+        # target will be a full path, we need to get relative to build path
+        exe_path = os.path.relpath(target, ctx.build_dir)
+        name = f"[{ctx.label}] Launch {exe_path}"
+        # generate config from template
+        launch_config = {
+            "name": name,
+            "type": "cppdbg",
+            "request": "launch",
+            "program": f"${{config:{ctx.builddir_var}}}/{exe_path}",
+            "args": [],
+            "cwd": "${workspaceFolder}",
+            "environment": [],
+            "MIMode": "gdb",
+            "miDebuggerPath": gdb_path,
+            "preLaunchTask": f"[{ctx.label}] Compile",
+            "setupCommands": [
+                {
+                    "description": "Enable pretty-printing for gdb",
+                    "text": "-gdb-set print pretty on",
+                    "ignoreFailures": True
+                }
+            ],
+        }
+        # sort keys
+        launch_config = _sort_launch(launch_config)
+        # add to configurations
+        ctx.launch_changed |= _add_to_obj_list(configurations_obj, 'name',
+                                               launch_config)
+
+    # replace the configuration object
+    launch_obj['configurations'] = configurations_obj
+
+    # we're ready
+    return launch_obj
+
+
+def _process_analysis(ctx: ConfigCtx) -> Dict[str, Any]:
+    """Update c_cpp_properties.json."""
+    try:
+        analysis_obj = _load_json(ctx.analysis_path)
+    except FileNotFoundError:
+        analysis_obj = {
+            "version": 4,
+            "configurations": []
+        }
+    configurations_obj = analysis_obj.setdefault('configurations', [])
+
+    # generate analysis config sorter
+    _sort_analysis = _gen_sorter(['name', 'includePath', 'compilerPath',
+                                  'cStandard', 'cppStandard',
+                                  'intelliSenseMode', 'compileCommands'])
+
+    # TODO: pick up more configuration from meson (e.g. OS, platform, compiler)
+
+    config_obj = {
+        "name": "Linux",
+        "includePath": [
+                f"${{config:{ctx.builddir_var}}}/",
+                # hardcode everything to x86/Linux for now
+                "${workspaceFolder}/lib/eal/x86",
+                "${workspaceFolder}/lib/eal/linux",
+                "${workspaceFolder}/**"
+        ],
+        "compilerPath": "/usr/bin/gcc",
+        "cStandard": "c99",
+        "cppStandard": "c++17",
+        "intelliSenseMode": "${default}",
+        "compileCommands":
+        f"${{config:{ctx.builddir_var}}}/compile_commands.json"
+    }
+    # sort configuration
+    config_obj = _sort_analysis(config_obj)
+
+    # add it to config obj
+    ctx.analysis_changed |= _add_to_obj_list(configurations_obj, 'name',
+                                             config_obj)
+
+    # we're done
+    analysis_obj['configurations'] = configurations_obj
+
+    return analysis_obj
+
+
+def _gen_config(ctx: ConfigCtx) -> None:
+    """Generate all config files."""
+    # ensure config dir exists
+    os.makedirs(ctx.config_dir, exist_ok=True)
+
+    # generate all JSON objects and write them to temp files
+    settings_obj = _process_settings(ctx)
+    _dump_json(ctx.settings_tmp, settings_obj)
+
+    tasks_obj = _process_tasks(ctx)
+    _dump_json(ctx.tasks_tmp, tasks_obj)
+
+    launch_obj = _process_launch(ctx)
+    _dump_json(ctx.launch_tmp, launch_obj)
+
+    analysis_obj = _process_analysis(ctx)
+    _dump_json(ctx.analysis_tmp, analysis_obj)
+
+
+def _main() -> int:
+    parser = argparse.ArgumentParser(
+        description='Generate VSCode configuration')
+    # where we are being called from
+    parser.add_argument('--build-dir', required=True, help='Build directory')
+    # where the sources are
+    parser.add_argument('--source-dir', required=True, help='Source directory')
+    # launch configuration item, can be multiple
+    parser.add_argument('--launch', action='append',
+                        help='Launch path for executable')
+    parser.epilog = "This script is not meant to be run manually."
+    # parse arguments
+    args = parser.parse_args()
+
+    # canonicalize all paths
+    build_dir = os.path.realpath(args.build_dir)
+    source_dir = os.path.realpath(args.source_dir)
+    if args.launch:
+        launch = [os.path.realpath(lp) for lp in args.launch]
+    else:
+        launch = []
+
+    ctx = ConfigCtx(build_dir, source_dir, launch)
+
+    try:
+        _gen_config(ctx)
+        # we finished configuration successfully, update if needed
+        update_dict = {
+            ctx.settings_path: (ctx.settings_tmp, ctx.settings_changed),
+            ctx.tasks_path: (ctx.tasks_tmp, ctx.tasks_changed),
+            ctx.launch_path: (ctx.launch_tmp, ctx.launch_changed),
+            ctx.analysis_path: (ctx.analysis_tmp, ctx.analysis_changed)
+        }
+        for path, t in update_dict.items():
+            tmp_path, changed = t
+            if changed:
+                _overwrite(tmp_path, path)
+            else:
+                os.unlink(tmp_path)
+
+        return 0
+    except json.JSONDecodeError as e:
+        # remove all temporary files we may have created
+        for tmp in [ctx.settings_tmp, ctx.tasks_tmp, ctx.launch_tmp,
+                    ctx.analysis_tmp]:
+            if os.path.exists(tmp):
+                os.unlink(tmp)
+        # if we fail to load JSON, output error
+        print(f"Error: {e}", file=stderr)
+
+        return 1
+
+
+if __name__ == '__main__':
+    _exit(_main())
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..7d2dc501d6 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,11 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
 get_test_suites_cmd = py3 + files('get-test-suites.py')
 has_hugepages_cmd = py3 + files('has-hugepages.py')
 cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+# Visual Studio Code conf generator always requires build and source root
+vscode_conf_gen_cmd = py3 + files('gen-vscode-conf.py') + [
+        '--build-dir', dpdk_build_root,
+        '--source-dir', dpdk_source_root
+    ]
 
 # install any build tools that end-users might want also
 install_data([
diff --git a/examples/meson.build b/examples/meson.build
index 8e8968a1fa..9e59223d3f 100644
--- a/examples/meson.build
+++ b/examples/meson.build
@@ -124,7 +124,18 @@ foreach example: examples
     if allow_experimental_apis
         cflags += '-DALLOW_EXPERIMENTAL_API'
     endif
-    executable('dpdk-' + name, sources,
+
+    # add to Visual Studio Code launch configuration
+    exe_name = 'dpdk-' + name
+    launch_path = join_paths(meson.current_build_dir(), exe_name)
+    # we don't want to block the build if this command fails
+    result = run_command(vscode_conf_gen_cmd + ['--launch', launch_path], check: false)
+    if result.returncode() != 0
+        warning('Failed to generate Visual Studio Code launch configuration for "' + name + '"')
+        message(result.stderr())
+    endif
+
+    executable(exe_name, sources,
             include_directories: includes,
             link_whole: link_whole_libs,
             link_args: ldflags,
diff --git a/meson.build b/meson.build
index 8b248d4505..df6115d098 100644
--- a/meson.build
+++ b/meson.build
@@ -117,6 +117,17 @@ if meson.is_subproject()
     subdir('buildtools/subproject')
 endif
 
+# if no apps or examples were enabled, no Visual Studio Code config was
+# generated, but we still need build, code analysis etc. configuration to be
+# present, so generate it just in case (it will have no effect if the
+# configuration was already generated by apps/examples). also, when running
+# this command, we don't want to block the build if it fails.
+result = run_command(vscode_conf_gen_cmd, check: false)
+if result.returncode() != 0
+    warning('Failed to generate Visual Studio Code configuration')
+    message(result.stderr())
+endif
+
 # Final output, list all the parts to be built.
 # This does not affect any part of the build, for information only.
 output_message = '\n=================\nApplications Enabled\n=================\n'
-- 
2.43.5


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [PATCH v1 0/1] Add Visual Studio Code configuration script
  2024-07-26 12:42 [RFC PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
                   ` (2 preceding siblings ...)
  2024-07-31 13:33 ` [RFC PATCH v3 " Anatoly Burakov
@ 2024-09-02 12:17 ` Anatoly Burakov
  2024-09-02 12:17   ` [PATCH v1 1/1] buildtools: add VSCode configuration generator Anatoly Burakov
  3 siblings, 1 reply; 21+ messages in thread
From: Anatoly Burakov @ 2024-09-02 12:17 UTC (permalink / raw)
  To: dev; +Cc: john.mcnamara, bruce.richardson

Lots of developers (myself included) use Visual Studio Code as their primary
IDE for DPDK development. I have been successfully using various incarnations of
this script internally to quickly set up my development trees whenever I need a
new configuration, so this script is being shared in hopes that it will be
useful both to new developers starting with DPDK, and to seasoned DPDK
developers who already use, or may want to start using, Visual Studio Code. It
makes getting started with DPDK in Visual Studio Code so much easier!

** NOTE: While code analysis configuration is now populated from Meson and
   should pick up platform- or OS-specific configuration, I have no way to test
   the code analysis configuration on anything but Linux/x86.

** NOTE 2: this is not for *Visual Studio* the Windows IDE, this is for *Visual
   Studio Code* the cross-platform code editor. Specifically, the main target
   audience for this script is people who either run DPDK directly on their
   Linux machine, or who use Remote SSH functionality to connect to a remote
   Linux machine and set up VSCode builds there. While the script should in theory
   work with any OS/platform supported by DPDK, it was not tested under anything
   but Linux/x86.

(if you're unaware of what Remote SSH is, I highly suggest checking it out [1])

Philosophy behind this script is as follows:

- Any build directory created will automatically add itself to the VSCode
  configuration

- Launch configuration is created using `which gdb`, so by default non-root
  users will have to do additional system configuration for things to work. This
  is now documented in the new "Integrating DPDK with IDEs" section.

- All of the interactive stuff has now been taken out and is planned to be
  included in a separate set of scripts, so this script now concerns itself only
  with adding build/launch targets to the user's configuration and not much else

Please feel free to make any suggestions!

[1] https://code.visualstudio.com/docs/remote/ssh

Anatoly Burakov (1):
  buildtools: add VSCode configuration generator

 app/meson.build                             |  14 +-
 buildtools/gen-vscode-conf.py               | 570 ++++++++++++++++++++
 buildtools/meson.build                      |   5 +
 devtools/test-meson-builds.sh               |   3 +
 doc/guides/contributing/ide_integration.rst |  85 +++
 doc/guides/contributing/index.rst           |   1 +
 doc/guides/rel_notes/release_24_11.rst      |   5 +
 examples/meson.build                        |  15 +-
 meson.build                                 |  14 +
 9 files changed, 710 insertions(+), 2 deletions(-)
 create mode 100755 buildtools/gen-vscode-conf.py
 create mode 100644 doc/guides/contributing/ide_integration.rst

-- 
2.43.5


^ permalink raw reply	[flat|nested] 21+ messages in thread

* [PATCH v1 1/1] buildtools: add VSCode configuration generator
  2024-09-02 12:17 ` [PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
@ 2024-09-02 12:17   ` Anatoly Burakov
  0 siblings, 0 replies; 21+ messages in thread
From: Anatoly Burakov @ 2024-09-02 12:17 UTC (permalink / raw)
  To: dev, Bruce Richardson; +Cc: john.mcnamara

A lot of developers use Visual Studio Code as their primary IDE. This
script will be called from within the Meson build process, and will
generate configuration files for VSCode that set up basic build tasks,
launch tasks, as well as C/C++ code analysis settings that take into
account the compile_commands.json file automatically generated by Meson.

Files generated by script:
 - .vscode/settings.json: stores variables needed by other files
 - .vscode/tasks.json: defines configure/build tasks
 - .vscode/launch.json: defines launch tasks
 - .vscode/c_cpp_properties.json: defines code analysis settings

Multiple build directories, including out-of-source-tree ones, are
supported, and the script will generate separate configuration items for
each build directory created by the user, tagging them for convenience.

In addition to all of the above, some default quality-of-life improvements
are also included, such as build-on-save (which needs an extension), as
well as disabling some default VSCode settings that are not needed for DPDK.

Signed-off-by: Anatoly Burakov <anatoly.burakov@intel.com>
---

Notes:
    RFCv3 -> v1:
    - Format code with Ruff
    - Use T. as typing alias
    - Fixed wrong comments in places
    - Added more typing information to keep PyLance happy
    - Refactor for resilience to various error conditions and proper cleanup
    - Added preprocessor to workaround issues with Python's JSON parser
    - Added a way to disable config generation from shell variable
    - Default to build-on-save if user has appropriate extension installed
    - Disable some common Javascript things that slow down task list
    - Configure code analysis from exec-env and arch-subdir variables
    - Added documentation
    
    RFCv2 -> RFCv3:
    - Following feedback from Bruce, reworked to be minimal script run from meson
    - Moved to buildtools
    - Support for multiple build directories is now the default
    - All targets are automatically added to all configuration files
    
    RFCv1 -> RFCv2:
    
    - No longer disable apps and drivers if nothing was specified via command line
      or TUI, and warn user about things being built by default
    - Generate app launch configuration by default for when no apps are selected
    - Added parameters:
      - --force to avoid overwriting existing config
      - --common-conf to specify global meson flags applicable to all configs
      - --gdbsudo/--no-gdbsudo to specify gdbsudo behavior
    - Autodetect gdbsudo/gdb from UID
    - Updated comments, error messages, fixed issues with user interaction
    - Improved handling of wildcards and driver dependencies
    - Fixed a few bugs in dependency detection due to incorrect parsing
    - [Stephen] flake8 is happy

 app/meson.build                             |  14 +-
 buildtools/gen-vscode-conf.py               | 570 ++++++++++++++++++++
 buildtools/meson.build                      |   5 +
 devtools/test-meson-builds.sh               |   3 +
 doc/guides/contributing/ide_integration.rst |  85 +++
 doc/guides/contributing/index.rst           |   1 +
 doc/guides/rel_notes/release_24_11.rst      |   5 +
 examples/meson.build                        |  15 +-
 meson.build                                 |  14 +
 9 files changed, 710 insertions(+), 2 deletions(-)
 create mode 100755 buildtools/gen-vscode-conf.py
 create mode 100644 doc/guides/contributing/ide_integration.rst

diff --git a/app/meson.build b/app/meson.build
index 5b2c80c7a1..6a3fbea65b 100644
--- a/app/meson.build
+++ b/app/meson.build
@@ -114,7 +114,19 @@ foreach app:apps
         link_libs = dpdk_static_libraries + dpdk_drivers
     endif
 
-    exec = executable('dpdk-' + name,
+    # add to Visual Studio Code launch configuration
+    exe_name = 'dpdk-' + name
+    launch_path = join_paths(meson.current_build_dir(), exe_name)
+    # we also need exec env/arch, but they were not available at the time
+    # the buildtools command was generated
+    cfg_args = ['--launch', launch_path, '--exec-env', exec_env, '--arch', arch_subdir]
+    # we don't want to block the build if this command fails
+    result = run_command(vscode_conf_gen_cmd + cfg_args, check: false)
+    if result.returncode() != 0
+        warning('Failed to generate Visual Studio Code launch configuration for "' + name + '"')
+        message(result.stderr())
+    endif
+
+    exec = executable(exe_name,
             sources,
             c_args: cflags,
             link_args: ldflags,
diff --git a/buildtools/gen-vscode-conf.py b/buildtools/gen-vscode-conf.py
new file mode 100755
index 0000000000..3e88c9e4ff
--- /dev/null
+++ b/buildtools/gen-vscode-conf.py
@@ -0,0 +1,570 @@
+#!/usr/bin/env python3
+# SPDX-License-Identifier: BSD-3-Clause
+# Copyright(c) 2024 Intel Corporation
+#
+
+"""Visual Studio Code configuration generator script."""
+
+# This script is meant to be run by the meson build system to generate build and launch commands
+# for a specific build directory for the Visual Studio Code IDE.
+#
+# Even though this script will generate settings/tasks/launch/code analysis configuration for
+# VSCode, we can't actually just regenerate the files, because we want to support multiple build
+# directories, as well as not destroy any configuration user has created between runs of this
+# script. Therefore, we need some config file handling infrastructure. Luckily, VSCode configs are
+# all JSON, so we can just use json module to handle them. Of course, we will lose any user
+# comments in the files, but that's a small price to pay for this sort of automation.
+#
+# Since this script will be run by meson, we can forego any parsing or anything to do with the
+# build system, and just rely on the fact that we get all of our configuration from command-line.
+
+import argparse
+import json
+import os
+import shutil
+from collections import OrderedDict
+from sys import stderr, exit as _exit
+import typing as T
+
+# if this variable is defined, we will not generate any configuration files
+ENV_DISABLE = "DPDK_DISABLE_VSCODE_CONFIG"
+
+
+def _preprocess_json(data: str) -> str:
+    """Preprocess JSON to remove trailing commas, whitespace, and comments."""
+    preprocessed_data: T.List[str] = []
+    # simple state machine
+    in_comment = False
+    in_string = False
+    escape = False
+    comma = False
+    maybe_comment = False
+    for c in data:
+        _fwdslash = c == "/"
+        _newline = c == "\n"
+        _comma = c == ","
+        _obj_end = c in ["}", "]"]
+        _space = c.isspace()
+        _backslash = c == "\\"
+        _quote = c == '"'
+
+        # are we looking to start a comment?
+        if maybe_comment:
+            maybe_comment = False
+            if _fwdslash:
+                in_comment = True
+                continue
+            # slash is illegal JSON but this is not our job
+            preprocessed_data.append("/")
+            # c will receive further processing
+        # are we inside a comment?
+        if in_comment:
+            if _newline:
+                in_comment = False
+            # eat everything
+            continue
+        # do we have a trailing comma?
+        if comma:
+            # there may be whitespace after the comma
+            if _space:
+                continue
+            comma = False
+            if _obj_end:
+                # throw away trailing comma
+                preprocessed_data.append(c)
+                continue
+            # comma was needed
+            preprocessed_data.append(",")
+            # c needs further processing
+        # are we inside a string?
+        if in_string:
+            # are we in an escape?
+            if escape:
+                escape = False
+            # are we trying to escape?
+            elif _backslash:
+                escape = True
+            # are we ending the string?
+            elif _quote:
+                in_string = False
+            # we're inside a string
+            preprocessed_data.append(c)
+            continue
+        # are we looking to start a string?
+        if _quote:
+            in_string = True
+            preprocessed_data.append(c)
+            continue
+        # are we looking to start a comment?
+        elif _fwdslash:
+            maybe_comment = True
+            continue
+        # are we looking at a comma?
+        elif _comma:
+            comma = True
+            continue
+        # are we looking at whitespace?
+        elif _space:
+            continue
+        # this is a regular character, just add it
+        preprocessed_data.append(c)
+
+    return "".join(preprocessed_data)
+
+
+def _load_json(file: str) -> T.Dict[str, T.Any]:
+    """Load JSON file."""
+    with open(file, "r", encoding="utf-8") as f:
+        # Python's JSON parser doesn't like trailing commas, but VSCode's JSON parser does not
+        # consider them to be syntax errors, so they may be present in user's configuration files.
+        # remove them from the file before processing.
+        data = _preprocess_json(f.read())
+        try:
+            return json.loads(data)
+        except json.JSONDecodeError as e:
+            print(f"Error parsing {os.path.basename(file)}: {e}", file=stderr)
+            raise
+
+
+def _save_json(file: str, obj: T.Dict[str, T.Any]) -> None:
+    """Write JSON file."""
+    with open(file, "w", encoding="utf-8") as f:
+        json.dump(obj, f, indent="\t")
+
+
+class ConfigCtx:
+    """Data associated with config processing."""
+
+    def __init__(
+        self,
+        build_dir: str,
+        source_dir: str,
+        launch: T.List[str],
+        exec_env: str,
+        arch: str,
+    ):
+        self.build_dir = build_dir
+        self.source_dir = source_dir
+        self.config_dir = os.path.join(source_dir, ".vscode")
+        self.exec_env = exec_env
+        self.arch = arch
+        # we don't have any mechanism to label things, so we're just going to
+        # use build dir basename as the label, and hope user doesn't create
+        # different build directories with the same name
+        self.label = os.path.basename(build_dir)
+        self.builddir_var = f"{self.label}-builddir"
+        # default to gdb
+        self.dbg_path_var = f"{self.label}-dbg-path"
+        self.launch_dbg_path = shutil.which("gdb")
+        self.dbg_mode_var = f"{self.label}-dbg-mode"
+        self.dbg_mode = "gdb"
+        self.launch = launch
+        self.compile_task = f"[{self.label}] Compile"
+
+        # filenames for configs
+        self.settings_fname = "settings.json"
+        self.tasks_fname = "tasks.json"
+        self.launch_fname = "launch.json"
+        self.analysis_fname = "c_cpp_properties.json"
+
+        # temporary filenames to avoid touching user's configuration until last moment
+        self._tmp_fnames = {
+            self.settings_fname: f".{self.settings_fname}.{self.label}.tmp",
+            self.tasks_fname: f".{self.tasks_fname}.{self.label}.tmp",
+            self.launch_fname: f".{self.launch_fname}.{self.label}.tmp",
+            self.analysis_fname: f".{self.analysis_fname}.{self.label}.tmp",
+        }
+        # when there is no user configuration, use these templates
+        self._templates: T.Dict[str, T.Dict[str, T.Any]] = {
+            self.settings_fname: {},
+            self.tasks_fname: {"version": "2.0.0", "tasks": [], "inputs": []},
+            self.launch_fname: {"version": "0.2.0", "configurations": []},
+            self.analysis_fname: {"version": 4, "configurations": []},
+        }
+
+    def _get_fname(self, fname: str) -> str:
+        """Get filename for configuration."""
+        if fname not in self._tmp_fnames:
+            raise ValueError(f"Unknown configuration file {fname}")
+        return os.path.join(self.config_dir, self._tmp_fnames[fname])
+
+    def load(self, fname: str) -> T.Dict[str, T.Any]:
+        """Load or generate JSON data from template."""
+        path = self._get_fname(fname)
+        try:
+            return _load_json(path)
+        except FileNotFoundError:
+            return self._templates[fname]
+
+    def save(self, fname: str, obj: T.Dict[str, T.Any]) -> None:
+        """Save JSON data to temporary file."""
+        path = self._get_fname(fname)
+        _save_json(path, obj)
+
+    def commit(self):
+        """Commit previously saved settings to configuration."""
+        for dst, tmp in self._tmp_fnames.items():
+            fp_tmp = os.path.join(self.config_dir, tmp)
+            fp_dst = os.path.join(self.config_dir, dst)
+            if os.path.exists(fp_tmp):
+                shutil.copyfile(fp_tmp, fp_dst)
+
+    def cleanup(self):
+        """Cleanup any temporary files."""
+        for tmp in self._tmp_fnames.values():
+            fp_tmp = os.path.join(self.config_dir, tmp)
+            if os.path.exists(fp_tmp):
+                os.unlink(fp_tmp)
+
+
+def _gen_sorter(order: T.List[str]) -> T.Any:
+    """Sort dictionary by order."""
+
+    # JSON doesn't have sort order, but we want to be user friendly and display certain properties
+    # above others as they're more important. This function will return a closure that can be used
+    # to re-sort a specific object using OrderedDict and an ordered list of properties.
+    def _sorter(obj: T.Dict[str, T.Any]) -> OrderedDict[str, T.Any]:
+        d: OrderedDict[str, T.Any] = OrderedDict()
+        # step 1: go through all properties in order and re-add them
+        for prop in order:
+            if prop in obj:
+                d[prop] = obj[prop]
+        # step 2: get all properties of the object, remove those that we have already added, and
+        #         sort them alphabetically
+        for prop in sorted(set(obj.keys()) - set(order)):
+            d[prop] = obj[prop]
+        # we're done: now all objects will have vaguely constant sort order
+        return d
+
+    return _sorter
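+
+# Illustrative usage (not called verbatim anywhere in this script):
+#     _sort = _gen_sorter(["name", "type"])
+#     _sort({"zzz": 1, "type": "shell", "name": "build"})
+# returns OrderedDict([("name", "build"), ("type", "shell"), ("zzz", 1)]),
+# i.e. listed keys first, in the given order, then the rest alphabetically.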
+
+
+def _add_obj_to_list(
+    obj_list: T.List[T.Dict[str, T.Any]], key: str, obj: T.Dict[str, T.Any]
+) -> bool:
+    """Add object to list if it doesn't already exist."""
+    for o in obj_list:
+        if o[key] == obj[key]:
+            return False
+    obj_list.append(obj)
+    return True
+
+
+def _add_var_to_obj(obj: T.Dict[str, T.Any], var: str, value: T.Any) -> bool:
+    """Add variable to object if it doesn't exist."""
+    if var in obj:
+        return False
+    obj[var] = value
+    return True
+
+
+def _update_settings(ctx: ConfigCtx) -> T.Optional[T.Dict[str, T.Any]]:
+    """Update settings.json."""
+    settings_obj = ctx.load(ctx.settings_fname)
+    dirty = False
+
+    ttos_tasks = "triggerTaskOnSave.tasks"
+    ttos_on = "triggerTaskOnSave.on"
+    default_vars: T.Dict[str, T.Any] = {
+        # store build dir
+        ctx.builddir_var: ctx.build_dir,
+        # store debugger path and MI mode
+        ctx.dbg_path_var: ctx.launch_dbg_path,
+        ctx.dbg_mode_var: ctx.dbg_mode,
+        # trigger build on save
+        ttos_tasks: {},
+        ttos_on: True,
+        # improve responsiveness by disabling auto-detection of tasks
+        "npm.autoDetect": "off",
+        "gulp.autoDetect": "off",
+        "jake.autoDetect": "off",
+        "grunt.autoDetect": "off",
+        "typescript.tsc.autoDetect": "off",
+        "task.autoDetect": "off",
+    }
+
+    for var, value in default_vars.items():
+        dirty |= _add_var_to_obj(settings_obj, var, value)
+
+    # add path ignore setting if it's inside the source dir
+    cpath = os.path.commonpath([ctx.source_dir, ctx.build_dir])
+    if cpath == ctx.source_dir:
+        # find path within source tree
+        relpath = os.path.relpath(ctx.build_dir, ctx.source_dir) + os.sep
+
+        # note if we need to change anything
+        if "files.exclude" not in settings_obj:
+            dirty = True
+        elif relpath not in settings_obj["files.exclude"]:
+            dirty = True
+
+        exclude = settings_obj.setdefault("files.exclude", {})
+        exclude.setdefault(relpath, True)
+        settings_obj["files.exclude"] = exclude
+
+    # if the user has installed the "Trigger Task On Save" extension (extension
+    # id: Gruntfuggly.triggertaskonsave), this will enable build-on-save by default
+    if ctx.compile_task not in settings_obj[ttos_tasks]:
+        dirty = True
+    # trigger build on save for all files
+    settings_obj[ttos_tasks][ctx.compile_task] = ["**/*"]
+
+    return settings_obj if dirty else None
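+
+# Illustrative result (assumed): for a build directory named "build", the above
+# adds "build-builddir", "build-dbg-path" and "build-dbg-mode" keys to
+# settings.json, plus a files.exclude entry when the build dir is in-tree.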
+
+
+def _update_tasks(ctx: ConfigCtx) -> T.Optional[T.Dict[str, T.Any]]:
+    """Update tasks.json."""
+    outer_tasks_obj = ctx.load(ctx.tasks_fname)
+    inner_tasks_obj = outer_tasks_obj.setdefault("tasks", [])
+    inputs_obj = outer_tasks_obj.setdefault("inputs", [])
+    dirty = False
+
+    # generate task object sorter
+    _sort_task = _gen_sorter(
+        [
+            "label",
+            "detail",
+            "type",
+            "command",
+            "args",
+            "options",
+            "problemMatcher",
+            "group",
+        ]
+    )
+
+    # generate our would-be configuration
+
+    # first, we need a build task
+    build_task: T.Dict[str, T.Any] = {
+        "label": ctx.compile_task,
+        "detail": f"Run `meson compile` command for {ctx.label}",
+        "type": "shell",
+        "command": "meson compile",
+        "options": {"cwd": f"${{config:{ctx.builddir_var}}}"},
+        "problemMatcher": {
+            "base": "$gcc",
+            "fileLocation": ["relative", f"${{config:{ctx.builddir_var}}}"],
+        },
+        "group": "build",
+    }
+    # we also need a meson configure task with input
+    configure_task: T.Dict[str, T.Any] = {
+        "label": f"[{ctx.label}] Configure",
+        "detail": f"Run `meson configure` command for {ctx.label}",
+        "type": "shell",
+        "command": "meson configure ${input:mesonConfigureArg}",
+        "options": {"cwd": f"${{config:{ctx.builddir_var}}}"},
+        "problemMatcher": [],
+        "group": "build",
+    }
+    # finally, add input object
+    input_arg: T.Dict[str, T.Any] = {
+        "id": "mesonConfigureArg",
+        "type": "promptString",
+        "description": "Enter meson configure arguments",
+        "default": "",
+    }
+
+    # sort our tasks
+    build_task = _sort_task(build_task)
+    configure_task = _sort_task(configure_task)
+
+    # add only if task doesn't already exist
+    dirty |= _add_obj_to_list(inner_tasks_obj, "label", build_task)
+    dirty |= _add_obj_to_list(inner_tasks_obj, "label", configure_task)
+    dirty |= _add_obj_to_list(inputs_obj, "id", input_arg)
+
+    # replace nodes
+    outer_tasks_obj["tasks"] = inner_tasks_obj
+    outer_tasks_obj["inputs"] = inputs_obj
+
+    # we're ready
+    return outer_tasks_obj if dirty else None
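+
+# Illustrative result (assumed): for a build directory named "build", the tasks
+# above appear in tasks.json as "[build] Compile" and "[build] Configure", both
+# running with cwd set to "${config:build-builddir}".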
+
+
+def _update_launch(ctx: ConfigCtx) -> T.Optional[T.Dict[str, T.Any]]:
+    """Update launch.json."""
+    launch_obj = ctx.load(ctx.launch_fname)
+    configurations_obj = launch_obj.setdefault("configurations", [])
+    dirty = False
+
+    # generate launch task sorter
+    _sort_launch = _gen_sorter(
+        [
+            "name",
+            "type",
+            "request",
+            "program",
+            "cwd",
+            "preLaunchTask",
+            "environment",
+            "args",
+            "MIMode",
+            "miDebuggerPath",
+            "setupCommands",
+        ]
+    )
+
+    for target in ctx.launch:
+        # target will be a full path, we need to get relative to build path
+        exe_path = os.path.relpath(target, ctx.build_dir)
+        name = f"[{ctx.label}] Launch {exe_path}"
+        # generate config from template
+        launch_config: T.Dict[str, T.Any] = {
+            "name": name,
+            "type": "cppdbg",
+            "request": "launch",
+            "program": f"${{config:{ctx.builddir_var}}}/{exe_path}",
+            "args": [],
+            "cwd": "${workspaceFolder}",
+            "environment": [],
+            "MIMode": f"${{config:{ctx.dbg_mode_var}",
+            "miDebuggerPath": f"${{config:{ctx.dbg_path_var}",
+            "preLaunchTask": ctx.compile_task,
+            "setupCommands": [
+                {
+                    "description": "Enable pretty-printing for gdb",
+                    "text": "-gdb-set print pretty on",
+                    "ignoreFailures": True,
+                }
+            ],
+        }
+        # sort keys
+        launch_config = _sort_launch(launch_config)
+        # add to configurations
+        dirty |= _add_obj_to_list(configurations_obj, "name", launch_config)
+
+    # replace the configuration object
+    launch_obj["configurations"] = configurations_obj
+
+    # we're ready
+    return launch_obj if dirty else None
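+
+# Illustrative result (assumed): for build directory "build" and launch target
+# <build>/app/dpdk-testpmd, a cppdbg configuration named
+# "[build] Launch app/dpdk-testpmd" is added, reading the debugger path and
+# MI mode from the settings.json variables above.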
+
+
+def _update_analysis(ctx: ConfigCtx) -> T.Optional[T.Dict[str, T.Any]]:
+    """Update c_cpp_properties.json."""
+    analysis_obj = ctx.load(ctx.analysis_fname)
+    configurations_obj = analysis_obj.setdefault("configurations", [])
+    dirty = False
+
+    # generate analysis config sorter
+    _sort_analysis = _gen_sorter(
+        [
+            "name",
+            "includePath",
+            "compilerPath",
+            "cStandard",
+            "cppStandard",
+            "intelliSenseMode",
+            "compileCommands",
+        ]
+    )
+
+    config_obj: T.Dict[str, T.Any] = {
+        "name": ctx.exec_env.capitalize(),
+        "includePath": [
+            f"${{config:{ctx.builddir_var}}}/",
+            # EAL include paths derived from meson's arch and exec-env
+            f"${{workspaceFolder}}/lib/eal/{ctx.arch}/include",
+            f"${{workspaceFolder}}/lib/eal/{ctx.exec_env}/include",
+            "${workspaceFolder}/**",
+        ],
+        "compilerPath": "/usr/bin/gcc",
+        "cStandard": "c99",
+        "cppStandard": "c++17",
+        "intelliSenseMode": "${default}",
+        "compileCommands": f"${{config:{ctx.builddir_var}}}/compile_commands.json",
+    }
+    # sort configuration
+    config_obj = _sort_analysis(config_obj)
+
+    # add it to config obj
+    dirty |= _add_obj_to_list(configurations_obj, "name", config_obj)
+
+    # we're done
+    analysis_obj["configurations"] = configurations_obj
+
+    return analysis_obj if dirty else None
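+
+# Illustrative result (assumed): for exec_env "linux" and arch "x86", a
+# configuration named "Linux" is added, with EAL include paths for linux/x86
+# and IntelliSense driven by the build directory's compile_commands.json.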
+
+
+def _gen_config(ctx: ConfigCtx) -> None:
+    """Generate all config files."""
+
+    # generate all JSON objects and save them if we changed anything about them
+    settings_obj = _update_settings(ctx)
+    tasks_obj = _update_tasks(ctx)
+    launch_obj = _update_launch(ctx)
+    analysis_obj = _update_analysis(ctx)
+
+    if settings_obj is not None:
+        ctx.save(ctx.settings_fname, settings_obj)
+    if tasks_obj is not None:
+        ctx.save(ctx.tasks_fname, tasks_obj)
+    if launch_obj is not None:
+        ctx.save(ctx.launch_fname, launch_obj)
+    if analysis_obj is not None:
+        ctx.save(ctx.analysis_fname, analysis_obj)
+
+    # the above saves only saved to temporary files, now overwrite real files
+    ctx.commit()
+
+
+def _main() -> int:
+    if os.environ.get(ENV_DISABLE, "") == "1":
+        print(
+            "Visual Studio Code configuration generation "
+            f"disabled by environment variable {ENV_DISABLE}=1"
+        )
+        return 0
+    parser = argparse.ArgumentParser(description="Generate VSCode configuration")
+    # where we are being called from
+    parser.add_argument("--build-dir", required=True, help="Build directory")
+    # where the sources are
+    parser.add_argument("--source-dir", required=True, help="Source directory")
+    # exec-env - Windows, Linux etc.
+    parser.add_argument("--exec-env", required=True, help="Execution environment")
+    # arch - x86, arm etc.
+    parser.add_argument("--arch", required=True, help="Architecture")
+    # launch configuration item, can be multiple
+    parser.add_argument("--launch", action="append", help="Launch path for executable")
+    parser.epilog = "This script is not meant to be run manually."
+    # parse arguments
+    args = parser.parse_args()
+
+    # canonicalize all paths
+    build_dir = os.path.realpath(args.build_dir)
+    source_dir = os.path.realpath(args.source_dir)
+    if args.launch:
+        launch = [os.path.realpath(lp) for lp in args.launch]
+    else:
+        launch = []
+    exec_env = args.exec_env
+    arch = args.arch
+
+    ctx = ConfigCtx(build_dir, source_dir, launch, exec_env, arch)
+
+    try:
+        # ensure config dir exists
+        os.makedirs(ctx.config_dir, exist_ok=True)
+
+        _gen_config(ctx)
+
+        ret = 0
+    except json.JSONDecodeError as e:
+        # if we fail to load JSON, output error
+        print(f"Error: {e}", file=stderr)
+        ret = 1
+    except OSError as e:
+        # if we fail to write to disk, output error
+        print(f"Error: {e}", file=stderr)
+        ret = 1
+
+    # remove any temporary files
+    ctx.cleanup()
+    return ret
+
+
+if __name__ == "__main__":
+    _exit(_main())
diff --git a/buildtools/meson.build b/buildtools/meson.build
index 3adf34e1a8..f529189dbc 100644
--- a/buildtools/meson.build
+++ b/buildtools/meson.build
@@ -24,6 +24,11 @@ get_numa_count_cmd = py3 + files('get-numa-count.py')
 get_test_suites_cmd = py3 + files('get-test-suites.py')
 has_hugepages_cmd = py3 + files('has-hugepages.py')
 cmdline_gen_cmd = py3 + files('dpdk-cmdline-gen.py')
+# Visual Studio Code conf generator always requires build/source roots
+vscode_conf_gen_cmd = py3 + files('gen-vscode-conf.py') + [
+        '--build-dir', dpdk_build_root,
+        '--source-dir', dpdk_source_root
+    ]
 
 # install any build tools that end-users might want also
 install_data([
diff --git a/devtools/test-meson-builds.sh b/devtools/test-meson-builds.sh
index d71bb1ded0..4b80d4dea4 100755
--- a/devtools/test-meson-builds.sh
+++ b/devtools/test-meson-builds.sh
@@ -53,6 +53,8 @@ default_cppflags=$CPPFLAGS
 default_cflags=$CFLAGS
 default_ldflags=$LDFLAGS
 default_meson_options=$DPDK_MESON_OPTIONS
+# disable VSCode config generation
+export DPDK_DISABLE_VSCODE_CONFIG=1
 
 opt_verbose=
 opt_vverbose=
@@ -88,6 +90,7 @@ load_env () # <target compiler>
 	export CFLAGS=$default_cflags
 	export LDFLAGS=$default_ldflags
 	export DPDK_MESON_OPTIONS=$default_meson_options
+
 	# set target hint for use in the loaded config file
 	if [ -n "$target_override" ] ; then
 		DPDK_TARGET=$target_override
diff --git a/doc/guides/contributing/ide_integration.rst b/doc/guides/contributing/ide_integration.rst
new file mode 100644
index 0000000000..9ad0c78004
--- /dev/null
+++ b/doc/guides/contributing/ide_integration.rst
@@ -0,0 +1,85 @@
+..  SPDX-License-Identifier: BSD-3-Clause
+    Copyright 2024 The DPDK contributors
+
+Integrating DPDK with IDEs
+==========================
+
+DPDK neither mandates nor recommends a specific IDE for development. However,
+some developers may prefer to use an IDE for their development work. This guide
+provides information on how to integrate DPDK with some popular IDEs.
+
+Visual Studio Code
+------------------
+
+`Visual Studio Code <https://code.visualstudio.com/>`_ is a popular open-source
+code editor with IDE features such as code completion, debugging, Git
+integration, and more. It is available on most platforms.
+
+Configuration
+~~~~~~~~~~~~~
+
+When configuring a new Meson build directory for DPDK, configuration for Visual
+Studio Code will be generated automatically. It will include a compilation task,
+as well as debugging targets for any applications or examples enabled in Meson
+at the configuration step. The generated configuration will be available under
+the ``.vscode`` directory in the DPDK source tree, and will be updated each
+time the build directory is reconfigured with Meson.
+
+Further information on configuring, building and installing DPDK is described in
+:doc:`Linux Getting Started Guide <../linux_gsg/build_dpdk>`.
+
+.. note::
+
+    The configuration is generated based on the applications and examples
+    enabled at the time of configuration. When new applications or examples
+    are added using the ``meson configure`` command (or by running the
+    ``Configure`` task), new configuration will be added, but existing
+    configuration will never be amended or deleted, even if the application
+    was removed from the build.
+
+Each generated file will refer to a few common variables defined under
+``settings.json``. This allows easy reconfiguration of all generated launch
+targets while still allowing the user to customize the configuration. The
+variables contained within ``settings.json`` are as follows:
+
+- ``<build-dir-name>-builddir``: Path to the build directory (can be in-tree or
+  out-of-tree)
+- ``<build-dir-name>-dbg-path``: Variable for ``miDebuggerPath`` in launch tasks
+- ``<build-dir-name>-dbg-mode``: Variable for ``MIMode`` in launch tasks
+
+It is not recommended to change these variables unless there is a specific need.
+
+.. note::
+
+    Due to the way the configuration generation is implemented, each time the
+    configuration is updated, any user comments will be lost.
+
+Running as unprivileged user
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+If not running as a privileged user, the generated configuration will by
+default not be able to run DPDK applications that require ``root`` privileges.
+To address this, either the system has to be configured to allow running DPDK
+as a non-privileged user, or the launch configuration has to be amended to run
+the debugger (usually GDB) as root.
+
+Further information on configuring the system to allow running DPDK as a
+non-privileged user can be found in the :ref:`common Linux guide
+<Running_Without_Root_Privileges>`.
+
+If the user instead prefers to run applications as ``root`` while still working
+as a regular user, the following steps must be taken:
+
+- Allow running GDB with password-less ``sudo`` (please consult the relevant
+  system documentation on how to achieve this)
+- Set up a local alias for running GDB with ``sudo`` (e.g. ``sudo gdb $@``) -
+  one possible wrapper is sketched below
+- Amend the ``settings.json`` file to set the ``<build-dir>-dbg-path`` variable
+  to this new alias
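+
+For illustration only, one possible wrapper (assuming password-less ``sudo``
+for GDB has been set up, and that the wrapper is saved as an executable named
+``gdbsudo`` somewhere in ``PATH``) could look like this:
+
+.. code-block:: python
+
+   #!/usr/bin/env python3
+   """Illustrative sketch: run GDB under sudo, forwarding all arguments."""
+   import os
+   import sys
+
+   # replace the current process with "sudo gdb <arguments>"
+   os.execvp("sudo", ["sudo", "gdb"] + sys.argv[1:])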
+
+Once this is done, any existing or new launch targets will use the new debugger
+alias to run DPDK applications.
+
+.. note::
+
+    The above steps are not recommended for production systems, as they may
+    introduce security vulnerabilities.
diff --git a/doc/guides/contributing/index.rst b/doc/guides/contributing/index.rst
index dcb9b1fbf0..a1f27d7f64 100644
--- a/doc/guides/contributing/index.rst
+++ b/doc/guides/contributing/index.rst
@@ -19,3 +19,4 @@ Contributor's Guidelines
     vulnerability
     stable
     cheatsheet
+    ide_integration
diff --git a/doc/guides/rel_notes/release_24_11.rst b/doc/guides/rel_notes/release_24_11.rst
index 0ff70d9057..bd21e600d0 100644
--- a/doc/guides/rel_notes/release_24_11.rst
+++ b/doc/guides/rel_notes/release_24_11.rst
@@ -55,6 +55,11 @@ New Features
      Also, make sure to start the actual text at the margin.
      =======================================================
 
+* **Generate Visual Studio Code configuration on build.**
+
+  The Meson build system now generates Visual Studio Code configuration
+  files for configuration, compilation, and debugging tasks.
+
 
 Removed Items
 -------------
diff --git a/examples/meson.build b/examples/meson.build
index 8e8968a1fa..45327b6f6b 100644
--- a/examples/meson.build
+++ b/examples/meson.build
@@ -124,7 +124,20 @@ foreach example: examples
     if allow_experimental_apis
         cflags += '-DALLOW_EXPERIMENTAL_API'
     endif
-    executable('dpdk-' + name, sources,
+
+    # add to Visual Studio Code launch configuration
+    exe_name = 'dpdk-' + name
+    launch_path = join_paths(meson.current_build_dir(), exe_name)
+    # we also need exec env/arch, but they were not available at the time
+    # the buildtools command was generated
+    cfg_args = ['--launch', launch_path, '--exec-env', exec_env, '--arch', arch_subdir]
+    # we don't want to block the build if this command fails
+    result = run_command(vscode_conf_gen_cmd + cfg_args, check: false)
+    if result.returncode() != 0
+        warning('Failed to generate Visual Studio Code launch configuration for "' + name + '"')
+        message(result.stderr())
+    endif
+
+    executable(exe_name, sources,
             include_directories: includes,
             link_whole: link_whole_libs,
             link_args: ldflags,
diff --git a/meson.build b/meson.build
index 8b248d4505..17ba1192c2 100644
--- a/meson.build
+++ b/meson.build
@@ -117,6 +117,20 @@ if meson.is_subproject()
     subdir('buildtools/subproject')
 endif
 
+# if no apps or examples were enabled, no Visual Studio Code config was
+# generated, but we still need build, code analysis etc. configuration to be
+# present, so generate it just in case (it will have no effect if the
+# configuration was already generated by apps/examples). also, when running
+# this command, we don't want to block the build if it fails.
+
+# we need exec env/arch, but they were not available at the time the
+# buildtools command was generated
+cfg_args = ['--exec-env', exec_env, '--arch', arch_subdir]
+result = run_command(vscode_conf_gen_cmd + cfg_args, check: false)
+if result.returncode() != 0
+    warning('Failed to generate Visual Studio Code configuration')
+    message(result.stderr())
+endif
+
 # Final output, list all the parts to be built.
 # This does not affect any part of the build, for information only.
 output_message = '\n=================\nApplications Enabled\n=================\n'
-- 
2.43.5


^ permalink raw reply	[flat|nested] 21+ messages in thread

end of thread, other threads:[~2024-09-02 12:17 UTC | newest]

Thread overview: 21+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2024-07-26 12:42 [RFC PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
2024-07-26 12:42 ` [RFC PATCH v1 1/1] devtools: add vscode configuration generator Anatoly Burakov
2024-07-26 15:36   ` Stephen Hemminger
2024-07-26 16:05     ` Burakov, Anatoly
2024-07-29 13:05 ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Anatoly Burakov
2024-07-29 13:05   ` [RFC PATCH v2 1/1] devtools: add vscode configuration generator Anatoly Burakov
2024-07-29 13:14     ` Bruce Richardson
2024-07-29 13:17       ` Burakov, Anatoly
2024-07-29 14:30     ` Bruce Richardson
2024-07-29 16:16       ` Burakov, Anatoly
2024-07-29 16:41         ` Bruce Richardson
2024-07-30  9:21           ` Burakov, Anatoly
2024-07-30 10:31             ` Bruce Richardson
2024-07-30 10:50               ` Burakov, Anatoly
2024-07-30 15:01   ` [RFC PATCH v2 0/1] Add Visual Studio Code configuration script Bruce Richardson
2024-07-30 15:14     ` Burakov, Anatoly
2024-07-30 15:19       ` Bruce Richardson
2024-07-31 13:33 ` [RFC PATCH v3 " Anatoly Burakov
2024-07-31 13:33   ` [RFC PATCH v3 1/1] buildtools: add vscode configuration generator Anatoly Burakov
2024-09-02 12:17 ` [PATCH v1 0/1] Add Visual Studio Code configuration script Anatoly Burakov
2024-09-02 12:17   ` [PATCH v1 1/1] buildtools: add VSCode configuration generator Anatoly Burakov
