From mboxrd@z Thu Jan  1 00:00:00 1970
From: Srikanth Yalavarthi <syalavarthi@marvell.com>
To: Srikanth Yalavarthi <syalavarthi@marvell.com>
Cc: dev@dpdk.org
Subject: [PATCH v7 06/11] app/mldev: add test case to interleave inferences
Date: Thu, 16 Mar 2023 14:14:29 -0700
Message-ID: <20230316211434.13409-7-syalavarthi@marvell.com>
X-Mailer: git-send-email 2.17.1
In-Reply-To: <20230316211434.13409-1-syalavarthi@marvell.com>
References: <20221129070746.20396-1-syalavarthi@marvell.com>
 <20230316211434.13409-1-syalavarthi@marvell.com>
MIME-Version: 1.0
Content-Type: text/plain

Added test case to interleave inference requests from multiple models.
The interleave test loads and starts all models, then launches inference
requests for the models using the available queue pairs.

Operations sequence when testing with N models and R repetitions:

    (load + start) x N -> (enqueue + dequeue) x N x R ... -> (stop + unload) x N

The test can be executed by selecting the "inference_interleave" test.
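As a rough sketch of that sequence (not part of the patch itself; the helper
names below are hypothetical placeholders, not the ml_model_*/ml_inference_*
helpers the test actually uses), interleaving N models over R repetitions
looks like this. In the real test the enqueue/dequeue stage runs in parallel
worker threads over the configured queue pairs; the sketch serialises it only
to make the ordering visible.

#include <stdio.h>

/* Hypothetical stand-ins that only print each step, so the ordering shows. */
static int  load_and_start(int fid)  { printf("load + start model %d\n", fid); return 0; }
static void run_inference(int fid)   { printf("enqueue + dequeue model %d\n", fid); }
static void stop_and_unload(int fid) { printf("stop + unload model %d\n", fid); }

int
main(void)
{
	const int nb_models = 2;   /* N */
	const int repetitions = 3; /* R */
	int fid, rep;

	for (fid = 0; fid < nb_models; fid++)            /* (load + start) x N */
		if (load_and_start(fid) != 0)
			return 1;

	for (rep = 0; rep < repetitions; rep++)          /* (enqueue + dequeue) x N x R */
		for (fid = 0; fid < nb_models; fid++)    /* requests for all models interleave */
			run_inference(fid);

	for (fid = 0; fid < nb_models; fid++)            /* (stop + unload) x N */
		stop_and_unload(fid);

	return 0;
}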
Signed-off-by: Srikanth Yalavarthi <syalavarthi@marvell.com>
Acked-by: Anup Prabhu <aprabhu@marvell.com>
---
 app/test-mldev/meson.build                    |   1 +
 app/test-mldev/ml_options.c                   |   3 +-
 app/test-mldev/test_inference_interleave.c    | 114 +++
 .../tools/img/mldev_inference_interleave.svg  | 669 ++++++++++++++++++
 doc/guides/tools/testmldev.rst                |  41 ++
 5 files changed, 827 insertions(+), 1 deletion(-)
 create mode 100644 app/test-mldev/test_inference_interleave.c
 create mode 100644 doc/guides/tools/img/mldev_inference_interleave.svg

diff --git a/app/test-mldev/meson.build b/app/test-mldev/meson.build
index 475d76d126..41d22fb22c 100644
--- a/app/test-mldev/meson.build
+++ b/app/test-mldev/meson.build
@@ -18,6 +18,7 @@ sources = files(
         'test_model_ops.c',
         'test_inference_common.c',
         'test_inference_ordered.c',
+        'test_inference_interleave.c',
 )
 
 deps += ['mldev']
diff --git a/app/test-mldev/ml_options.c b/app/test-mldev/ml_options.c
index 7b56bca90e..649cb9d8d1 100644
--- a/app/test-mldev/ml_options.c
+++ b/app/test-mldev/ml_options.c
@@ -156,7 +156,8 @@ ml_dump_test_options(const char *testname)
 		printf("\n");
 	}
 
-	if (strcmp(testname, "inference_ordered") == 0) {
+	if ((strcmp(testname, "inference_ordered") == 0) ||
+	    (strcmp(testname, "inference_interleave") == 0)) {
 		printf("\t\t--filelist : comma separated list of model, input and output\n"
 		       "\t\t--repetitions : number of inference repetitions\n");
 		printf("\n");
diff --git a/app/test-mldev/test_inference_interleave.c b/app/test-mldev/test_inference_interleave.c
new file mode 100644
index 0000000000..9cf4cfa197
--- /dev/null
+++ b/app/test-mldev/test_inference_interleave.c
@@ -0,0 +1,114 @@
+/* SPDX-License-Identifier: BSD-3-Clause
+ * Copyright (c) 2022 Marvell.
+ */
+
+#include <rte_common.h>
+#include <rte_launch.h>
+
+#include "ml_common.h"
+#include "test_inference_common.h"
+
+static int
+test_inference_interleave_driver(struct ml_test *test, struct ml_options *opt)
+{
+	struct test_inference *t;
+	uint16_t fid = 0;
+	int ret = 0;
+
+	t = ml_test_priv(test);
+
+	ret = ml_inference_mldev_setup(test, opt);
+	if (ret != 0)
+		return ret;
+
+	ret = ml_inference_mem_setup(test, opt);
+	if (ret != 0)
+		return ret;
+
+	/* load and start all models */
+	for (fid = 0; fid < opt->nb_filelist; fid++) {
+		ret = ml_model_load(test, opt, &t->model[fid], fid);
+		if (ret != 0)
+			goto error;
+
+		ret = ml_model_start(test, opt, &t->model[fid], fid);
+		if (ret != 0)
+			goto error;
+
+		ret = ml_inference_iomem_setup(test, opt, fid);
+		if (ret != 0)
+			goto error;
+	}
+
+	/* launch inference requests */
+	ret = ml_inference_launch_cores(test, opt, 0, opt->nb_filelist - 1);
+	if (ret != 0) {
+		ml_err("failed to launch cores");
+		goto error;
+	}
+
+	rte_eal_mp_wait_lcore();
+
+	/* stop and unload all models */
+	for (fid = 0; fid < opt->nb_filelist; fid++) {
+		ret = ml_inference_result(test, opt, fid);
+		if (ret != ML_TEST_SUCCESS)
+			goto error;
+
+		ml_inference_iomem_destroy(test, opt, fid);
+
+		ret = ml_model_stop(test, opt, &t->model[fid], fid);
+		if (ret != 0)
+			goto error;
+
+		ret = ml_model_unload(test, opt, &t->model[fid], fid);
+		if (ret != 0)
+			goto error;
+	}
+
+	ml_inference_mem_destroy(test, opt);
+
+	ret = ml_inference_mldev_destroy(test, opt);
+	if (ret != 0)
+		return ret;
+
+	t->cmn.result = ML_TEST_SUCCESS;
+
+	return 0;
+
+error:
+	ml_inference_mem_destroy(test, opt);
+	for (fid = 0; fid < opt->nb_filelist; fid++) {
+		ml_inference_iomem_destroy(test, opt, fid);
+		ml_model_stop(test, opt, &t->model[fid], fid);
+		ml_model_unload(test, opt, &t->model[fid], fid);
+	}
+
+	t->cmn.result = ML_TEST_FAILED;
+
+	return ret;
+}
+
+static int
+test_inference_interleave_result(struct ml_test *test, struct ml_options *opt)
+{
+	struct test_inference *t;
+
+	RTE_SET_USED(opt);
+
+	t = ml_test_priv(test);
+
+	return t->cmn.result;
+}
+
+static const struct ml_test_ops inference_interleave = {
+	.cap_check = test_inference_cap_check,
+	.opt_check = test_inference_opt_check,
+	.opt_dump = test_inference_opt_dump,
+	.test_setup = test_inference_setup,
+	.test_destroy = test_inference_destroy,
+	.test_driver = test_inference_interleave_driver,
+	.test_result = test_inference_interleave_result,
+};
+
+ML_TEST_REGISTER(inference_interleave);
diff --git a/doc/guides/tools/img/mldev_inference_interleave.svg b/doc/guides/tools/img/mldev_inference_interleave.svg
new file mode 100644
index 0000000000..3a741ea627
--- /dev/null
+++ b/doc/guides/tools/img/mldev_inference_interleave.svg
@@ -0,0 +1,669 @@
 [669 lines of SVG markup not reproduced in this copy. The figure is titled
  "test: inference_interleave" and shows enqueue workers (lcore 1, 3, 5) and
  dequeue workers (lcore 2, 4, 6) sharing queue pairs 0, 1 and 2 of the machine
  learning hardware engine, with inferences for Model 0-3 interleaved. It also notes:
  nb_worker_threads = 2 * MIN(nb_queue_pairs, (lcore_count - 1) / 2)
  inferences_per_queue_pair = nb_models * (repetitions / nb_queue_pairs)]
diff --git a/doc/guides/tools/testmldev.rst b/doc/guides/tools/testmldev.rst
index 164fbca64f..1a1ab7d2bf 100644
--- a/doc/guides/tools/testmldev.rst
+++ b/doc/guides/tools/testmldev.rst
@@ -58,6 +58,7 @@ The following are the command-line options supported by the test application.
       **ML Inference Tests** ::
 
          inference_ordered
+         inference_interleave
 
 * ``--dev_id <n>``
@@ -280,6 +281,46 @@ Example command to run inference_ordered test:
        --test=inference_ordered --filelist model.bin,input.bin,output.bin
 
 
+INFERENCE_INTERLEAVE Test
+~~~~~~~~~~~~~~~~~~~~~~~~~
+
+This is a stress test for validating end-to-end inference execution on an ML device. The test
+configures the ML device and queue pairs as per the queue-pair related options (queue_pairs and
+queue_size) specified by the user. Upon successful configuration of the device and queue pairs,
+all models specified through the filelist are loaded to the device. Inferences for multiple models
+are enqueued in parallel by a pool of worker threads. Inference execution by the device is
+interleaved between the models. The total number of inferences enqueued for a model is equal to
+the number of repetitions specified. An additional pool of worker threads dequeues the inferences
+from the device. Models are unloaded once the inferences for all loaded models have completed.
+
+
+.. _figure_mldev_inference_interleave:
+
+.. figure:: img/mldev_inference_interleave.*
+
+   Execution of inference_interleave with multiple models.
+
+
+Example
+^^^^^^^
+
+Example command to run the inference_interleave test:
+
+.. code-block:: console
+
+   sudo <build_dir>/app/dpdk-test-mldev -c 0xf -a <PCI_ID> -- \
+        --test=inference_interleave --filelist model.bin,input.bin,output.bin
+
+
+Example command to run the inference_interleave test with multiple models:
+
+.. code-block:: console
+
+   sudo <build_dir>/app/dpdk-test-mldev -c 0xf -a <PCI_ID> -- \
+        --test=inference_interleave --filelist model_A.bin,input_A.bin,output_A.bin \
+        --filelist model_B.bin,input_B.bin,output_B.bin
+
+
 Debug mode
 ----------
-- 
2.17.1
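
For reference, the formulas quoted from the figure can be worked through with the
4 lcores implied by -c 0xf in the example commands above. This is only an illustrative
sketch: the queue-pair, model and repetition counts below are assumptions chosen to make
the arithmetic concrete, not values taken from this patch.

#include <stdio.h>

#define MIN(a, b) ((a) < (b) ? (a) : (b))

int
main(void)
{
	/* -c 0xf in the examples gives 4 lcores; the remaining values are
	 * illustrative assumptions, not defaults defined by this patch. */
	unsigned int lcore_count = 4;
	unsigned int nb_queue_pairs = 1;
	unsigned int nb_models = 2;
	unsigned int repetitions = 100;

	/* Formulas as shown in the figure. */
	unsigned int nb_worker_threads = 2 * MIN(nb_queue_pairs, (lcore_count - 1) / 2);
	unsigned int inferences_per_queue_pair = nb_models * (repetitions / nb_queue_pairs);

	/* Prints 2 worker threads and 200 inferences per queue pair. */
	printf("nb_worker_threads = %u\n", nb_worker_threads);
	printf("inferences_per_queue_pair = %u\n", inferences_per_queue_pair);
	return 0;
}

With one queue pair, only two of the four lcores do inference work, one enqueuing and
one dequeuing, matching the figure's split into enqueue and dequeue worker pools.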