Hi Bob,

A few comments/questions on the "UNH Policies and Procedures Doc":

- Objective and Scope of Project
  - 2: "DPDK-enabled applications": what are these applications? Do they refer to testpmd/l2fwd-like test applications, or to other products that use DPDK, such as OVS or VPP? And if they are other products, will they be run for all vendors or only for the vendor that requested them?
  - 3: "Demonstrate any new feature performance of DPDK": is this done by updating the test scripts? If not, how will these new tests be run and how will the results be shared?
  - What is the plan to use the lab for continuous integration, adding more sanity checks beyond the performance checks over time? As far as I know, this was one of the initial plans for the lab.
- DPDK Branch(es) to test
  - 5.1.1: "master": a little background. In DPDK development there are multiple sub-trees; some patches target a specific tree, and these sub-trees are merged into the main tree before the release candidate, with regular integrations from the sub-trees in between. As a result of this process, a patch sent for the next-net sub-tree, for example, may not apply cleanly to the main repo, so it won't be tested. But that patch will be applied to the next-net tree, and a week later next-net will be merged into the main tree; the patch may be something that affects performance, but the regression won't be detected. Later, when a patch arrives that does apply to the main tree, it will reveal the performance issue, but the wrong patch will be suspected, and the problematic patch will already be merged. We need a solution for this. There are 5 sub-trees merged into the main tree, and more than half of the patches reach the main repo through them.
- Private DPDK-Member-only Dashboard Specification
  - 5.6.1.2: "The delta-values of the script output, per test performed.": in the member-only dashboard, why not show the base value too? Since the baseline will be updated regularly via the "--update-expected" argument, it would be good to see both the current baseline and the diff.
- I am in favor of defining a change management system: there are multiple vendors and multiple requests, and it would be good to trace, discuss, and record the outcome of all of them systematically.

Thanks,
ferruh

From: O'Driscoll, Tim
Sent: Thursday, July 19, 2018 3:55 PM
To: 'ci@dpdk.org'
Cc: 'Bob Noseworthy' ; Mcnamara, John ; 'Shepard Siegel' ; 'Thomas Monjalon' ; 'Erez Scop' ; 'Shreyansh Jain' ; Xu, Qian Q ; 'pmacarth@iol.unh.edu' ; 'Matt Spencer' ; 'George Zhao' ; 'Mishra, Shishir' ; 'Lixuming' ; Tkachuk, Georgii ; 'Trishan de Lanerolle' ; 'Sean Campbell' ; 'Ali Alnubani' ; 'May Chen' ; 'Lodha, Nishant' ; Zhang, Chun ; 'Malla, Malathi' ; 'khemendra kumar' ; 'graeme.gregory@linaro.org' ; Yigit, Ferruh ; Tu, Lijuan
Subject: Minutes of DPDK Lab Meeting, July 17th

UNH Policies and Procedures Doc:
Document is available at: https://docs.google.com/document/d/1rtSwpNKltVNyDKNWgeTV5gaYeDoAPlK-sfm8XE7o_5s/edit?usp=sharing
I reviewed it and it looks good to me. Please add any comments to the google doc, or send them directly to Bob (ren@iol.unh.edu).

Dashboard:
The dashboard is in much better shape now and most tests are passing. We do need to monitor the results and determine whether the tolerance is right: if it's too high then all tests will pass and we'll miss genuine issues; if it's too low then too many tests will fail when there isn't really a problem. Agreed that tuning the tolerance is the responsibility of each vendor. For Intel, we need to add the specific test config to the results page. We will provide the config details to Patrick, and he'll add them to the results pages. Other vendors may want to do something similar.

Hardware:
Shreyansh is almost ready to ship the NXP hardware. It should be at UNH in 2-3 weeks.
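
Tying together two points above (showing the stored baseline next to the delta on the member-only dashboard, and tuning the per-vendor pass/fail tolerance), a minimal sketch of what such a result line could look like. Everything here is illustrative: the function name, test name, throughput numbers, and 2% tolerance are assumptions, not the actual dashboard schema or lab values.

```python
# Hypothetical sketch: report the baseline alongside the delta, and apply a
# per-vendor tolerance to decide pass/fail. Illustrative names/values only.

def evaluate(test_name, baseline_mpps, measured_mpps, tolerance_pct):
    """Return a result line with baseline, measured value, delta and verdict."""
    delta = measured_mpps - baseline_mpps
    drop_pct = 100.0 * (baseline_mpps - measured_mpps) / baseline_mpps
    # Too high a tolerance hides real regressions; too low flags noise.
    verdict = "PASS" if drop_pct <= tolerance_pct else "FAIL"
    return (f"{test_name}: baseline={baseline_mpps:.2f} Mpps "
            f"measured={measured_mpps:.2f} Mpps "
            f"delta={delta:+.2f} -> {verdict}")

print(evaluate("nic_single_core_perf", 50.00, 49.50, 2.0))  # 1% drop -> PASS
print(evaluate("nic_single_core_perf", 50.00, 47.00, 2.0))  # 6% drop -> FAIL
```

Showing the baseline explicitly, rather than the delta alone, also makes it visible when "--update-expected" has moved the reference point between runs.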