Hi, all,

I would like to discuss the following issues:

* Discuss the policies and procedures document that was sent out as
  part of the minutes from the last meeting.

* How will the feature request process work? I would like to propose
  that we create a new project in bugs.dpdk.org and use that as the
  public-facing portal to track bugs and feature requests at a high
  level. UNH-IOL will decide in which order to work on feature requests
  and bug fixes, with input from the group. UNH-IOL may make
  improvements to the dashboard over time as we see fit, but the group
  must approve any change that makes more vendor-specific information
  publicly available through the dashboard before that change goes
  live.

* What will the process be for making the relative numbers for each
  vendor public? Here are the scenarios that I am envisioning:
  - A vendor may choose to make results for a device all public or all
    private.
  - A vendor may choose to make results public starting at date X.
  - A vendor may choose to make future results private while leaving
    current results public (to support experimental changes on a
    platform).
  - Is anything missing from this list?

* Any immediate thoughts on the proposed functionality below that
  require group discussion, or any other feature requests? You can
  also e-mail dpdklab@iol.unh.edu with any feedback.

Proposed functionality for the dashboard:

* (from DPDK 2018-06-19 meeting) Distinguish between an error with the
  patch and the patch simply not applying to master.

* (from DPDK 2018-06-19 meeting) Show setup configuration, topology,
  and hardware/CPU/motherboard environment information.
  - Requires approval from every Member with hardware in the lab.

* (external request) Show a description of the tests performed.
  - This should ideally link to an external site describing the test
    cases. Is the DTS test_plans documentation published somewhere
    that we can link to?
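To make the per-vendor visibility scenarios above concrete, here is a
minimal sketch of how a dashboard might decide whether one test result
is public. All names and parameters here are hypothetical illustrations,
not an existing dashboard API:

```python
from datetime import date
from typing import Optional


def result_is_public(run_date: date,
                     all_public: bool = False,
                     public_since: Optional[date] = None,
                     private_since: Optional[date] = None) -> bool:
    """Hypothetical visibility check for a single test result,
    covering the three vendor policy scenarios proposed above."""
    # Scenario: future results private, current results public
    # (e.g. to support experimental changes on a platform).
    if private_since is not None and run_date >= private_since:
        return False
    # Scenario: results public starting at date X.
    if public_since is not None:
        return run_date >= public_since
    # Scenario: all results for the device public, or all private.
    return all_public
```

The precedence chosen here (a "private from date Y" cutoff overrides
the other policies) is just one possible interpretation; the group
would need to agree on the actual rules.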
* (external request) Show absolute values in the dashboard detail view
  if they were provided by the vendor and a user from that company is
  logged in.
  - Requires explicit approval from the individual Members for us to
    store absolute/expected values in our database.

* Add the functionality for results to be made public, as discussed
  above.

* Ability for users to request accounts on the dashboard using a Web
  form.
  - We need to define an approval process for this. It likely needs to
    wait on a primary contact from the appropriate company. Do we
    allow login by users who are not part of any company? (It won't do
    anything for now, but there may be a reason in the future for
    patch submitters to have accounts.)

* Add the patch apply or build output text to the dashboard detail
  page if the tarball was not created.

* Add a download link for the test logs, in the form of a ZIP file
  containing the contents of the DTS output directory.

* Add the ability for a vendor to share the test log ZIP file. I
  propose that this be done as follows, though we could go with a
  different scheme if the group desires:
  - The log file URLs will contain a unique, unpredictable random
    string of at least 40 characters, so that any shared logs are
    unlikely to be leaked to users who were not given the link.
  - Each URL will require authentication by default, but vendors will
    have the option to share the link, which removes the
    authentication requirement (possibly with an expiration time).
  - Vendors will also have the option to revoke access, thus again
    requiring authentication as an employee of that company to access
    the log.

* A Platforms tab which lists vendor systems along with a button to
  request downtime for each system. During a downtime window, the
  Jenkins agent on that node is suspended and no jobs will run until
  the end of the window. This makes it easier for vendors to update
  their systems without interfering with live test runs.
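As a sketch of the unguessable log URL component proposed above: in
Python, the standard-library `secrets` module can generate such a
token. `secrets.token_urlsafe(n)` base64-encodes n random bytes, so 30
bytes yields a 40-character URL-safe string. The function name and URL
shape below are hypothetical; the real path layout is up to the
dashboard implementation:

```python
import secrets


def make_log_token(nbytes: int = 30) -> str:
    """Return an unpredictable URL-safe token of >= 40 characters.

    30 random bytes encode to exactly 40 base64url characters
    (no padding), meeting the >= 40-character proposal above.
    """
    return secrets.token_urlsafe(nbytes)


# Hypothetical URL shape for a shared DTS log archive.
url = f"https://dashboard.example/logs/{make_log_token()}/dts-output.zip"
```

With 240 bits of entropy per token, guessing a valid link is
infeasible, so secrecy of the URL itself carries the access control
when a vendor chooses to share it without authentication.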
* Show the entire history of test runs on the detail page if a test
  was re-run.

Functionality added/bugs fixed in the past 2 weeks (for informational
purposes; we don't need to spend meeting time discussing these unless
anyone has strong feedback):

* A preferences page was added allowing individual users to turn
  e-mail notifications of test results for their organization's
  devices on or off.

* A password change form was added to the preferences page.

* Tooltips were added to the front page with definitions of each
  status, as well as a legend that appears if you click the "?" next
  to the Status column.

* Error pages were added for common errors to preserve the site
  navigation should they occur. This also includes graceful
  notification if the dashboard becomes unavailable due to a power
  outage, etc.

* Timestamps are shown for each patchset and for each test run.

* Pagination was added to the dashboard front page; the first page
  shows the 50 most recent patchsets that are currently active in
  Patchwork.

* Patch submitter e-mail addresses are no longer shown to anonymous
  users; they are shown only to logged-in users.

* Official NIC product names (if known to us) are shown in the
  dashboard instead of the codenames used by DTS.

* The result on the front page shows as "Incomplete" if not all
  devices have completed testing on a specific patch.

* Newly added devices will no longer affect the overall result shown
  on the front page of the dashboard until the vendor is satisfied
  that the results for that device are stable.

-- 
Patrick MacArthur
Research and Development, High Performance Networking and Storage
UNH InterOperability Laboratory