From: Lincoln Lavoie
Date: Thu, 25 Feb 2021 11:05:35 -0500
To: ci@dpdk.org
Subject: [dpdk-ci] Community CI Meeting Minutes - February 25, 2021

Hello All,

Here are the minutes from today's meeting.

February 25, 2021

########################################################
Attendees

1. Lincoln Lavoie
2. Ali Alnubani
3. Brandon Lo
4. David Liu
5. Juraj Linkeš
6. Lijuan Tu
7. Michael Santana
8. Owen Hilyard

########################################################
Agenda

1. Performance Testing Metrics
2. CI Status
3. Bugzilla Status
4. Test Development
5. Any other business

########################################################
Minutes

========================================================
Performance Testing Metrics

* BUG-640 (https://bugs.dpdk.org/show_bug.cgi?id=640)
* Should performance testing pass / fail requirements be percentage based?
* What is an unacceptable percentage change, i.e. what is a failure?
* The run-to-run variations depend not only on the DUT, but also on the test environment, e.g. using TRex compared to a dedicated hardware traffic generator.
* One option is to start with a larger percentage and tighten it over time, e.g. start with 5%, move to 4%, etc.
* Should DTS be changed to record / average multiple measurements from TRex? This might provide better statistics without too much of an increase in testing time.
* Need to ensure the infrastructure is configured to minimize run-to-run variations, e.g. pinning the CPU clock frequency / turbo frequency, etc.
* Group agreed to move forward with proposing changes to DTS to (a rough sketch of the idea follows after this list):
   * Change the pass / fail determination to be percentage based, with a default value of 10%, where the value can be adjusted by the configuration.
   * Investigate making multiple traffic measurements (or reports from TRex) during the test run to collect better statistics, being careful to not overly increase the total run time.
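For illustration, a minimal sketch of the two proposed DTS changes (averaging several TRex readings, then applying a configurable percentage tolerance). The names and sample values below are hypothetical, not the actual DTS code or configuration format.

# Sketch only: passes_throughput_check, DEFAULT_TOLERANCE_PCT and the sample
# values are illustrative assumptions, not the real DTS API.
from statistics import mean

DEFAULT_TOLERANCE_PCT = 10.0  # proposed default, adjustable via configuration

def passes_throughput_check(samples_mpps, baseline_mpps,
                            tolerance_pct=DEFAULT_TOLERANCE_PCT):
    """Average several TRex readings and compare the drop against the tolerance."""
    measured = mean(samples_mpps)  # multiple measurements from one test run
    drop_pct = (baseline_mpps - measured) / baseline_mpps * 100.0
    return drop_pct <= tolerance_pct

# Example: three samples against a 14.88 Mpps baseline pass at the 10% default,
# since the averaged result (14.1 Mpps) is only about a 5% drop.
print(passes_throughput_check([14.1, 13.9, 14.3], baseline_mpps=14.88))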
========================================================
CI Status

--------------------------------------------------------
UNH-IOL Community CI Lab

* There will be some downtime this afternoon (East Coast USA) and tomorrow on the bare-metal test systems (performance and functional testing). This will likely be followed by a similar downtime next week to finalize the rollout of the new Jenkins pipelines.
* Environment subscriptions: it is no longer necessary to subscribe to all environments in a group.
* More statistics are available for the lab, detailing running, passing, and failing tests over time: https://lab.dpdk.org/results/dashboard/stats/

--------------------------------------------------------
Intel Lab

* Finished the updates to the CI so that patch testing continues even if parts (e.g. one OS) fail; the other parts continue and report overall results to Patchwork.

--------------------------------------------------------
Travis CI / Github Actions

* No updates; Aaron couldn't make this meeting.

========================================================
Bugzilla Status

* https://bugs.dpdk.org/buglist.cgi?bug_status=UNCONFIRMED&bug_status=CONFIRMED&bug_status=IN_PROGRESS&columnlist=product%2Ccomponent%2Cpriority%2Cbug_status%2Cassigned_to%2Cshort_desc%2Cchangeddate&component=job%20scripts&component=UNH%20infra&component=Intel%20Lab&component=Travis%20CI&list_id=3419&order=priority%2Cchangeddate%20DESC&product=lab&query_format=advanced&resolution=---

========================================================
Test Development

The following has been developed with input from the tech board and DPDK leadership on 2021 priorities and goals. These goals should shape the primary focus of lab efforts for the remainder of 2021.

1. Objective 1: UNH-IOL will host the DPDK Performance/Regression Test Lab and provide the test infrastructure to execute, record, and report the tests developed and defined by the DPDK Project.
   1. Maintain continuous performance and functional testing systems, reporting results into the Patchwork system. Tests are defined as the following:
      1. Compile Testing: Complete DPDK build / compile run, monitoring for errors / warnings, plus the ABI test.
      2. Unit Testing: Hardware- and infrastructure-independent tests run through the make / compile tool chains.
      3. Functional Tests: Hardware- and infrastructure-dependent tests focused on verification of features or functionality of DPDK, reported per hardware system.
      4. Performance Tests: Hardware- and infrastructure-dependent tests focused on throughput, reported per hardware system.
      5. Tests will be run per-patch as patches become available.
      6. Tests will be run weekly against the official 'main' and "next-" branches. The tech board will provide a specific list of these branches to test.
      7. Tests will be run weekly against supported LTS branches (e.g. 20.11 and 19.11 stable).
   2. Maintain the graphical results report/dashboard. The graphical results should report the following items:
      1. Per-NIC performance trend over time graph, based on periodic (not per-patch) testing of the main DPDK branch. Note: the reported data shall be relative (i.e. performance lost or gained relative to an agreed baseline).
      2. Number of tests run within the lab per time period (day / week), including test outcomes (i.e. pass, fail, etc.).
   3. Report/display periodic branch tests in the dashboard by the end of Q2.
   4. Continue to support DPDK participants in deploying new hardware into the test lab.

2. Objective 2: Increase involvement and visibility of the community lab within the community.
   1. Host a bi-weekly meeting of lab stakeholders with Techboard participation.
   2. Appoint Technical and Govboard liaisons for lab oversight.
   3. Publish the meeting minutes from the lab meeting to the CI and dev mailing lists.
   4. Publish a quarterly lab update on the blog and the DPDK announce email list.
   5. Document every job as part of the dashboard or provide links to appropriate documentation (e.g. in a git repo).
   6. Publish the scripts and make them easy for other labs to consume.
      1. Utilize repositories hosted by DPDK for DPDK-CI (common work), UNH-jenkins (lab jobs and runners), and Dashboard (UI code).
      2. Aim to have scripts generalized and available in the DPDK-CI repository for work items that are common across labs and different testing infrastructure.

3. Objective 3: Maintain existing tests (limit regressions) as new tests are developed.
   1. Expedite/prioritize fixing any regressions introduced by new test development that break existing DTS tests running in the lab.
   2. Enable the monitoring system to raise an alert to the community for repeated failures (more than 5) of the same test and automatically notify the CI technical lead for appropriate follow-up or tracking, by the end of Q3.
      1. The CI technical lead, and other members of the lab, should have some ability to pause testing or disable tests in the case that infrastructure has failed.
   3. Maintain compile and unit testing on the following operating systems: FreeBSD latest, Windows Server latest, Red Hat 8 & 7, Ubuntu latest LTS and LTS-1, SUSE latest, CentOS Stream.
   4. Break out the iol-testing test report into some smaller reports:
      1. Per-OS would be best, i.e. Linux, BSD, and Windows, by end of Q3.
      2. This will also need to account for architecture (better support of x86_64 and aarch64).
   5. Review and generate a report on execution time and pipeline flow of the various testing running in the community lab.
      1. Techboard will review and provide specific recommendations for testing time targets, such as a goal to deliver x results within y hours of a patch submission. Note: some backlog will always occur when multiple simultaneous patch submissions occur.

4. Objective 4: Test Coverage, add more checks: static analysis (Coverity), ABI, compilation.
   1. Compilation (a rough sketch of enabling coverage and sanitizer builds follows after this list):
      1. Use lcov to generate a report on testing code coverage (Q2).
      2. Utilize UBSan and ASan for early bug detection during test execution (Q3).
   2. Add ABI check tasks on all OS distributions, except Windows (Q1).
   3. Add a spell checker to patch tests for documentation (needs input from the tech board to define the official DPDK documentation language requirement) by end of Q2.
   4. Investigate alternatives to Coverity and make a recommendation to the tech board by end of Q1.
      1. List: Cppcheck, SonarQube, Clang Static Analyzer, Flawfinder, Splint, Infer.
      2. Add 3 more static analyzers, as directed by the tech board, by the end of Q4.

5. Objective 5: Test Coverage, enable the unit tests (maintain function & enable micro benchmark).
   1. Work with the community to enable the other included unit test suites (perf, debug, etc.) (Q1); see the sketch after this list.
   2. Use compilation profilers (where available, such as gcov) and report performance information (Q2).
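As a point of reference for items 4.1 and 5.1 above, here is a minimal, hypothetical sketch of how a lab job script might configure a coverage- and sanitizer-instrumented build and run the additional unit test suites. It relies on Meson's built-in b_coverage and b_sanitize options; the build directory and the suite names are assumptions for illustration, not the lab's actual job configuration.

# Sketch only: the build directory and suite names below are assumptions,
# not the lab's real job script.
import subprocess

BUILD_DIR = "build-coverage"  # illustrative path

# Configure an instrumented build: gcov-style coverage plus ASan/UBSan.
subprocess.run(["meson", "setup", BUILD_DIR,
                "-Db_coverage=true",
                "-Db_sanitize=address,undefined"], check=True)
subprocess.run(["ninja", "-C", BUILD_DIR], check=True)

# Run the extra unit test suites beyond the default fast tests; keep going
# even if one suite fails so that all results get recorded.
for suite in ("fast-tests", "perf-tests", "debug-tests"):  # assumed suite names
    subprocess.run(["meson", "test", "-C", BUILD_DIR, "--suite", suite],
                   check=False)

# With b_coverage enabled, Meson typically exposes a coverage report target
# (e.g. "ninja -C build-coverage coverage-html") backed by lcov/gcovr.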
========================================================
Any other business

* Next Meeting: March 11, 2021

-- 
*Lincoln Lavoie*
Principal Engineer, Broadband Technologies
21 Madbury Rd., Ste. 100, Durham, NH 03824
lylavoie@iol.unh.edu  https://www.iol.unh.edu
+1-603-674-2755 (m)