DPDK CI discussions
* [dpdk-ci] [PATCH V1] tools: add patchset performance CI doc and scripts
@ 2017-08-16  2:17 Fangfang Wei
From: Fangfang Wei @ 2017-08-16  2:17 UTC (permalink / raw)
  To: ci; +Cc: Fangfang Wei

Add a patch set performance test CI reference document and the scripts
used in the CI. They cover applying one patch set to the right repo and
sending performance reports to patchwork.

Signed-off-by: Fangfang Wei <fangfangx.wei@intel.com>
---
 tools/intel_apply_patchset.py                      | 408 +++++++++++++++++++++
 tools/intel_send_performance_report.py             | 353 ++++++++++++++++++
 ...atchset_performance_test_CI_reference_guide.txt | 216 +++++++++++
 3 files changed, 977 insertions(+)
 create mode 100644 tools/intel_apply_patchset.py
 create mode 100644 tools/intel_send_performance_report.py
 create mode 100644 tools/per_patchset_performance_test_CI_reference_guide.txt

diff --git a/tools/intel_apply_patchset.py b/tools/intel_apply_patchset.py
new file mode 100644
index 0000000..40dcd46
--- /dev/null
+++ b/tools/intel_apply_patchset.py
@@ -0,0 +1,408 @@
+# BSD LICENSE
+#
+# Copyright(c) 2010-2017 Intel Corporation. All rights reserved.
+# All rights reserved.
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions
+# are met:
+#
+#   * Redistributions of source code must retain the above copyright
+#     notice, this list of conditions and the following disclaimer.
+#   * Redistributions in binary form must reproduce the above copyright
+#     notice, this list of conditions and the following disclaimer in
+#     the documentation and/or other materials provided with the
+#     distribution.
+#   * Neither the name of Intel Corporation nor the names of its
+#     contributors may be used to endorse or promote products derived
+#     from this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+# -*- coding: utf-8 -*-
+import os
+import sys
+import time
+import re
+import shutil
+import pexpect
+import logging
+import json
+import argparse
+
+reload(sys)
+sys.setdefaultencoding('utf-8')
+
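+# DPDK trees that a patch set may be applied to; the target tree is picked
+# from the patch subject prefixes in run_patchwork_validation().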
+REPOES = ["dpdk", "dpdk-next-net", "dpdk-next-crypto", "dpdk-next-virtio",
+          "dpdk-next-eventdev"]
+
+
+def GetPatchInfo(patch):
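+    """Unpack one patch record read from patchsetInfo.txt:
+    (patch set, patch id, patch file name, submitter mail,
+     submit time, commit message, subject)."""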
+    patchSet = int(patch[0])
+    patchID = int(patch[1])
+    patchName = patch[2]
+    submitterMail = patch[3]
+    submitTime = patch[4]
+    submitComment = patch[5]
+    subject = patch[6]
+    return (patchSet, patchID, patchName, submitterMail, submitTime,
+            submitComment, subject)
+
+
+class PatchWorkValidation(object):
+    '''
+    Apply a patch set to the proper DPDK tree, record the submitter
+    information and prepare the result folders for the performance test.
+    '''
+
+    def __init__(self, workPath, beginId, endId):
+        self.share_folder = workPath
+        self.beginId = beginId
+        self.endId = endId
+
+        self.curPatchworkPath = ""
+        self.curPatchPath = ""
+        self.curReportFolderName = ""
+        self.reportsPath = "{}/patch_performance_result".format(self.share_folder)
+        self.dpdkPath = "{}/dpdk".format(self.share_folder)
+        self.patchesPath = "{}/patches".format(self.share_folder)
+        self.patchListDict = {}
+        self.curMasterCID = ""
+        self.switchGitMasterBranch()
+
+    def getCommitID(self):
+        # get commit ID
+        handle = os.popen("git log -1")
+        firstLog = handle.read()
+        handle.close()
+        commitIdPat = r'(?<=commit ).+?(?=Author:)'
+        commitID = re.findall(commitIdPat, firstLog, re.S)
+
+        return "".join(commitID)[:-1]
+
+    def updateBranch_GetCommitID(self, repo):
+        os.chdir(self.dpdkPath)
+        lockFile = "".join([self.dpdkPath, "/", ".git", "/", "index.lock"])
+        if os.path.exists(lockFile):
+            os.remove(lockFile)
+        os.system("git checkout -f {}".format(repo))
+        os.system("git clean -fd")
+        os.system("git checkout -B {} {}/master".format(repo, repo))
+        os.system("git pull")
+        commitID = self.getCommitID()
+        logging.info("commitID = {}".format(commitID))
+
+        return "".join(commitID)
+
+    def makeTempBranch(self, repo):
+        os.chdir(self.dpdkPath)
+        os.system("git clean -fd")
+        os.system("git checkout -f {}".format(repo))
+        os.system("git branch -D temp")
+        os.system("git checkout -B temp")
+
+    def switchGitMasterBranch(self):
+        logging.info("switchGitMasterBranch......")
+        os.chdir(self.dpdkPath)
+        pattern = "^\* master"
+        handle = os.popen("git branch")
+        out = handle.read()
+        handle.close()
+        logging.info(out)
+        matchGr = re.findall(pattern, out, re.MULTILINE)
+        if len(matchGr) == 0:
+            handle = os.popen("git checkout -f master")
+            handle.close()
+            if self.curMasterCID != "":
+                os.system("git reset --hard {}".format(self.curMasterCID))
+        else:
+            logging.info(matchGr)
+
+    def logStatus(self):
+        handle = os.popen("date")
+        out = handle.read()
+        handle.close()
+        logPath = self.reportsPath + os.sep + "runningLog.log"
+        if os.path.exists(logPath):
+            fp = open(logPath, 'r')
+            out = "".join(["".join(fp.readlines()), out])
+            fp.close()
+
+        fp = open(logPath, 'w')
+        fp.writelines(out)
+        fp.close()
+
+    def setCurSubmitInfo(self, submitInfo):
+        pattern = "<(.*)>"
+        if re.findall(pattern, submitInfo["submitter"]):
+            submitInfo["mail"] = "".join(re.findall(pattern,
+                                         submitInfo["submitter"]))
+        else:
+            submitInfo["mail"] = submitInfo["submitter"]
+
+        self.submitInfo = submitInfo
+
+    def makePatchworkReportsFolder(self, name, baseCommitID):
+        logging.info("make reports history folder......")
+
+        self.curReportFolderName = "patch_{}_{}".format(name, baseCommitID)
+        self.curPatchworkPath = "".join([self.reportsPath, "/", self.curReportFolderName])
+        if os.path.exists(self.curPatchworkPath) == False:
+            os.makedirs(self.curPatchworkPath)
+            os.makedirs(self.curPatchworkPath + os.sep + "performance_results")
+
+        self.curPatchPath = self.curPatchworkPath + os.sep + "patches"
+        if os.path.exists(self.curPatchPath) == False:
+            os.makedirs(self.curPatchPath)
+
+    def recordSubmitInfo(self, submitInfo):
+        submitInfoFile = self.curPatchworkPath + os.sep + "submitInfo.txt"
+        print submitInfoFile
+        if os.path.exists(submitInfoFile):
+            os.remove(submitInfoFile)
+
+        content = ""
+        for item, value in submitInfo.iteritems():
+            if value is None or len(value) == 0:
+                continue
+
+            valueStr = ""
+            if isinstance(value, tuple) or isinstance(value, list):
+                for subItem in value:
+                    valueStr += "<val>{}".format(subItem)
+            else:
+                valueStr = value
+            content += "".join(["{}::".format(item),
+                               "<val>{}".format(valueStr), os.linesep])
+
+        with open(submitInfoFile, "wb") as submitInfo:
+            submitInfo.write(content)
+
+    def recordTarMatchIdInfo(self, matchInfo):
+        matchInfoFile = self.curPatchworkPath + os.sep + "TarMatchIdInfo.txt"
+        if os.path.exists(matchInfoFile):
+            os.remove(matchInfoFile)
+        json.dump(matchInfo, open(matchInfoFile, 'w'))
+
+    def makePatchErrorInfo(self, malformedPatch):
+        patchErrFile = "".join([self.curPatchworkPath, os.sep,
+                                "patches", os.sep,
+                                "patchError.txt"])
+        if os.path.exists(patchErrFile):
+            os.remove(patchErrFile)
+        json.dump(malformedPatch, open(patchErrFile, 'w'))
+        os.system("cp {} {}/applyPatch.log".format(patchErrFile, self.share_folder))
+
+    def apply_patch_in_repo(self, patchset, patchsetInfo, repo, CommitID):
+        """
+        apply patch into repo, if patch success, push to git server and
+        generate dpdk tar files, or generate dictionary malPatchPerRepo
+        which may contain the malformedpatch information.
+        """
+        logging.info("Start patch file........")
+        malPatchPerRepo = {}
+        patchNo = 0
+        matchInfo = {}
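+        # matchInfo maps the patch order in the set to a
+        # "Patch<first id>-<current id>" label, dumped later to
+        # TarMatchIdInfo.txt and used by the report script.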
+
+        first_patch = patchsetInfo[sorted(patchsetInfo.keys())[0]]
+        (first_patchSet, first_patchID, first_patchName,
+         first_submitterMail, first_submitTime,
+         first_submitComment, first_subject) = GetPatchInfo(first_patch)
+
+        for item in sorted(patchsetInfo.keys()):
+            patchNo += 1
+            patch = patchsetInfo[item]
+            (patchSet, patchID, patchName, submitterMail, submitTime,
+             submitComment, subject) = GetPatchInfo(patch)
+
+            if patchNo == 1:
+                matchInfo[patchNo] = "Patch%s-%s" % (patchID, patchID)
+            else:
+                matchInfo[patchNo] = "Patch%s-%s" % (first_patchID, patchID)
+
+            # Get the absolute path of the patch file.
+            patchSetPath = self.patchesPath + os.sep + patchset
+            patchFile = patchSetPath + os.sep + patchName
+
+            os.chdir(self.dpdkPath)
+            cmd = "patch -d {} -p1 < {}".format(self.dpdkPath,
+                                                patchFile)
+            logging.info(cmd)
+            outStream = os.popen(cmd)
+            patchingLog = outStream.read()
+            logging.info(patchingLog)
+            outStream.close()
+            malformedError = "malformed patch"
+            hunkFailError = "FAILED at"
+            missFileError = "can't find file to patch at input"
+            patchResult = "success"
+            if malformedError in patchingLog:
+                patchResult = "malformed patch"
+            elif hunkFailError in patchingLog:
+                patchResult = "hunk failed"
+            elif missFileError in patchingLog:
+                patchResult = "miss file"
+
+            if patchResult != "success":
+                logging.info("apply patch error!")
+                malPatchPerRepo[patchID] = patchingLog
+
+        # push to git server
+        os.chdir(self.dpdkPath)
+        os.system("git add --all")
+        os.system("git commit -m \" test \"")
+        submitCommentExd = submitComment.replace("\"", "")
+        # os.system("git commit --amend --author=\'" + submitterMail +
+        # "\' --date=\'" + submitTime + "\' -m \"" + submitCommentExd + "\"")
+        commit_cmd = "git commit --amend --author=\'{}\' --date=\'{}\' \
+                     -m \"{}\"".format(submitterMail, submitTime,
+                                       submitCommentExd)
+        os.system(commit_cmd)
+
+        os.chdir(self.dpdkPath + "/../")
+        tar_cmd = "tar zcvf {}/dpdk.tar.gz dpdk".format(self.share_folder)
+        os.popen(tar_cmd)
+        shutil.copy2("{}/dpdk.tar.gz".format(self.share_folder), self.curPatchPath)
+
+        self.recordTarMatchIdInfo(matchInfo)
+
+        # Copy the patch file into curPatchPath as a backup.
+        shutil.copy2(patchFile, self.curPatchPath)
+
+        return malPatchPerRepo
+
+    def run_patchwork_validation(self, patchset, patchsetInfo, RepoCommitID,
+                                 baseCommitID):
+        malformedPatch = dict()
+
+        # Write the patch set ID and base commit ID into share_folder/info.txt.
+        # This file records which patch set and baseline the test results
+        # belong to, so the matching result folder can be located.
+        dpdk_info = open("{}/info.txt".format(self.share_folder), "w+")
+        dpdk_info.write(patchset + "_" + baseCommitID)
+        dpdk_info.close()
+
+        # Remove a dpdk tarball which may be left over from the last build.
+        os.system("rm -rf {}/dpdk.tar.gz".format(self.share_folder))
+
+        # Pick the target repo from the patch subjects: net/ goes to
+        # dpdk-next-net, crypto to dpdk-next-crypto, eventdev to
+        # dpdk-next-eventdev, virtio/vhost to dpdk-next-virtio;
+        # everything else stays on the main dpdk tree.
+        repo = "dpdk"
+        for item, patch in patchsetInfo.iteritems():
+            patchSet, patchID, patchName, submitterMail, submitTime, \
+             submitComment, subject = GetPatchInfo(patch)
+            if re.search("net/", subject):
+                repo = "dpdk-next-net"
+            elif (re.search("crypto/", subject) or
+                  re.search("cryptodev:", subject)):
+                repo = "dpdk-next-crypto"
+            elif re.search("event/", subject):
+                repo = "dpdk-next-eventdev"
+            elif re.search("vfio:", subject) or re.search("vfio/", subject):
+                repo = "dpdk-next-virtio"
+
+        self.makeTempBranch(repo)
+        CommitID = RepoCommitID[repo]
+        malPatchPerRepo = self.apply_patch_in_repo(patchset, patchsetInfo,
+                                                   repo, CommitID)
+        logging.info("malPatchPerRepo is {}".format(malPatchPerRepo))
+
+        if len(malPatchPerRepo) > 0:
+            malformedPatch[repo] = malPatchPerRepo
+            self.makePatchErrorInfo(malformedPatch)
+        for item in sorted(patchsetInfo.keys()):
+            patch = patchsetInfo[item]
+            (patchSet, patchID, patchName, submitterMail, submitTime,
+             submitComment, subject) = GetPatchInfo(patch)
+            patchInfo = dict()
+            patchInfo["subject"] = re.sub("\[dpdk-dev.*\]", "", subject)
+            patchInfo["submitter"] = submitterMail
+            patchInfo["date"] = submitTime
+            patchInfo["patchworkId"] = patchset
+            patchInfo["baseline"] = "Repo:{}, Branch:master, \
+                                     CommitID:{}".format(repo, CommitID)
+            self.setCurSubmitInfo(patchInfo)
+            self.recordSubmitInfo(patchInfo)
+            break
+
+    def execute(self):
+        patchsetId = "".join([self.beginId, '-', self.endId])
+        # Remove applyPatch.log which may remain from the last build.
+        os.system("rm -rf {}/applyPatch.log".format(self.share_folder))
+
+        if os.path.exists("{}/{}".format(self.patchesPath, patchsetId)) is False:
+            patchsets = os.listdir(self.patchesPath)
+            patchsetId = ""
+            if patchsets:
+                for patchset in patchsets:
+                    if patchset.startswith(str(self.beginId)):
+                        patchsetId = patchset
+
+        if patchsetId == "":
+            print "patchset {}-{} not exist".format(self.beginId, self.endId)
+            with open("{}/applyPatch.log".format(self.share_folder), "wb") as ap:
+                ap.write("no patch to test!")
+
+        else:
+            self.logStatus()
+            logging.info("Need to be tested patchset is {}".format(patchsetId))
+
+            # Get patchsetInfo, which includes the patch file name, patch id,
+            # author, time, commit message, etc.
+            file_path = "{}/{}/patchsetInfo.txt".format(self.patchesPath, patchsetId)
+            patchsetFile = open(file_path)
+            patchsetInfo = patchsetFile.read()
+            patchsetFile.close()
+            patchsetInfo = eval(patchsetInfo)
+            if patchsetInfo is not None and len(patchsetInfo) > 0:
+                RepoCommitID = {repo: self.updateBranch_GetCommitID(
+                                repo) for repo in REPOES}
+                baseCommitID = RepoCommitID["dpdk"]
+                self.makePatchworkReportsFolder(patchsetId, baseCommitID)
+                # Apply patchset with per patch into repoes
+                self.run_patchwork_validation(patchsetId, patchsetInfo,
+                                              RepoCommitID, baseCommitID)
+
+                # Back up the patch set information next to the reports
+                # (only when the patch set was actually applied, so that
+                # baseCommitID is defined).
+                cp_cmd = "cp {} {}/patch_{}_{}".format(file_path,
+                                                       self.reportsPath,
+                                                       patchsetId,
+                                                       baseCommitID)
+                os.system(cp_cmd)
+            patcheset_path = "{}/{}".format(self.patchesPath, patchsetId)
+            for parent, dirnames, filenames in os.walk(patcheset_path):
+                for dirname in dirnames:
+                    dir_path = os.path.join(parent, dirname)
+                    os.system("rm -rf {}".format(dir_path))
+            mv_cmd = "mv {} {}/PatchSets".format(patcheset_path, self.share_folder)
+            os.system(mv_cmd)
+
+
+if __name__ == "__main__":
+    """
+    argument:
+    -p: the full path about share folder
+    -b: the first patch id in one patch set
+    -e: the last patch id in one patch set
+    usage:
+    python intel_apply_patchset.py -p /the/full/path/of/share/folder -b xxx -e xxx
+    """
+
+    parser = argparse.ArgumentParser(description='apply patch set to correct dpdk repo')
+    parser.add_argument('-p', '--path', help='the absolute path of share folder')
+    parser.add_argument('-b', '--begin-id', default='0', type=str, help='the begin id of the patch set')
+    parser.add_argument('-e', '--end-id', default='0', type=str, help='the end id of the patch set')
+
+    args = parser.parse_args()
+
+    InputBeginID = args.begin_id
+    InputEndID = args.end_id
+    share_folder = args.path
+
+    autoValidation = PatchWorkValidation(share_folder, InputBeginID, InputEndID)
+    autoValidation.execute()
diff --git a/tools/intel_send_performance_report.py b/tools/intel_send_performance_report.py
new file mode 100644
index 0000000..9c4d691
--- /dev/null
+++ b/tools/intel_send_performance_report.py
@@ -0,0 +1,353 @@
+# BSD LICENSE
+#
+# Copyright(c) 2010-2017 Intel Corporation. All rights reserved.
+# All rights reserved.
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions
+# are met:
+#
+#   * Redistributions of source code must retain the above copyright
+#     notice, this list of conditions and the following disclaimer.
+#   * Redistributions in binary form must reproduce the above copyright
+#     notice, this list of conditions and the following disclaimer in
+#     the documentation and/or other materials provided with the
+#     distribution.
+#   * Neither the name of Intel Corporation nor the names of its
+#     contributors may be used to endorse or promote products derived
+#     from this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+# -*- coding:utf-8 -*-
+import os
+import re
+import sys
+import datetime
+import exceptions
+import argparse
+import smtplib
+import email
+from email.mime.text import MIMEText
+from email.mime.multipart import MIMEMultipart
+import json
+import xlrd
+
+reload(sys)
+sys.setdefaultencoding('utf8')
+
+
+class PatchworkReport(object):
+    def __init__(self, report_folder):
+        self.report_folder = report_folder
+        self.patchInfoDict = dict()
+        self.testStatus = "SUCCESS"
+
+    def getPatchSetInfo(self):
+        InfoFile = "".join([self.report_folder, "/patchsetInfo.txt"])
+        with open(InfoFile, "rb") as pi:
+            patchInfo = pi.read()
+        self.patchSetInfoDict = eval(patchInfo)
+        return self.patchSetInfoDict
+
+    def fetch_keys_values(self, content):
+        retDict = dict()
+        if content == "":
+            return retDict
+        patKey = "(.*)::(.*)"
+        patValue = "<val>"
+
+        result = re.findall(patKey, content, re.M)
+        for item in result:
+            key = item[0]
+            value = item[1].split(patValue)[1:]
+            if len(value) > 1:
+                retDict[key] = value
+            else:
+                retDict[key] = "".join(value)
+        return retDict
+
+    def getPatchInfo(self):
+        submitInfoFile = "".join([self.report_folder, "/submitInfo.txt"])
+        if os.path.exists(submitInfoFile):
+            with open(submitInfoFile, "rb") as info:
+                content = info.read()
+                infoDict = self.fetch_keys_values(content)
+            for item in infoDict.keys():
+                self.patchInfoDict[item] = infoDict[item]
+
+    def make_patchinfo_content(self):
+        content = ""
+        self.getPatchInfo()
+        content += "Submitter: {}\n".format(self.patchInfoDict["submitter"])
+        content += "Date: {}\n".format(self.patchInfoDict["date"])
+        content += "DPDK git baseline: {}\n".format(self.patchInfoDict["baseline"].replace(";", "\n "))
+        content += os.linesep
+        return content
+
+    def get_test_info(self, platform_name):
+        '''
+        platform_name is formatted as "<OS>_<NIC>"
+        '''
+        m = platform_name.split("_")
+        os_name = m[0]       # avoid shadowing the os module
+        nic = "_".join(m[1:])
+        return os_name, nic
+
+    def get_system_info(self, folder):
+        '''Return the (kernel, gcc) versions parsed from system.conf.'''
+        f = None
+        kernel = 'NA'
+        gcc = 'NA'
+        system_filename = 'system.conf'
+        vm_system_filename = 'virtual_system.conf'
+        if os.path.exists(folder + os.sep + system_filename):
+            f = open(folder + os.sep + system_filename)
+        else:
+            f = None
+
+        if f is not None:
+            line = f.readline()
+            s = line.split(',')
+            pattern = re.compile(r':.*')
+            kernel = ''
+            gcc = ''
+            k = pattern.search(s[0])
+            if k:
+                kernel = k.group()[1:-7]
+            g = pattern.search(s[2])
+            if g:
+                gcc = g.group()[1:-4]
+                gcc = gcc.split(" ")[2]
+
+        return kernel, gcc
+
+    def get_sheet(self, result_filename):
+        if os.path.exists(result_filename):
+            d = xlrd.open_workbook(result_filename)
+            sheet = d.sheets()[0]
+            return sheet
+
+    def get_failed_cases_info(self, sheet):
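+        """Scan the DTS test_results.xls sheet: column 4 holds the case
+        name, column 5 its result and cell (1, 1) the build target.
+        Return (failed/total string, failed case list, failed count, target).
+        """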
+        column = len(sheet.col_values(1))
+        m = 1
+        total_number = 0
+        failed_number = 0
+        failed_case_list = []
+        target = sheet.cell(1, 1).value
+        if sheet.cell(1, 5).value is not None and sheet.cell(1, 5).value != '':
+            failed_total_number = 'N/A: {}'.format(sheet.cell(1, 5).value)
+        else:
+            while m < column:
+                if sheet.cell(m, 4).value is not None and sheet.cell(m, 4).value != '':
+                    total_number += 1
+                if (sheet.cell(m, 5).value.startswith('FAILED')
+                    or sheet.cell(m, 5).value.startswith('BLOCKED')):
+                    failed_number += 1
+                    failed_case_list.append(sheet.cell(m, 4).value)
+                m += 1
+            failed_total_number = str(failed_number)+'/'+str(total_number)
+
+        return failed_total_number, failed_case_list, failed_number, target
+
+    def get_all_info(self, resultPath, platform_name):
+        all_info = []
+        fail_number = 0
+
+        test_info = self.get_test_info(platform_name)
+        os_info = test_info[0]
+        nic_info = test_info[1].lower()
+
+        result_filename = '{}_single_core_perf.txt'.format(nic_info)
+
+        folder = resultPath + os.sep + platform_name
+        if os.path.exists(folder + os.sep + "test_results.xls"):
+            system_info = self.get_system_info(folder)
+            kernel_info = system_info[0]
+            gcc_info = system_info[1]
+
+            row = []
+            perf_content = ""
+            row.append(os_info)
+            row.append(nic_info)
+            row.append(kernel_info)
+            row.append(gcc_info)
+
+            sheet = self.get_sheet(folder + os.sep + "test_results.xls")
+            failed_cases_info = self.get_failed_cases_info(sheet)
+            failed_total_number = failed_cases_info[0]
+            failed_cases_list = failed_cases_info[1]
+            failed_number = failed_cases_info[2]
+            target = failed_cases_info[3]
+
+            if failed_number != 0:
+                fail_number += 1
+
+            if os.path.exists(folder + os.sep + result_filename):
+                with open(folder + os.sep + result_filename) as perf_file:
+                    for line in perf_file.readlines():
+                        perf_content += line
+            if 'N/A' in failed_total_number:
+                perf_content += failed_total_number.split(':')[-1]
+                fail_number = 1
+            row.append(target)
+            row.append(perf_content)
+            row.append(failed_total_number)
+            row.append(failed_cases_list)
+            all_info.append(row)
+        return all_info, fail_number
+
+    def execute_local(self, cmd):
+        print cmd
+        outStream = os.popen(cmd)
+        out = outStream.read()
+        outStream.close()
+        print out
+        return out
+
+    def get_report_content(self):
+        report_content = []
+        resultPath = "{}/regression_results".format(self.report_folder)
+        filenames = []
+        flag = 0
+        filenames = os.listdir(resultPath)
+        for filename in filenames:
+            if os.path.isdir("".join([resultPath, os.sep, filename])):
+                all_info = self.get_all_info(resultPath, filename)
+                report_content.extend(all_info[0])
+                flag += all_info[1]
+        return report_content, flag
+
+    def content_combine(self):
+        content = ''
+        report_content, flag = self.get_report_content()
+        for i in report_content:
+            content += '\n' + i[0] + '\n'
+            content += "Kernel: {}\n".format(i[2])
+            content += "GCC: {}\n".format(i[3])
+            content += "NIC: {}\n".format(i[1])
+            content += "Target: {}\n".format(i[4])
+            if 'N/A' in i[6]:
+                content += "Fail/Total: N/A\n"
+            else:
+                content += "Fail/Total: {}\n".format(i[6])
+            if len(i[7]):
+                content += "Failed cases list:\n"
+                for case in i[7]:
+                    content += "      - DTS {}\n".format(case)
+            content += os.linesep
+            content += "Detail performance results: \n{}\n".format(i[5])
+        return content, flag
+
+    def send_email(self, receivers, cc_list, subject, content):
+        msg = MIMEMultipart('alternative')
+        msg['Subject'] = subject
+        msg['From'] = "the sender's email address"
+        msg['To'] = ", ".join(receivers)
+        msg['CC'] = ", ".join(cc_list)
+        part2 = MIMEText(content, "plain", "utf-8")
+        msg.attach(part2)
+        smtp = smtplib.SMTP('the smtp server')
+        smtp.sendmail("the sender's email", receivers, msg.as_string())
+        smtp.quit()
+
+    def send_report(self, status):
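+        """Build and send the patchwork report.  status is True when the
+        patch set applied cleanly (performance results are reported) and
+        False when applying it failed (the patch error log is reported)."""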
+        self.getPatchInfo()
+        TarMatchfile = "".join([self.report_folder, "/TarMatchIdInfo.txt"])
+        matchDict = json.load(open(TarMatchfile, "r"))
+
+        sortedKey = sorted(int(key) for key in matchDict.keys())
+        whole_patchsetID = matchDict[str(sortedKey[-1])]
+        first_patchID = whole_patchsetID.split("-")[0].replace("Patch", "")
+
+        self.patchSetInfoDict = self.getPatchSetInfo()
+
+        if status:
+            performance_content, flag = self.content_combine()
+            patchset_content = self.make_patchinfo_content()
+            self.testStatus = "FAILURE"
+            if flag == 0:
+                self.testStatus = "SUCCESS"
+        else:
+            self.testStatus = "FAILURE"
+            patchErrorfile = "".join([self.report_folder,
+                                     "/patches/patchError.txt"])
+            patch_error_dict = json.load(open(patchErrorfile, "r"))
+
+        content = ""
+        content += "Test-Label: Performance-Testing" + os.linesep
+        content += "Test-Status: {}\n".format(self.testStatus)
+        content += "http://dpdk.org/patch/{}\n\n".format(first_patchID)
+        if status:
+            if self.testStatus == "SUCCESS":
+                content += "_Performance Testing PASS_\n\n"
+                content += patchset_content
+                content += "{} --> performance testing pass\n".format(whole_patchsetID)
+            else:
+                content += "_Performance Testing issues_\n\n"
+                content += patchset_content
+                content += "{} --> performance testing fail\n".format(whole_patchsetID)
+            content += os.linesep
+            content += "Test environment and result as below:\n"
+            content += performance_content
+        else:
+            content += "_apply patch file failure_\n\n"
+            content += "Submitter: {}\n".format(self.patchInfoDict["submitter"])
+            content += "Date: {}\n".format(self.patchInfoDict["date"])
+            content += "DPDK git baseline: {}".format(self.patchInfoDict["baseline"].replace(";","\n                   "))
+            content += os.linesep
+            content += "Apply patch set {} failed:\n".format(whole_patchsetID)
+            for repo, error_log in patch_error_dict.iteritems():
+                for error_key, error_content in error_log.iteritems():
+                    content += "Repo: {}\n".format(repo)
+                    content += "{}:\n".format(error_key)
+                    content += error_content
+                    content += os.linesep
+
+        content += os.linesep + "DPDK STV team" + os.linesep
+        print "content is {}".format(content)
+
+        subject = "|{}| {} DPDK PatchSet Performance Test Report".format(self.testStatus, whole_patchsetID)
+        rece_maillist = ["receiver's email address"]
+        if self.testStatus == "SUCCESS":
+            cc_list = []
+        else:
+            cc_list = [self.patchInfoDict["submitter"]]
+
+        self.send_email(rece_maillist, cc_list, subject, content)
+
+
+if __name__ == "__main__":
+    """
+    Send report to patchwork.
+    If patchError.txt exists, it maybe apply patch set failed or no patch
+    to tests.
+    If patchError.txt doesn't exists, it apply patch set successfully.
+    Arguments:
+    -p/--dst-path: the full path of share folder which store test result
+    """
+    parser = argparse.ArgumentParser(description='Send report to patchwork')
+    parser.add_argument('-p', '--path', type=str, help='the absolute path of result folder')
+    args = parser.parse_args()
+    reporter = PatchworkReport(args.path)
+
+    flag_file = "{}/patches/patchError.txt".format(args.path)
+    if os.path.exists(flag_file):
+        with open(flag_file, 'rb') as ff:
+            content = ff.read()
+        if "no patch" in content:
+            pass
+        else:
+            reporter.send_report(False)
+    else:
+        reporter.send_report(True)
diff --git a/tools/per_patchset_performance_test_CI_reference_guide.txt b/tools/per_patchset_performance_test_CI_reference_guide.txt
new file mode 100644
index 0000000..9dde040
--- /dev/null
+++ b/tools/per_patchset_performance_test_CI_reference_guide.txt
@@ -0,0 +1,216 @@
+# BSD LICENSE
+#
+# Copyright(c) 2010-2017 Intel Corporation. All rights reserved.
+# All rights reserved.
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions
+# are met:
+#
+#   * Redistributions of source code must retain the above copyright
+#     notice, this list of conditions and the following disclaimer.
+#   * Redistributions in binary form must reproduce the above copyright
+#     notice, this list of conditions and the following disclaimer in
+#     the documentation and/or other materials provided with the
+#     distribution.
+#   * Neither the name of Intel Corporation nor the names of its
+#     contributors may be used to endorse or promote products derived
+#     from this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+
+This document presents Intel's continuous integration practice for per patch
+set performance testing.
+
+The purpose of patch set performance testing is to test the performance of
+each patch set, check for regressions and integrate the results into
+patchwork.
+
+This document includes the following parts:
+1. Prerequisites
+2. CI work flow introduction
+3. Details on the Jenkins jobs
+4. Introduction of the Intel CI scripts
+
+
+Prerequisites
+=============
+
+You need one machine with Jenkins installed;
+one machine which gets the source code, stores the scripts and stores the
+test results;
+and one or more DUTs to run the performance testing.
+The machine which gets the source code generates a dpdk.tar.gz tarball.
+Every DUT that runs the testing must get this dpdk.tar.gz, either through a
+folder shared with all DUTs or by copying it over ssh.
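+A minimal sketch of the sharing step, assuming an NFS export from the source
+node (host names and paths below are placeholders, not part of the scripts):
+    mount -t nfs <source node>:/path/to/share/folder /mnt/share   # on each DUT
+Alternatively, copy the tarball to each DUT over ssh:
+    scp <user>@<source node>:/path/to/share/folder/dpdk.tar.gz /tmp/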
+
+1. Install Jenkins: On RPM-based distributions, such as Red Hat Enterprise
+ Linux (RHEL), CentOS, Fedora or Scientific Linux, you can install Jenkins
+ through yum.
+2. Install the necessary plugins for patch set performance testing:
+      1) BuildResultTrigger Plug-in
+      2) Conditional BuildStep
+      3) Files Found Trigger
+      4) Multijob plugin
+      5) SSH Slaves plugin
+      6) SSH Credential Plugin
+      7) Parameterized Trigger plugin
+3. Create three nodes (machines) in Jenkins:
+      - source node: this node gets the source code and stores the scripts.
+      - result node: this node stores all test results.
+      - tester node: this node runs the performance testing.
+
+4. Get the original DPDK repos locally:
+   Create a DPDK source code repository on your local server which tracks the
+   master branches of dpdk, dpdk-next-net, dpdk-next-crypto,
+   dpdk-next-eventdev and dpdk-next-virtio. Pull these repos from dpdk.org.
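+A possible way to set this up, assuming the public git URLs on dpdk.org and
+cloning into the share folder (intel_apply_patchset.py expects the checkout
+at <share folder>/dpdk and a remote named after each tree):
+    git clone --origin dpdk git://dpdk.org/dpdk <share folder>/dpdk
+    cd <share folder>/dpdk
+    for r in dpdk-next-net dpdk-next-crypto dpdk-next-virtio dpdk-next-eventdev; do
+        git remote add $r git://dpdk.org/next/$r
+    done
+    git fetch --all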
+
+
+CI Work Flow Introduction
+=========================
+
+Step 1. Download a patch set from dpdk.org.
+Step 2. Apply the patch set to the latest DPDK repo (going through the master
+	repo and all next-* repos).
+Step 3. If applying the patch set fails, the build result is marked as Failed
+	and the result is reported to patchwork.
+Step 4. If applying the patch set succeeds, the performance test is triggered
+	and the performance report is sent to patchwork.
+Step 5. The result of the performance test is integrated into patchwork and
+	shown on the first patch of the whole patch set. If the performance
+	test fails, the result is also sent to the patch authors as a
+	notification.
+
+The related Jenkins jobs and scripts for each step are as below:
+
+Step 1. Jenkins job "Get-Patch-Sets"; script: download_patchset.py
+Step 2. Jenkins job "Apply-One-Patch-Set"; script: apply_patchset.py
+Step 3. Jenkins job "DPDK-Send-Apply-Failed-Report"; script: N/A.
+Step 4. Jenkins job "Apply-One-Patch-Set"; script: performance test script.
+Step 5. Jenkins job "performance testing and send report", which includes the
+	"DPDK-Performance-Test" job and the "DPDK-Send-Performance-Report" job.
+	These two jobs run sequentially; script: send_report.py.
+
+
+Details on Jenkins Jobs
+=======================
+
+1. Create "Get-Patch-Sets" job:
+   a. New items, enter an item name: "Get-Patch-Sets". Select
+      "Freestyle project" then press "OK";
+   b. Restrict where this project can be run, write the label of the created
+      host into Label Expression;
+   c. Build Triggers: Build periodically, Schedule is "H/20 * * * *";
+   d. Build: Runs a shell script (defaults to sh, but this is configurable)
+      for building the project. The script will be run with the workspace as the
+      current directory. Type in the contents of your shell script.
+      In this window, use the script download_patchset.py to download patch
+      sets from dpdk.org (see the example build step below).
+   e. This job works on source node.
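+      An example "Execute shell" build step (the path is a placeholder; with
+      an end id of 0 the script gets the latest patches from patchwork):
+          python download_patchset.py -p /path/to/share/folder/patches \
+                                      -b <begin patch id> -e 0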
+
+2. Create "Apply-One-Patch-Set" job:
+   a. New items, enter an item name: "Apply-One-Patch-Set". Select
+      "Freestyle project" then press "OK";
+   b. General: Restrict where this project can be run, write the label of the
+      created host into Label Expression;
+   c. General: Block Build when downstream project is building;
+   d. Build Triggers: Build periodically, Schedule is "H/30 * * * *";
+   e. Build: Runs a shell script (defaults to sh, but this is configurable)
+      for building the project. The script will be run with the workspace as the
+      current directory. Type in the contents of your shell script.
+      In this window, use the script apply_patchset.py to apply one patch set
+      (see the example build step below).
+   f. Add build step -> Conditional step (single). Run? is "File exists".
+      Fill the file name in "File". Base directory is "Workspace".
+      On evaluation failure is "Fail the build";
+   g. Add post-build action -> Trigger parameterized build on other projects.
+      Build Triggers -> Projects to build "performance testing and send report",
+      Trigger when build is "Stable or unstable but not failed".
+      checkout "Trigger build without parameters".
+   h. Add Parameters. Projects to build is "DPDK-Send-Apply-Failed-Reports",
+      Trigger when build is "Failed". checkout "Trigger build without parameters".
+   i. This job works on source node.
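+      An example "Execute shell" build step for this job (the path is a
+      placeholder; -b and -e are the first and last patch ids of the set):
+          python intel_apply_patchset.py -p /path/to/share/folder \
+                                         -b <first patch id> -e <last patch id>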
+
+3. Create "DPDK-Performance-Test"
+   a. New items, enter an item name: "DPDK Performance Test". Select
+      "Freestyle project" then press "OK";
+   b. General: Restrict where this project can be run, write the label of the
+      created host into Label Expression;
+   c. Build: Runs a shell script (defaults to sh, but this is configurable)
+      for building the project. The script will be run with the workspace as the
+      current directory. Type in the contents of your shell script.
+      In this window, use the script auto-regression.py to run the
+      performance testing.
+   d. This job works on test node.
+
+4. Create "DPDK-Send-Apply-Failed-Report"
+   a. New items, enter an item name: "DPDK-Send-Apply-Failed-Report".
+      Select "Freestyle project" then press "OK:
+   b. General: Restrict where this project can be run, write the label of the
+      created host into Label Expression
+   c. Build: Runs a shell script (defaults to sh, but this is configurable)
+      for building the project. The script will be run with the workspace as the
+      current directory. Type in the contents of your shell script.
+      In this window, use the script send_report.py to send the apply-failure
+      report.
+   d. This job works on source node.
+
+5. Create "DPDK-Send-Performance-Report" job
+   a. New items, enter an item name: "DPDK-Send-Performance-Report".
+      Select "Freestyle project" then press "OK";
+   b. General: Restrict where this project can be run, write the label of the
+      created host into Label Expression;
+   c. Build: Runs a shell script (defaults to sh, but this is configurable)
+      for building the project. The script will be run with the workspace as the
+      current directory. Type in the contents of your shell script.
+      In this window, use the script send_report.py to send the performance
+      report.
+   d. This job works on source node.
+
+6. Create "performance testing and send report" job
+   a. New items, enter an item name: "performance testing and send report".
+      Select "MultiJob Project" then press "OK";
+   b. Build -> MultiJob Phase: Phase name is any name you want.
+      Phase jobs -> Job, Job name is "DPDK-Performance-Test". Job execution type
+      is "Running phase jobs in parallel". Note: if you want to add more
+      performance test environments, create the corresponding performance test
+      jobs and add them to this phase.
+   c. Add build step -> MultiJob Phase: Phase name is any name you want.
+      Phase jobs -> Job, Job name is "DPDK-Send-Performance-Report".
+      Job execution type is "Running phase jobs sequentially".
+
+
+Introductions of Intel CI Scripts
+=================================
+
+Some example scripts for performance CI testing can be found in the dpdk-ci
+repo at http://dpdk.org/browse/tools/dpdk-ci/,
+such as download_patchset.py, apply_one_patchset.py, send_report.py, etc.
+
+Intel_download_patchset.py:
+This script downloads a patch set and its information from patchwork and
+stores the patch set into a directory named
+"<the first patch id>-<the last patch id>".
+Usage:
+python download_patchset.py -p <the full path where you want to put the patchset in> \
+                            -b <the begin patch id> \
+                            -e <the end patch id>
+If the end patch id is 0, this script will get the latest one from patchwork.
+
+intel_apply_patchset.py:
+This script applies one patch set to the correct repo, such as dpdk,
+dpdk-next-net, dpdk-next-virtio, dpdk-next-eventdev or dpdk-next-crypto.
+The subject prefix on the patches provides the hint: if the majority of the
+patches start with "net/" then the set is for next-net, and similarly for
+crypto; patches touching virtio or vhost go to the next-virtio tree, etc.
+After applying the patch set to the correct repo, it generates a dpdk.tar.gz.
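+Usage (from the script's command line options):
+python intel_apply_patchset.py -p <the full path of the share folder> \
+                               -b <the first patch id of the patch set> \
+                               -e <the last patch id of the patch set>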
+
+intel_send_performance_report.py:
+This script parses the test results stored in the shared folder and then
+sends the performance report to patchwork.
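+Usage (from the script's command line options):
+python intel_send_performance_report.py -p <the full path of the result folder>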
+
-- 
2.7.4
