DPDK patches and discussions
* [dpdk-dev] [PATCH 0/3] Snow3G UEA2 support for Intel Quick Assist Devices
@ 2016-01-28 17:46 Deepak Kumar JAIN
  2016-01-28 17:46 ` [dpdk-dev] [PATCH 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
                   ` (4 more replies)
  0 siblings, 5 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-01-28 17:46 UTC (permalink / raw)
  To: dev

This patchset adds support for the Snow3G UEA2 wireless cipher
algorithm (cipher-only) for Intel Quick Assist devices.

The QAT PMD previously supported only cipher/hash chaining for AES/SHA.
The code has been refactored to also support cipher-only
functionality for the Snow3G algorithm.
Cipher-only and hash-only functionality is supported only
for Snow3G, not for AES/SHA.
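
As an illustration of what "cipher-only" means at the API level, here is a
minimal sketch of creating a Snow3G UEA2 cipher-only session. It uses the
same xform fields and session call exercised by the tests in patch 3/3; the
helper name and its dev_id/key/key_len parameters are placeholders, not part
of this patchset:

	static struct rte_cryptodev_session *
	create_snow3g_uea2_session(uint8_t dev_id, uint8_t *key, unsigned key_len)
	{
		struct rte_crypto_xform cipher_xform = { 0 };

		/* A single cipher xform with no auth xform chained after it
		 * (next == NULL) is the cipher-only case enabled here. */
		cipher_xform.type = RTE_CRYPTO_XFORM_CIPHER;
		cipher_xform.next = NULL;
		cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
		cipher_xform.cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
		cipher_xform.cipher.key.data = key;	/* 128-bit UEA2 key */
		cipher_xform.cipher.key.length = key_len;

		return rte_cryptodev_session_create(dev_id, &cipher_xform);
	}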

Deepak Kumar JAIN (3):
  crypto: add cipher/auth only support
  qat: add Snow3G UEA2 support
  app/test: add Snow3G UEA2 tests

 app/test/test_cryptodev.c                        | 318 +++++++++++++++++++++-
 app/test/test_cryptodev.h                        |   2 +-
 app/test/test_cryptodev_snow3g_test_vectors.h    | 323 +++++++++++++++++++++++
 doc/guides/cryptodevs/qat.rst                    |   5 +-
 doc/guides/rel_notes/release_2_3.rst             |   1 +
 drivers/crypto/qat/qat_adf/qat_algs.h            |  21 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 218 +++++++++++++--
 drivers/crypto/qat/qat_crypto.c                  | 144 +++++++---
 drivers/crypto/qat/qat_crypto.h                  |  12 +-
 9 files changed, 974 insertions(+), 70 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

-- 
2.1.0


* [dpdk-dev] [PATCH 1/3] crypto: add cipher/auth only support
  2016-01-28 17:46 [dpdk-dev] [PATCH 0/3] Snow3G UEA2 support for Intel Quick Assist Devices Deepak Kumar JAIN
@ 2016-01-28 17:46 ` Deepak Kumar JAIN
  2016-01-28 17:46 ` [dpdk-dev] [PATCH 2/3] qat: add Snow3G UEA2 support Deepak Kumar JAIN
                   ` (3 subsequent siblings)
  4 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-01-28 17:46 UTC (permalink / raw)
  To: dev

Refactored the existing functionality into a
modular form to support the cipher-only and
auth-only functionality.
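
In outline, session configuration now dispatches on the requested QAT command
id and builds only the content-descriptor parts that are needed. A condensed
sketch of the dispatch added in qat_crypto.c below (error handling and the
rejected service ids omitted):

	switch (session->qat_cmd) {
	case ICP_QAT_FW_LA_CMD_CIPHER:		/* cipher-only */
		session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
		break;
	case ICP_QAT_FW_LA_CMD_AUTH:		/* hash-only */
		session = qat_crypto_sym_configure_session_auth(dev, xform, session);
		break;
	case ICP_QAT_FW_LA_CMD_CIPHER_HASH:	/* cipher then hash */
		session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
		session = qat_crypto_sym_configure_session_auth(dev, xform, session);
		break;
	case ICP_QAT_FW_LA_CMD_HASH_CIPHER:	/* hash then cipher */
		session = qat_crypto_sym_configure_session_auth(dev, xform, session);
		session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
		break;
	default:
		/* TRNG, key derivation, MGF1 etc. are rejected as unsupported */
		break;
	}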

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 drivers/crypto/qat/qat_adf/qat_algs.h            |  20 ++-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 206 ++++++++++++++++++++---
 drivers/crypto/qat/qat_crypto.c                  | 136 +++++++++++----
 drivers/crypto/qat/qat_crypto.h                  |  12 +-
 4 files changed, 308 insertions(+), 66 deletions(-)

diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index 76c08c0..d4aa087 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -104,11 +104,17 @@ struct qat_alg_ablkcipher_cd {
 
 int qat_get_inter_state_size(enum icp_qat_hw_auth_algo qat_hash_alg);
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cd,
-					uint8_t *enckey, uint32_t enckeylen,
-					uint8_t *authkey, uint32_t authkeylen,
-					uint32_t add_auth_data_length,
-					uint32_t digestsize);
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cd,
+						uint8_t *enckey,
+						uint32_t enckeylen);
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *cipherkey,
+						uint32_t cipherkeylen,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize);
 
 void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header);
 
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index ceaffb7..88fd803 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -359,15 +359,141 @@ void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header)
 					   ICP_QAT_FW_LA_NO_UPDATE_STATE);
 }
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
-			uint8_t *cipherkey, uint32_t cipherkeylen,
-			uint8_t *authkey, uint32_t authkeylen,
-			uint32_t add_auth_data_length,
-			uint32_t digestsize)
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
+						uint8_t *cipherkey,
+						uint32_t cipherkeylen)
 {
-	struct qat_alg_cd *content_desc = &cdesc->cd;
-	struct icp_qat_hw_cipher_algo_blk *cipher = &content_desc->cipher;
-	struct icp_qat_hw_auth_algo_blk *hash = &content_desc->hash;
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
+	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
+	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
+	void *ptr = &req_tmpl->cd_ctrl;
+	struct icp_qat_fw_cipher_cd_ctrl_hdr *cipher_cd_ctrl = ptr;
+	struct icp_qat_fw_auth_cd_ctrl_hdr *hash_cd_ctrl = ptr;
+	enum icp_qat_hw_cipher_convert key_convert;
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
+	uint16_t cipher_offset = 0;
+
+	PMD_INIT_FUNC_TRACE();
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		cipher =
+		    (struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk));
+		cipher_offset = sizeof(struct icp_qat_hw_auth_algo_blk);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		cipher_offset = 0;
+	}
+	/* CD setup */
+	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	} else {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_CMP_AUTH_RES);
+	}
+
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	/* For Snow3G, set key convert and other bits */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	}
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
+	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
+
+	proto = ICP_QAT_FW_LA_PROTO_GET(header->serv_specif_flags);
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2)
+		proto = ICP_QAT_FW_LA_SNOW_3G_PROTO;
+
+	/* Request template setup */
+	qat_alg_init_common_hdr(header);
+	header->service_cmd_id = cdesc->qat_cmd;
+
+	ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
+	/* Configure the common header protocol flags */
+	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
+	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
+
+	/* Cipher CD config setup */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		cipher_cd_ctrl->cipher_key_sz =
+			(ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ +
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ) >> 3;
+		cipher_cd_ctrl->cipher_state_sz =
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	} else {
+		cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
+		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	}
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else {
+		PMD_DRV_LOG(ERR, "invalid param, only authenticated "
+			    "encryption supported");
+		return -EFAULT;
+	}
+	return 0;
+}
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *cipherkey,
+						uint32_t cipherkeylen,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize)
+{
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_hw_auth_algo_blk *hash;
 	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
 	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
 	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
@@ -379,30 +505,56 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 		((char *)&req_tmpl->serv_specif_rqpars +
 		sizeof(struct icp_qat_fw_la_cipher_req_params));
 	enum icp_qat_hw_cipher_convert key_convert;
-	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO; /* no CCM/GCM/Snow3G */
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
 	uint16_t state1_size = 0;
 	uint16_t state2_size = 0;
+	uint16_t cipher_offset = 0, hash_offset = 0;
 
 	PMD_INIT_FUNC_TRACE();
 
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		hash = (struct icp_qat_hw_auth_algo_blk *)&cdesc->cd;
+		cipher =
+			(struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+					sizeof(struct icp_qat_hw_auth_algo_blk));
+		hash_offset = 0;
+		cipher_offset = ((char *)hash - (char *)cipher);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		hash = (struct icp_qat_hw_auth_algo_blk *)((char *)&cdesc->cd +
+					sizeof(struct icp_qat_hw_cipher_algo_blk));
+		cipher_offset = 0;
+		hash_offset = ((char *)hash - (char *)cipher);
+	}
+
 	/* CD setup */
 	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
-		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
 	} else {
-		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				   ICP_QAT_FW_LA_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_CMP_AUTH_RES);
 	}
 
-	cipher->aes.cipher_config.val = ICP_QAT_HW_CIPHER_CONFIG_BUILD(
-			cdesc->qat_mode, cdesc->qat_cipher_alg, key_convert,
-			cdesc->qat_dir);
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
 	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
 
 	hash->sha.inner_setup.auth_config.reserved = 0;
@@ -454,15 +606,15 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 	/* Configure the common header protocol flags */
 	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
 	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
-	cd_pars->u.s.content_desc_params_sz = sizeof(struct qat_alg_cd) >> 3;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
 
 	/* Cipher CD config setup */
 	cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
 	cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
-	cipher_cd_ctrl->cipher_cfg_offset = 0;
+	cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
 
 	/* Auth CD config setup */
-	hash_cd_ctrl->hash_cfg_offset = ((char *)hash - (char *)cipher) >> 3;
+	hash_cd_ctrl->hash_cfg_offset = hash_offset >> 3;
 	hash_cd_ctrl->hash_flags = ICP_QAT_FW_AUTH_HDR_FLAG_NO_NESTED;
 	hash_cd_ctrl->inner_res_sz = digestsize;
 	hash_cd_ctrl->final_sz = digestsize;
@@ -505,8 +657,12 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 					>> 3);
 	auth_param->auth_res_sz = digestsize;
 
-
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_AUTH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
 		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
 				ICP_QAT_FW_SLICE_CIPHER);
 		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index 47b257f..e524638 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -1,7 +1,7 @@
 /*-
  *   BSD LICENSE
  *
- *   Copyright(c) 2010-2015 Intel Corporation. All rights reserved.
+ *   Copyright(c) 2010-2016 Intel Corporation. All rights reserved.
  *   All rights reserved.
  *
  *   Redistribution and use in source and binary forms, with or without
@@ -91,16 +91,17 @@ void qat_crypto_sym_clear_session(struct rte_cryptodev *dev,
 static int
 qat_get_cmd_id(const struct rte_crypto_xform *xform)
 {
-	if (xform->next == NULL)
-		return -1;
 
 	/* Cipher Only */
 	if (xform->type == RTE_CRYPTO_XFORM_CIPHER && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_CIPHER; */
+		return ICP_QAT_FW_LA_CMD_CIPHER;
 
 	/* Authentication Only */
 	if (xform->type == RTE_CRYPTO_XFORM_AUTH && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_AUTH; */
+		return -1;
+
+	if (xform->next == NULL)
+		return -1;
 
 	/* Cipher then Authenticate */
 	if (xform->type == RTE_CRYPTO_XFORM_CIPHER &&
@@ -140,32 +141,14 @@ qat_get_cipher_xform(struct rte_crypto_xform *xform)
 
 	return NULL;
 }
-
-
-void *
-qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
-		struct rte_crypto_xform *xform, void *session_private)
+struct qat_session *
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
+				struct rte_crypto_xform *xform,
+				struct qat_session *session_private)
 {
 	struct qat_pmd_private *internals = dev->data->dev_private;
-
 	struct qat_session *session = session_private;
-
-	struct rte_crypto_auth_xform *auth_xform = NULL;
 	struct rte_crypto_cipher_xform *cipher_xform = NULL;
-
-	int qat_cmd_id;
-
-	PMD_INIT_FUNC_TRACE();
-
-	/* Get requested QAT command id */
-	qat_cmd_id = qat_get_cmd_id(xform);
-	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
-		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
-		goto error_out;
-	}
-	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
-
-	/* Get cipher xform from crypto xform chain */
 	cipher_xform = qat_get_cipher_xform(xform);
 
 	switch (cipher_xform->algo) {
@@ -206,8 +189,28 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 	else
 		session->qat_dir = ICP_QAT_HW_CIPHER_DECRYPT;
 
+	if (qat_alg_aead_session_create_content_desc_cipher(session,
+						cipher_xform->key.data,
+						cipher_xform->key.length))
+		goto error_out;
+
+	return session;
 
-	/* Get authentication xform from Crypto xform chain */
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
+}
+
+struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_xform *xform,
+				struct qat_session *session_private)
+{
+
+	struct qat_pmd_private *internals = dev->data->dev_private;
+	struct qat_session *session = session_private;
+	struct rte_crypto_auth_xform *auth_xform = NULL;
+	struct rte_crypto_cipher_xform *cipher_xform = NULL;
 	auth_xform = qat_get_auth_xform(xform);
 
 	switch (auth_xform->algo) {
@@ -251,8 +254,9 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 				auth_xform->algo);
 		goto error_out;
 	}
+	cipher_xform = qat_get_cipher_xform(xform);
 
-	if (qat_alg_aead_session_create_content_desc(session,
+	if (qat_alg_aead_session_create_content_desc_auth(session,
 		cipher_xform->key.data,
 		cipher_xform->key.length,
 		auth_xform->key.data,
@@ -261,19 +265,85 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 		auth_xform->digest_length))
 		goto error_out;
 
-	return (struct rte_cryptodev_session *)session;
+	return session;
 
 error_out:
 	rte_mempool_put(internals->sess_mp, session);
 	return NULL;
 }
 
-unsigned qat_crypto_sym_get_session_private_size(
-		struct rte_cryptodev *dev __rte_unused)
+void *
+qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
+				       struct rte_crypto_xform *xform,
+				       void *session_private)
 {
-	return RTE_ALIGN_CEIL(sizeof(struct qat_session), 8);
+	struct qat_pmd_private *internals = dev->data->dev_private;
+
+	struct qat_session *session = session_private;
+
+	int qat_cmd_id;
+
+	/* Get requested QAT command id */
+	qat_cmd_id = qat_get_cmd_id(xform);
+	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
+		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
+		goto error_out;
+	}
+	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
+
+	switch (session->qat_cmd) {
+	case ICP_QAT_FW_LA_CMD_CIPHER:
+		session =
+			qat_crypto_sym_configure_session_cipher(dev, xform,
+							session);
+		break;
+	case ICP_QAT_FW_LA_CMD_AUTH:
+		session =
+			qat_crypto_sym_configure_session_auth(dev, xform,
+							session);
+		break;
+	case ICP_QAT_FW_LA_CMD_CIPHER_HASH:
+		session =
+			qat_crypto_sym_configure_session_cipher(dev, xform,
+							session);
+		session =
+			qat_crypto_sym_configure_session_auth(dev, xform,
+							session);
+		break;
+	case ICP_QAT_FW_LA_CMD_HASH_CIPHER:
+		session =
+			qat_crypto_sym_configure_session_auth(dev, xform,
+							session);
+		session =
+			qat_crypto_sym_configure_session_cipher(dev, xform,
+							session);
+		break;
+	case ICP_QAT_FW_LA_CMD_TRNG_GET_RANDOM:
+	case ICP_QAT_FW_LA_CMD_TRNG_TEST:
+	case ICP_QAT_FW_LA_CMD_SSL3_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_1_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_2_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_MGF1:
+	case ICP_QAT_FW_LA_CMD_AUTH_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_CIPHER_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_DELIMITER:
+		PMD_DRV_LOG(ERR, "Unsupported Service %u", session->qat_cmd);
+		goto error_out;
+	default:
+		PMD_DRV_LOG(ERR, "Unsupported Service %u", session->qat_cmd);
+		goto error_out;
+	}
+	return session;
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
 }
 
+unsigned
+qat_crypto_sym_get_session_private_size(struct rte_cryptodev *dev __rte_unused)
+{
+	return RTE_ALIGN_CEIL(sizeof(struct qat_session), 8);
+}
 
 uint16_t qat_crypto_pkt_tx_burst(void *qp, struct rte_mbuf **tx_pkts,
 		uint16_t nb_pkts)
diff --git a/drivers/crypto/qat/qat_crypto.h b/drivers/crypto/qat/qat_crypto.h
index d680364..bd63b85 100644
--- a/drivers/crypto/qat/qat_crypto.h
+++ b/drivers/crypto/qat/qat_crypto.h
@@ -1,7 +1,7 @@
 /*-
  *   BSD LICENSE
  *
- *   Copyright(c) 2010-2015 Intel Corporation. All rights reserved.
+ *   Copyright(c) 2010-2016 Intel Corporation. All rights reserved.
  *   All rights reserved.
  *
  *   Redistribution and use in source and binary forms, with or without
@@ -107,6 +107,16 @@ qat_crypto_sym_get_session_private_size(struct rte_cryptodev *dev);
 extern void
 qat_crypto_sym_session_init(struct rte_mempool *mempool, void *priv_sess);
 
+extern struct qat_session *
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
+				struct rte_crypto_xform *xform,
+				struct qat_session *session_private);
+
+extern struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_xform *xform,
+				struct qat_session *session_private);
+
 extern void *
 qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 		struct rte_crypto_xform *xform, void *session_private);
-- 
2.1.0


* [dpdk-dev] [PATCH 2/3] qat: add Snow3G UEA2 support
  2016-01-28 17:46 [dpdk-dev] [PATCH 0/3] Snow3G UEA2 support for Intel Quick Assist Devices Deepak Kumar JAIN
  2016-01-28 17:46 ` [dpdk-dev] [PATCH 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
@ 2016-01-28 17:46 ` Deepak Kumar JAIN
  2016-01-28 17:46 ` [dpdk-dev] [PATCH 3/3] app/test: add Snow3G UEA2 tests Deepak Kumar JAIN
                   ` (2 subsequent siblings)
  4 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-01-28 17:46 UTC (permalink / raw)
  To: dev

Added support for the wireless Snow3G cipher (cipher-only)
for Intel Quick Assist devices.
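
The new qat_alg_validate_snow3g_key() helper accepts only the 128-bit UEA2
key size and selects the matching QAT cipher algorithm id. A minimal usage
sketch (key_len is a placeholder; the log message mirrors the one added to
qat_crypto.c below):

	enum icp_qat_hw_cipher_algo alg;

	/* Only ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ (16 bytes) is accepted;
	 * any other length fails with -EINVAL. */
	if (qat_alg_validate_snow3g_key(key_len, &alg) != 0) {
		PMD_DRV_LOG(ERR, "Invalid SNOW3G cipher key size");
		return -EINVAL;
	}
	/* On success, alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2 */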

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 doc/guides/cryptodevs/qat.rst                    |  5 +++--
 doc/guides/rel_notes/release_2_3.rst             |  1 +
 drivers/crypto/qat/qat_adf/qat_algs.h            |  1 +
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 12 ++++++++++++
 drivers/crypto/qat/qat_crypto.c                  |  8 ++++++++
 5 files changed, 25 insertions(+), 2 deletions(-)

diff --git a/doc/guides/cryptodevs/qat.rst b/doc/guides/cryptodevs/qat.rst
index 1901842..eda5de2 100644
--- a/doc/guides/cryptodevs/qat.rst
+++ b/doc/guides/cryptodevs/qat.rst
@@ -1,5 +1,5 @@
 ..  BSD LICENSE
-    Copyright(c) 2015 Intel Corporation. All rights reserved.
+    Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
 
     Redistribution and use in source and binary forms, with or without
     modification, are permitted provided that the following conditions
@@ -47,6 +47,7 @@ Cipher algorithms:
 * ``RTE_CRYPTO_SYM_CIPHER_AES128_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES192_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES256_CBC``
+* ``RTE_CRYPTO_SYM_CIPHER_SNOW3G_UEA2``
 
 Hash algorithms:
 
@@ -61,7 +62,7 @@ Limitations
 
 * Chained mbufs are not supported.
 * Hash only is not supported.
-* Cipher only is not supported.
+* Cipher only is not supported except Snow3G UEA2.
 * Only in-place is currently supported (destination address is the same as source address).
 * Only supports the session-oriented API implementation (session-less APIs are not supported).
 * Not performance tuned.
diff --git a/doc/guides/rel_notes/release_2_3.rst b/doc/guides/rel_notes/release_2_3.rst
index 99de186..0e1f1ff 100644
--- a/doc/guides/rel_notes/release_2_3.rst
+++ b/doc/guides/rel_notes/release_2_3.rst
@@ -3,6 +3,7 @@ DPDK Release 2.3
 
 New Features
 ------------
+* **Added support for Snow3G UEA2 cipher operation for Intel Quick Assist Devices.**
 
 
 Resolved Issues
diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index d4aa087..54eeb23 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -127,5 +127,6 @@ void qat_alg_ablkcipher_init_dec(struct qat_alg_ablkcipher_cd *cd,
 					unsigned int keylen);
 
 int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
 
 #endif
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index 88fd803..200371d 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -755,3 +755,15 @@ int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
 	}
 	return 0;
 }
+
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
+{
+	switch (key_len) {
+	case ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ:
+		*alg = ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2;
+		break;
+	default:
+		return -EINVAL;
+	}
+	return 0;
+}
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index e524638..9ae6715 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -168,6 +168,14 @@ qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
 		}
 		session->qat_mode = ICP_QAT_HW_CIPHER_CTR_MODE;
 		break;
+	case RTE_CRYPTO_CIPHER_SNOW3G_UEA2:
+		if (qat_alg_validate_snow3g_key(cipher_xform->key.length,
+					&session->qat_cipher_alg) != 0) {
+			PMD_DRV_LOG(ERR, "Invalid SNOW3G cipher key size");
+			goto error_out;
+		}
+		session->qat_mode = ICP_QAT_HW_CIPHER_ECB_MODE;
+		break;
 	case RTE_CRYPTO_CIPHER_NULL:
 	case RTE_CRYPTO_CIPHER_3DES_ECB:
 	case RTE_CRYPTO_CIPHER_3DES_CBC:
-- 
2.1.0


* [dpdk-dev] [PATCH 3/3] app/test: add Snow3G UEA2 tests
  2016-01-28 17:46 [dpdk-dev] [PATCH 0/3] Snow3G UEA2 support for Intel Quick Assist Devices Deepak Kumar JAIN
  2016-01-28 17:46 ` [dpdk-dev] [PATCH 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
  2016-01-28 17:46 ` [dpdk-dev] [PATCH 2/3] qat: add Snow3G UEA2 support Deepak Kumar JAIN
@ 2016-01-28 17:46 ` Deepak Kumar JAIN
  2016-02-23 14:02 ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Deepak Kumar JAIN
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
  4 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-01-28 17:46 UTC (permalink / raw)
  To: dev

Added encryption and decryption tests with input test vectors
from the Snow3G UEA2 specification.
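
Each vector carries a validDataLenInBits length; the tests mask off the
unused low-order bits of the last byte before comparing against the expected
data. A worked example of that masking for test case 1, which has 798 valid
bits (ciphertext and len are placeholders):

	/* 798 % 8 = 6 bits of the last byte are valid, so the bottom
	 * two bits are cleared before the buffer comparison. */
	uint8_t lastByteValidBits = 798 % 8;			/* 6 */
	uint8_t lastByteMask = 0xFF << (8 - lastByteValidBits);	/* 0xFC */
	ciphertext[len - 1] &= lastByteMask;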

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 app/test/test_cryptodev.c                     | 318 ++++++++++++++++++++++++-
 app/test/test_cryptodev.h                     |   2 +-
 app/test/test_cryptodev_snow3g_test_vectors.h | 323 ++++++++++++++++++++++++++
 3 files changed, 641 insertions(+), 2 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c
index fd5b7ec..0809b0f 100644
--- a/app/test/test_cryptodev.c
+++ b/app/test/test_cryptodev.c
@@ -1,7 +1,7 @@
 /*-
  *   BSD LICENSE
  *
- *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *   Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
  *
  *   Redistribution and use in source and binary forms, with or without
  *   modification, are permitted provided that the following conditions
@@ -43,6 +43,7 @@
 
 #include "test.h"
 #include "test_cryptodev.h"
+#include "test_cryptodev_snow3g_test_vectors.h"
 
 static enum rte_cryptodev_type gbl_cryptodev_type;
 
@@ -188,6 +189,23 @@ testsuite_setup(void)
 		}
 	}
 
+	/* Create 2 Snow3G devices if required */
+	if (gbl_cryptodev_type == RTE_CRYPTODEV_SNOW3G_PMD) {
+		nb_devs = rte_cryptodev_count_devtype(RTE_CRYPTODEV_SNOW3G_PMD);
+		if (nb_devs < 2) {
+			for (i = nb_devs; i < 2; i++) {
+				int dev_id =
+					rte_eal_vdev_init(CRYPTODEV_NAME_SNOW3G_PMD,
+						      NULL);
+
+				TEST_ASSERT(dev_id >= 0,
+					"Failed to create instance %u of"
+					" pmd : %s",
+					i, CRYPTODEV_NAME_SNOW3G_PMD);
+			}
+		}
+	}
+
 	nb_devs = rte_cryptodev_count();
 	if (nb_devs < 1) {
 		RTE_LOG(ERR, USER1, "No crypto devices found?");
@@ -1681,7 +1699,283 @@ test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify(void)
 	return TEST_SUCCESS;
 }
 
+/* ***** Snow3G Tests ***** */
+static int
+create_snow3g_cipher_session(uint8_t dev_id,
+			enum rte_crypto_cipher_operation op,
+			const uint8_t *key, const uint8_t key_len)
+{
+	uint8_t cipher_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(cipher_key, key, key_len);
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_XFORM_CIPHER;
+	ut_params->cipher_xform.next = NULL;
+
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = op;
+	ut_params->cipher_xform.cipher.key.data = cipher_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session */
+	ut_params->sess = rte_cryptodev_session_create(dev_id,
+						&ut_params->
+						cipher_xform);
+
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+
+	return 0;
+}
+
+static int
+create_snow3g_cipher_operation(const uint8_t *iv, const unsigned iv_len,
+			const unsigned data_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned iv_pad_len = 0;
+
+	/* Generate Crypto op data structure */
+	ut_params->ol = rte_pktmbuf_offload_alloc(ts_params->mbuf_ol_pool,
+						RTE_PKTMBUF_OL_CRYPTO);
+	TEST_ASSERT_NOT_NULL(ut_params->ol,
+			     "Failed to allocate pktmbuf offload");
+
+	ut_params->op = &ut_params->ol->op.crypto;
+
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+
+	ut_params->op->iv.data =
+		(uint8_t *) rte_pktmbuf_prepend(ut_params->ibuf, iv_pad_len);
+	TEST_ASSERT_NOT_NULL(ut_params->op->iv.data, "no room to prepend iv");
+
+	memset(ut_params->op->iv.data, 0, iv_pad_len);
+	ut_params->op->iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	ut_params->op->iv.length = iv_pad_len;
+
+	rte_memcpy(ut_params->op->iv.data, iv, iv_len);
+
+	rte_hexdump(stdout, "iv:", ut_params->op->iv.data, iv_pad_len);
+	ut_params->op->data.to_cipher.length = data_len;
+	ut_params->op->data.to_cipher.offset = iv_pad_len;
+	return 0;
+}
+
+static int test_snow3g_encryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithms block size
+	 */
+	plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16);
+
+	plaintext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->plaintext.len);
+	if (retval < 0)
+		return retval;
+
+	rte_crypto_op_attach_session(ut_params->op, ut_params->sess);
+
+	rte_pktmbuf_offload_attach(ut_params->ibuf, ut_params->ol);
+
+	ut_params->obuf = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->ibuf);
+	TEST_ASSERT_NOT_NULL(ut_params->obuf, "failed to retrieve obuf");
+
+	if (ut_params->op->dst.m) {
+		ciphertext = rte_pktmbuf_mtod(ut_params->op->dst.m, uint8_t *);
+	} else {
+		ciphertext = plaintext;
+	}
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(ciphertext,
+				tdata->ciphertext.data,
+				tdata->ciphertext.len,
+				"Snow3G Ciphertext data not as expected");
+	return 0;
+}
+
+static int test_snow3g_decryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t ciphertext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_DECRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithms block size
+	 */
+	ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16);
+
+	ciphertext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						ciphertext_pad_len);
+	memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->ciphertext.len);
+	if (retval < 0)
+		return retval;
+
+	rte_crypto_op_attach_session(ut_params->op, ut_params->sess);
+
+	rte_pktmbuf_offload_attach(ut_params->ibuf, ut_params->ol);
+
+	ut_params->obuf = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->ibuf);
+	TEST_ASSERT_NOT_NULL(ut_params->obuf, "failed to retrieve obuf");
+
+	if (ut_params->op->dst.m) {
+		plaintext = rte_pktmbuf_mtod(ut_params->op->dst.m, uint8_t *);
+	} else {
+		plaintext = ciphertext;
+	}
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(plaintext,
+				tdata->plaintext.data,
+				tdata->plaintext.len,
+				"Snow3G Plaintext data not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_encryption_test_case_1(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_encryption_test_case_2(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_encryption_test_case_3(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_encryption_test_case_4(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_encryption_test_case_5(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_5);
+}
 
+static int
+test_snow3g_decryption_test_case_1(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_decryption_test_case_2(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_decryption_test_case_3(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_decryption_test_case_4(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_decryption_test_case_5(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_5);
+}
 /* ***** AES-GCM Tests ***** */
 
 static int
@@ -1917,8 +2211,30 @@ static struct unit_test_suite cryptodev_qat_testsuite  = {
 		TEST_CASE_ST(ut_setup, ut_teardown,
 				test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify),
 
+		/** Snow3G encrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_5),
 		TEST_CASE_ST(ut_setup, ut_teardown, test_stats),
 
+		/** Snow3G decrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_5),
 		TEST_CASES_END() /**< NULL terminate unit test array */
 	}
 };
diff --git a/app/test/test_cryptodev.h b/app/test/test_cryptodev.h
index 034393e..a349fb2 100644
--- a/app/test/test_cryptodev.h
+++ b/app/test/test_cryptodev.h
@@ -1,7 +1,7 @@
 /*-
  *   BSD LICENSE
  *
- *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *   Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
  *
  *   Redistribution and use in source and binary forms, with or without
  *   modification, are permitted provided that the following conditions
diff --git a/app/test/test_cryptodev_snow3g_test_vectors.h b/app/test/test_cryptodev_snow3g_test_vectors.h
new file mode 100644
index 0000000..7f74518
--- /dev/null
+++ b/app/test/test_cryptodev_snow3g_test_vectors.h
@@ -0,0 +1,323 @@
+/*-
+ *   BSD LICENSE
+ *
+ *   Copyright(c) 2016 Intel Corporation. All rights reserved.
+ *
+ *   Redistribution and use in source and binary forms, with or without
+ *   modification, are permitted provided that the following conditions
+ *   are met:
+ *
+ *   * Redistributions of source code must retain the above copyright
+ *     notice, this list of conditions and the following disclaimer.
+ *   * Redistributions in binary form must reproduce the above copyright
+ *     notice, this list of conditions and the following disclaimer in
+ *     the documentation and/or other materials provided with the
+ *     distribution.
+ *   * Neither the name of Intel Corporation nor the names of its
+ *     contributors may be used to endorse or promote products derived
+ *     from this software without specific prior written permission.
+ *
+ *   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ *   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ *   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ *   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ *   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ *   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ *   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ *   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ *   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ *   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ *   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#ifndef TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+#define TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+
+struct snow3g_test_data {
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} key;
+
+	struct {
+		uint8_t data[64] __rte_aligned(16);
+		unsigned len;
+	} iv;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} plaintext;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} ciphertext;
+
+	struct {
+		unsigned len;
+	} validDataLenInBits;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} aad;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} digest;
+};
+struct snow3g_test_data snow3g_test_case_1 = {
+	.key = {
+		.data = {
+			0x2B, 0xD6, 0x45, 0x9F, 0x82, 0xC5, 0xB3, 0x00,
+			0x95, 0x2C, 0x49, 0x10, 0x48, 0x81, 0xFF, 0x48
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x7E, 0xC6, 0x12, 0x72, 0x74, 0x3B, 0xF1, 0x61,
+			0x47, 0x26, 0x44, 0x6A, 0x6C, 0x38, 0xCE, 0xD1,
+			0x66, 0xF6, 0xCA, 0x76, 0xEB, 0x54, 0x30, 0x04,
+			0x42, 0x86, 0x34, 0x6C, 0xEF, 0x13, 0x0F, 0x92,
+			0x92, 0x2B, 0x03, 0x45, 0x0D, 0x3A, 0x99, 0x75,
+			0xE5, 0xBD, 0x2E, 0xA0, 0xEB, 0x55, 0xAD, 0x8E,
+			0x1B, 0x19, 0x9E, 0x3E, 0xC4, 0x31, 0x60, 0x20,
+			0xE9, 0xA1, 0xB2, 0x85, 0xE7, 0x62, 0x79, 0x53,
+			0x59, 0xB7, 0xBD, 0xFD, 0x39, 0xBE, 0xF4, 0xB2,
+			0x48, 0x45, 0x83, 0xD5, 0xAF, 0xE0, 0x82, 0xAE,
+			0xE6, 0x38, 0xBF, 0x5F, 0xD5, 0xA6, 0x06, 0x19,
+			0x39, 0x01, 0xA0, 0x8F, 0x4A, 0xB4, 0x1A, 0xAB,
+			0x9B, 0x13, 0x48, 0x80
+		},
+		.len = 100
+	},
+	.ciphertext = {
+		.data = {
+			0x8C, 0xEB, 0xA6, 0x29, 0x43, 0xDC, 0xED, 0x3A,
+			0x09, 0x90, 0xB0, 0x6E, 0xA1, 0xB0, 0xA2, 0xC4,
+			0xFB, 0x3C, 0xED, 0xC7, 0x1B, 0x36, 0x9F, 0x42,
+			0xBA, 0x64, 0xC1, 0xEB, 0x66, 0x65, 0xE7, 0x2A,
+			0xA1, 0xC9, 0xBB, 0x0D, 0xEA, 0xA2, 0x0F, 0xE8,
+			0x60, 0x58, 0xB8, 0xBA, 0xEE, 0x2C, 0x2E, 0x7F,
+			0x0B, 0xEC, 0xCE, 0x48, 0xB5, 0x29, 0x32, 0xA5,
+			0x3C, 0x9D, 0x5F, 0x93, 0x1A, 0x3A, 0x7C, 0x53,
+			0x22, 0x59, 0xAF, 0x43, 0x25, 0xE2, 0xA6, 0x5E,
+			0x30, 0x84, 0xAD, 0x5F, 0x6A, 0x51, 0x3B, 0x7B,
+			0xDD, 0xC1, 0xB6, 0x5F, 0x0A, 0xA0, 0xD9, 0x7A,
+			0x05, 0x3D, 0xB5, 0x5A, 0x88, 0xC4, 0xC4, 0xF9,
+			0x60, 0x5E, 0x41, 0x40
+		},
+		.len = 100
+	},
+	.validDataLenInBits = {
+		.len = 798
+	},
+	.aad = {
+		.data = {
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_2 = {
+	.key = {
+		.data = {
+			0xEF, 0xA8, 0xB2, 0x22, 0x9E, 0x72, 0x0C, 0x2A,
+			0x7C, 0x36, 0xEA, 0x55, 0xE9, 0x60, 0x56, 0x95
+		},
+		.len = 16
+	},
+	.iv = {
+	       .data = {
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+	       .len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x10, 0x11, 0x12, 0x31, 0xE0, 0x60, 0x25, 0x3A,
+			0x43, 0xFD, 0x3F, 0x57, 0xE3, 0x76, 0x07, 0xAB,
+			0x28, 0x27, 0xB5, 0x99, 0xB6, 0xB1, 0xBB, 0xDA,
+			0x37, 0xA8, 0xAB, 0xCC, 0x5A, 0x8C, 0x55, 0x0D,
+			0x1B, 0xFB, 0x2F, 0x49, 0x46, 0x24, 0xFB, 0x50,
+			0x36, 0x7F, 0xA3, 0x6C, 0xE3, 0xBC, 0x68, 0xF1,
+			0x1C, 0xF9, 0x3B, 0x15, 0x10, 0x37, 0x6B, 0x02,
+			0x13, 0x0F, 0x81, 0x2A, 0x9F, 0xA1, 0x69, 0xD8
+		},
+		.len = 64
+	},
+	.ciphertext = {
+		.data = {
+				0xE0, 0xDA, 0x15, 0xCA, 0x8E, 0x25, 0x54, 0xF5,
+				0xE5, 0x6C, 0x94, 0x68, 0xDC, 0x6C, 0x7C, 0x12,
+				0x9C, 0x56, 0x8A, 0xA5, 0x03, 0x23, 0x17, 0xE0,
+				0x4E, 0x07, 0x29, 0x64, 0x6C, 0xAB, 0xEF, 0xA6,
+				0x89, 0x86, 0x4C, 0x41, 0x0F, 0x24, 0xF9, 0x19,
+				0xE6, 0x1E, 0x3D, 0xFD, 0xFA, 0xD7, 0x7E, 0x56,
+				0x0D, 0xB0, 0xA9, 0xCD, 0x36, 0xC3, 0x4A, 0xE4,
+				0x18, 0x14, 0x90, 0xB2, 0x9F, 0x5F, 0xA2, 0xFC
+		},
+		.len = 64
+	},
+	.validDataLenInBits = {
+		.len = 510
+	},
+	.aad = {
+		.data = {
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_3 = {
+	.key = {
+		.data = {
+			 0x5A, 0xCB, 0x1D, 0x64, 0x4C, 0x0D, 0x51, 0x20,
+			 0x4E, 0xA5, 0xF1, 0x45, 0x10, 0x10, 0xD8, 0x52
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xAD, 0x9C, 0x44, 0x1F, 0x89, 0x0B, 0x38, 0xC4,
+			0x57, 0xA4, 0x9D, 0x42, 0x14, 0x07, 0xE8
+		},
+		.len = 15
+	},
+	.ciphertext = {
+		.data = {
+			0xBA, 0x0F, 0x31, 0x30, 0x03, 0x34, 0xC5, 0x6B,
+			0x52, 0xA7, 0x49, 0x7C, 0xBA, 0xC0, 0x46
+		},
+		.len = 15
+	},
+	.validDataLenInBits = {
+		.len = 120
+	},
+	.aad = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+};
+
+struct snow3g_test_data snow3g_test_case_4 = {
+	.key = {
+		.data = {
+			0xD3, 0xC5, 0xD5, 0x92, 0x32, 0x7F, 0xB1, 0x1C,
+			0x40, 0x35, 0xC6, 0x68, 0x0A, 0xF8, 0xC6, 0xD1
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00,
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB, 0x1A,
+			0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D, 0x80,
+			0x8C, 0xE3, 0x3E, 0x2C, 0xC3, 0xC0, 0xB5, 0xFC,
+			0x1F, 0x3D, 0xE8, 0xA6, 0xDC, 0x66, 0xB1, 0xF0
+		},
+		.len = 32
+	},
+	.ciphertext = {
+		.data = {
+			0x98, 0x9B, 0x71, 0x9C, 0xDC, 0x33, 0xCE, 0xB7,
+			0xCF, 0x27, 0x6A, 0x52, 0x82, 0x7C, 0xEF, 0x94,
+			0xA5, 0x6C, 0x40, 0xC0, 0xAB, 0x9D, 0x81, 0xF7,
+			0xA2, 0xA9, 0xBA, 0xC6, 0x0E, 0x11, 0xC4, 0xB0
+		},
+		.len = 32
+	},
+	.validDataLenInBits = {
+		.len = 253
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_5 = {
+	.key = {
+		.data = {
+			0x60, 0x90, 0xEA, 0xE0, 0x4C, 0x83, 0x70, 0x6E,
+			0xEC, 0xBF, 0x65, 0x2B, 0xE8, 0xE3, 0x65, 0x66
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00
+		},
+		.len = 16},
+	.plaintext = {
+		.data = {
+			0x40, 0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB,
+			0x42, 0x86, 0xB2, 0x99, 0x78, 0x3D, 0xAF, 0x44,
+			0x2C, 0x09, 0x9F, 0x7A, 0xB0, 0xF5, 0x8D, 0x5C,
+			0x8E, 0x46, 0xB1, 0x04, 0xF0, 0x8F, 0x01, 0xB4,
+			0x1A, 0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D,
+			0x36, 0xBD, 0x1A, 0x3D, 0x90, 0xDC, 0x3A, 0x41,
+			0xB4, 0x6D, 0x51, 0x67, 0x2A, 0xC4, 0xC9, 0x66,
+			0x3A, 0x2B, 0xE0, 0x63, 0xDA, 0x4B, 0xC8, 0xD2,
+			0x80, 0x8C, 0xE3, 0x3E, 0x2C, 0xCC, 0xBF, 0xC6,
+			0x34, 0xE1, 0xB2, 0x59, 0x06, 0x08, 0x76, 0xA0,
+			0xFB, 0xB5, 0xA4, 0x37, 0xEB, 0xCC, 0x8D, 0x31,
+			0xC1, 0x9E, 0x44, 0x54, 0x31, 0x87, 0x45, 0xE3,
+			0x98, 0x76, 0x45, 0x98, 0x7A, 0x98, 0x6F, 0x2C,
+			0xB0
+		},
+		.len = 105
+	},
+	.ciphertext = {
+		.data = {
+			0x58, 0x92, 0xBB, 0xA8, 0x8B, 0xBB, 0xCA, 0xAE,
+			0xAE, 0x76, 0x9A, 0xA0, 0x6B, 0x68, 0x3D, 0x3A,
+			0x17, 0xCC, 0x04, 0xA3, 0x69, 0x88, 0x16, 0x97,
+			0x43, 0x5E, 0x44, 0xFE, 0xD5, 0xFF, 0x9A, 0xF5,
+			0x7B, 0x9E, 0x89, 0x0D, 0x4D, 0x5C, 0x64, 0x70,
+			0x98, 0x85, 0xD4, 0x8A, 0xE4, 0x06, 0x90, 0xEC,
+			0x04, 0x3B, 0xAA, 0xE9, 0x70, 0x57, 0x96, 0xE4,
+			0xA9, 0xFF, 0x5A, 0x4B, 0x8D, 0x8B, 0x36, 0xD7,
+			0xF3, 0xFE, 0x57, 0xCC, 0x6C, 0xFD, 0x6C, 0xD0,
+			0x05, 0xCD, 0x38, 0x52, 0xA8, 0x5E, 0x94, 0xCE,
+			0x6B, 0xCD, 0x90, 0xD0, 0xD0, 0x78, 0x39, 0xCE,
+			0x09, 0x73, 0x35, 0x44, 0xCA, 0x8E, 0x35, 0x08,
+			0x43, 0x24, 0x85, 0x50, 0x92, 0x2A, 0xC1, 0x28,
+			0x18
+		},
+		.len = 105
+	},
+	.validDataLenInBits = {
+		.len = 837
+	}
+};
+
+#endif /* TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_ */
-- 
2.1.0


* [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices
  2016-01-28 17:46 [dpdk-dev] [PATCH 0/3] Snow3G UEA2 support for Intel Quick Assist Devices Deepak Kumar JAIN
                   ` (2 preceding siblings ...)
  2016-01-28 17:46 ` [dpdk-dev] [PATCH 3/3] app/test: add Snow3G UEA2 tests Deepak Kumar JAIN
@ 2016-02-23 14:02 ` Deepak Kumar JAIN
  2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
                     ` (3 more replies)
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
  4 siblings, 4 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-02-23 14:02 UTC (permalink / raw)
  To: dev

 This patchset contains fixes and refactoring for the Snow3G (UEA2 and
 UIA2) wireless algorithms for Intel Quick Assist devices.

 The QAT PMD previously supported only cipher/hash algorithm chaining for AES/SHA.
 The code has been refactored to also support cipher-only and hash-only
 (Snow3G only) functionality along with algorithm chaining.
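
 As a hedged illustration of the hash-only case this series enables (the v1
 qat_get_cmd_id() hunk earlier in the thread still returns -1 for it, so this
 reflects the intended v2 behaviour rather than code quoted here):

	/* An auth xform with nothing chained after it is an auth-only
	 * (Snow3G UIA2 hash-only) request. */
	if (xform->type == RTE_CRYPTO_XFORM_AUTH && xform->next == NULL)
		return ICP_QAT_FW_LA_CMD_AUTH;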

 Changes from v1: 

 1) Hash-only fix and algorithm-chaining fix

 2) Added hash test vectors for Snow3G UIA2 functionality.

 This patchset depends on the cryptodev API changes:
 http://dpdk.org/ml/archives/dev/2016-February/033551.html

Deepak Kumar JAIN (3):
  crypto: add cipher/auth only support
  qat: add support for Snow3G
  app/test: add Snow3G tests

 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 doc/guides/cryptodevs/qat.rst                      |    8 +-
 doc/guides/rel_notes/release_16_04.rst             |    4 +
 drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  280 +++++-
 drivers/crypto/qat/qat_crypto.c                    |  147 ++-
 drivers/crypto/qat/qat_crypto.h                    |   10 +
 10 files changed, 2228 insertions(+), 74 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

-- 
2.1.0


* [dpdk-dev] [PATCH v2 1/3] crypto: add cipher/auth only support
  2016-02-23 14:02 ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Deepak Kumar JAIN
@ 2016-02-23 14:02   ` Deepak Kumar JAIN
  2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 2/3] qat: add support for Snow3G Deepak Kumar JAIN
                     ` (2 subsequent siblings)
  3 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-02-23 14:02 UTC (permalink / raw)
  To: dev

Refactored the existing functionality into a
modular form to support the cipher-only and
auth-only functionality.

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 drivers/crypto/qat/qat_adf/qat_algs.h            |  18 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 210 ++++++++++++++++++++---
 drivers/crypto/qat/qat_crypto.c                  | 137 +++++++++++----
 drivers/crypto/qat/qat_crypto.h                  |  10 ++
 4 files changed, 308 insertions(+), 67 deletions(-)

diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index 76c08c0..b73a5d0 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -104,11 +104,15 @@ struct qat_alg_ablkcipher_cd {
 
 int qat_get_inter_state_size(enum icp_qat_hw_auth_algo qat_hash_alg);
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cd,
-					uint8_t *enckey, uint32_t enckeylen,
-					uint8_t *authkey, uint32_t authkeylen,
-					uint32_t add_auth_data_length,
-					uint32_t digestsize);
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cd,
+						uint8_t *enckey,
+						uint32_t enckeylen);
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize);
 
 void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header);
 
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index ceaffb7..bef444b 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -359,15 +359,139 @@ void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header)
 					   ICP_QAT_FW_LA_NO_UPDATE_STATE);
 }
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
-			uint8_t *cipherkey, uint32_t cipherkeylen,
-			uint8_t *authkey, uint32_t authkeylen,
-			uint32_t add_auth_data_length,
-			uint32_t digestsize)
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
+						uint8_t *cipherkey,
+						uint32_t cipherkeylen)
 {
-	struct qat_alg_cd *content_desc = &cdesc->cd;
-	struct icp_qat_hw_cipher_algo_blk *cipher = &content_desc->cipher;
-	struct icp_qat_hw_auth_algo_blk *hash = &content_desc->hash;
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
+	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
+	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
+	void *ptr = &req_tmpl->cd_ctrl;
+	struct icp_qat_fw_cipher_cd_ctrl_hdr *cipher_cd_ctrl = ptr;
+	struct icp_qat_fw_auth_cd_ctrl_hdr *hash_cd_ctrl = ptr;
+	enum icp_qat_hw_cipher_convert key_convert;
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
+	uint16_t cipher_offset = 0;
+
+	PMD_INIT_FUNC_TRACE();
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		cipher =
+		    (struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk));
+		cipher_offset = sizeof(struct icp_qat_hw_auth_algo_blk);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		cipher_offset = 0;
+	}
+	/* CD setup */
+	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	} else {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_CMP_AUTH_RES);
+	}
+
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	/* For Snow3G, set key convert and other bits */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	}
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
+	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
+
+	proto = ICP_QAT_FW_LA_PROTO_GET(header->serv_specif_flags);
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2)
+		proto = ICP_QAT_FW_LA_SNOW_3G_PROTO;
+
+	/* Request template setup */
+	qat_alg_init_common_hdr(header);
+	header->service_cmd_id = cdesc->qat_cmd;
+
+	ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
+	/* Configure the common header protocol flags */
+	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
+	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
+
+	/* Cipher CD config setup */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		cipher_cd_ctrl->cipher_key_sz =
+			(ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ +
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ) >> 3;
+		cipher_cd_ctrl->cipher_state_sz =
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	} else {
+		cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
+		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	}
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else {
+		PMD_DRV_LOG(ERR, "invalid param, only authenticated "
+			    "encryption supported");
+		return -EFAULT;
+	}
+	return 0;
+}
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize)
+{
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_hw_auth_algo_blk *hash;
 	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
 	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
 	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
@@ -379,31 +503,57 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 		((char *)&req_tmpl->serv_specif_rqpars +
 		sizeof(struct icp_qat_fw_la_cipher_req_params));
 	enum icp_qat_hw_cipher_convert key_convert;
-	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO; /* no CCM/GCM/Snow3G */
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
 	uint16_t state1_size = 0;
 	uint16_t state2_size = 0;
+	uint16_t cipher_offset = 0, hash_offset = 0;
 
 	PMD_INIT_FUNC_TRACE();
 
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		hash = (struct icp_qat_hw_auth_algo_blk *)&cdesc->cd;
+		cipher =
+		(struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk));
+		hash_offset = 0;
+		cipher_offset = ((char *)hash - (char *)cipher);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		hash = (struct icp_qat_hw_auth_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_cipher_algo_blk));
+		cipher_offset = 0;
+		hash_offset = ((char *)hash - (char *)cipher);
+	}
+
 	/* CD setup */
 	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
-		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
 	} else {
-		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				   ICP_QAT_FW_LA_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_CMP_AUTH_RES);
 	}
 
-	cipher->aes.cipher_config.val = ICP_QAT_HW_CIPHER_CONFIG_BUILD(
-			cdesc->qat_mode, cdesc->qat_cipher_alg, key_convert,
-			cdesc->qat_dir);
-	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
+	memcpy(cipher->aes.key, authkey, authkeylen);
 
 	hash->sha.inner_setup.auth_config.reserved = 0;
 	hash->sha.inner_setup.auth_config.config =
@@ -423,7 +573,7 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 	} else if ((cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_128) ||
 		(cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_64)) {
 		if (qat_alg_do_precomputes(cdesc->qat_hash_alg,
-			cipherkey, cipherkeylen, (uint8_t *)(hash->sha.state1 +
+			authkey, authkeylen, (uint8_t *)(hash->sha.state1 +
 			ICP_QAT_HW_GALOIS_128_STATE1_SZ), &state2_size)) {
 			PMD_DRV_LOG(ERR, "(GCM)precompute failed");
 			return -EFAULT;
@@ -454,15 +604,15 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 	/* Configure the common header protocol flags */
 	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
 	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
-	cd_pars->u.s.content_desc_params_sz = sizeof(struct qat_alg_cd) >> 3;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
 
 	/* Cipher CD config setup */
-	cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
+	cipher_cd_ctrl->cipher_key_sz = authkeylen >> 3;
 	cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
-	cipher_cd_ctrl->cipher_cfg_offset = 0;
+	cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
 
 	/* Auth CD config setup */
-	hash_cd_ctrl->hash_cfg_offset = ((char *)hash - (char *)cipher) >> 3;
+	hash_cd_ctrl->hash_cfg_offset = hash_offset >> 3;
 	hash_cd_ctrl->hash_flags = ICP_QAT_FW_AUTH_HDR_FLAG_NO_NESTED;
 	hash_cd_ctrl->inner_res_sz = digestsize;
 	hash_cd_ctrl->final_sz = digestsize;
@@ -505,8 +655,12 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 					>> 3);
 	auth_param->auth_res_sz = digestsize;
 
-
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_AUTH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
 		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
 				ICP_QAT_FW_SLICE_CIPHER);
 		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index 69162b1..9fe48cb 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -90,16 +90,16 @@ void qat_crypto_sym_clear_session(struct rte_cryptodev *dev,
 static int
 qat_get_cmd_id(const struct rte_crypto_sym_xform *xform)
 {
-	if (xform->next == NULL)
-		return -1;
-
 	/* Cipher Only */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_CIPHER; */
+		return ICP_QAT_FW_LA_CMD_CIPHER;
 
 	/* Authentication Only */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_AUTH; */
+		return ICP_QAT_FW_LA_CMD_AUTH;
+
+	if (xform->next == NULL)
+		return -1;
 
 	/* Cipher then Authenticate */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER &&
@@ -139,31 +139,16 @@ qat_get_cipher_xform(struct rte_crypto_sym_xform *xform)
 
 	return NULL;
 }
-
-
 void *
-qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
 		struct rte_crypto_sym_xform *xform, void *session_private)
 {
 	struct qat_pmd_private *internals = dev->data->dev_private;
 
 	struct qat_session *session = session_private;
 
-	struct rte_crypto_auth_xform *auth_xform = NULL;
 	struct rte_crypto_cipher_xform *cipher_xform = NULL;
 
-	int qat_cmd_id;
-
-	PMD_INIT_FUNC_TRACE();
-
-	/* Get requested QAT command id */
-	qat_cmd_id = qat_get_cmd_id(xform);
-	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
-		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
-		goto error_out;
-	}
-	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
-
 	/* Get cipher xform from crypto xform chain */
 	cipher_xform = qat_get_cipher_xform(xform);
 
@@ -205,8 +190,87 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 	else
 		session->qat_dir = ICP_QAT_HW_CIPHER_DECRYPT;
 
+	if (qat_alg_aead_session_create_content_desc_cipher(session,
+						cipher_xform->key.data,
+						cipher_xform->key.length))
+		goto error_out;
+
+	return session;
+
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
+}
+
+
+void *
+qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
+		struct rte_crypto_sym_xform *xform, void *session_private)
+{
+	struct qat_pmd_private *internals = dev->data->dev_private;
+
+	struct qat_session *session = session_private;
+
+	int qat_cmd_id;
+
+	PMD_INIT_FUNC_TRACE();
+
+	/* Get requested QAT command id */
+	qat_cmd_id = qat_get_cmd_id(xform);
+	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
+		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
+		goto error_out;
+	}
+	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
+	switch (session->qat_cmd) {
+	case ICP_QAT_FW_LA_CMD_CIPHER:
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_AUTH:
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_CIPHER_HASH:
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_HASH_CIPHER:
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_TRNG_GET_RANDOM:
+	case ICP_QAT_FW_LA_CMD_TRNG_TEST:
+	case ICP_QAT_FW_LA_CMD_SSL3_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_1_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_2_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_MGF1:
+	case ICP_QAT_FW_LA_CMD_AUTH_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_CIPHER_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_DELIMITER:
+	PMD_DRV_LOG(ERR, "Unsupported Service %u",
+		session->qat_cmd);
+		goto error_out;
+	default:
+	PMD_DRV_LOG(ERR, "Unsupported Service %u",
+		session->qat_cmd);
+		goto error_out;
+	}
+	return session;
+
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
+}
+
+struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_sym_xform *xform,
+				struct qat_session *session_private)
+{
 
-	/* Get authentication xform from Crypto xform chain */
+	struct qat_pmd_private *internals = dev->data->dev_private;
+	struct qat_session *session = session_private;
+	struct rte_crypto_auth_xform *auth_xform = NULL;
+	struct rte_crypto_cipher_xform *cipher_xform = NULL;
 	auth_xform = qat_get_auth_xform(xform);
 
 	switch (auth_xform->algo) {
@@ -250,17 +314,26 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 				auth_xform->algo);
 		goto error_out;
 	}
+	cipher_xform = qat_get_cipher_xform(xform);
 
-	if (qat_alg_aead_session_create_content_desc(session,
-		cipher_xform->key.data,
-		cipher_xform->key.length,
-		auth_xform->key.data,
-		auth_xform->key.length,
-		auth_xform->add_auth_data_length,
-		auth_xform->digest_length))
-		goto error_out;
-
-	return (struct rte_crypto_sym_session *)session;
+	if ((session->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_128) ||
+			(session->qat_hash_alg ==
+				ICP_QAT_HW_AUTH_ALGO_GALOIS_64))  {
+		if (qat_alg_aead_session_create_content_desc_auth(session,
+				cipher_xform->key.data,
+				cipher_xform->key.length,
+				auth_xform->add_auth_data_length,
+				auth_xform->digest_length))
+			goto error_out;
+	} else {
+		if (qat_alg_aead_session_create_content_desc_auth(session,
+				auth_xform->key.data,
+				auth_xform->key.length,
+				auth_xform->add_auth_data_length,
+				auth_xform->digest_length))
+			goto error_out;
+	}
+	return session;
 
 error_out:
 	rte_mempool_put(internals->sess_mp, session);
diff --git a/drivers/crypto/qat/qat_crypto.h b/drivers/crypto/qat/qat_crypto.h
index 9323383..0afe74e 100644
--- a/drivers/crypto/qat/qat_crypto.h
+++ b/drivers/crypto/qat/qat_crypto.h
@@ -111,6 +111,16 @@ extern void *
 qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 		struct rte_crypto_sym_xform *xform, void *session_private);
 
+struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_sym_xform *xform,
+				struct qat_session *session_private);
+
+void *
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
+		struct rte_crypto_sym_xform *xform, void *session_private);
+
+
 extern void
 qat_crypto_sym_clear_session(struct rte_cryptodev *dev, void *session);
 
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread
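
For reference, the reworked qat_get_cmd_id() now maps single-element xform chains to QAT commands in addition to the existing chained cases. Below is a minimal sketch of the four chain shapes and the command each selects; it is illustrative application-side code, not part of the patch, and assumes the rte_crypto symmetric xform layout as used by the tests in patch 3/3.

/* Illustrative sketch: xform chain shapes and the QAT command id each
 * maps to in qat_get_cmd_id(). Field and enum names follow the
 * rte_crypto symmetric API exercised by the patch 3/3 tests.
 */
#include <rte_cryptodev.h>

/* Cipher only -> ICP_QAT_FW_LA_CMD_CIPHER (newly accepted) */
static struct rte_crypto_sym_xform cipher_only = {
	.type = RTE_CRYPTO_SYM_XFORM_CIPHER,
	.next = NULL,
};

/* Auth only -> ICP_QAT_FW_LA_CMD_AUTH (newly accepted) */
static struct rte_crypto_sym_xform auth_only = {
	.type = RTE_CRYPTO_SYM_XFORM_AUTH,
	.next = NULL,
};

/* Cipher then auth -> ICP_QAT_FW_LA_CMD_CIPHER_HASH (unchanged) */
static struct rte_crypto_sym_xform auth_stage = {
	.type = RTE_CRYPTO_SYM_XFORM_AUTH,
	.next = NULL,
};
static struct rte_crypto_sym_xform cipher_then_auth = {
	.type = RTE_CRYPTO_SYM_XFORM_CIPHER,
	.next = &auth_stage,
};
/* Auth then cipher -> ICP_QAT_FW_LA_CMD_HASH_CIPHER is the mirror case. */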

* [dpdk-dev] [PATCH v2 2/3] qat: add support for Snow3G
  2016-02-23 14:02 ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Deepak Kumar JAIN
  2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
@ 2016-02-23 14:02   ` Deepak Kumar JAIN
  2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
  2016-03-16  5:15   ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Cao, Min
  3 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-02-23 14:02 UTC (permalink / raw)
  To: dev

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 doc/guides/cryptodevs/qat.rst                    |  8 ++-
 doc/guides/rel_notes/release_16_04.rst           |  4 ++
 drivers/crypto/qat/qat_adf/qat_algs.h            |  1 +
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 86 +++++++++++++++++++++---
 drivers/crypto/qat/qat_crypto.c                  | 10 +++
 5 files changed, 97 insertions(+), 12 deletions(-)

diff --git a/doc/guides/cryptodevs/qat.rst b/doc/guides/cryptodevs/qat.rst
index 1901842..b5a48ec 100644
--- a/doc/guides/cryptodevs/qat.rst
+++ b/doc/guides/cryptodevs/qat.rst
@@ -1,5 +1,5 @@
 ..  BSD LICENSE
-    Copyright(c) 2015 Intel Corporation. All rights reserved.
+    Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
 
     Redistribution and use in source and binary forms, with or without
     modification, are permitted provided that the following conditions
@@ -47,6 +47,7 @@ Cipher algorithms:
 * ``RTE_CRYPTO_SYM_CIPHER_AES128_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES192_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES256_CBC``
+* ``RTE_CRYPTO_SYM_CIPHER_SNOW3G_UEA2``
 
 Hash algorithms:
 
@@ -54,14 +55,15 @@ Hash algorithms:
 * ``RTE_CRYPTO_AUTH_SHA256_HMAC``
 * ``RTE_CRYPTO_AUTH_SHA512_HMAC``
 * ``RTE_CRYPTO_AUTH_AES_XCBC_MAC``
+* ``RTE_CRYPTO_AUTH_SNOW3G_UIA2``
 
 
 Limitations
 -----------
 
 * Chained mbufs are not supported.
-* Hash only is not supported.
-* Cipher only is not supported.
+* Hash only is not supported except for Snow3G UIA2.
+* Cipher only is not supported except for Snow3G UEA2.
 * Only in-place is currently supported (destination address is the same as source address).
 * Only supports the session-oriented API implementation (session-less APIs are not supported).
 * Not performance tuned.
diff --git a/doc/guides/rel_notes/release_16_04.rst b/doc/guides/rel_notes/release_16_04.rst
index 123a6fd..ee59bcf 100644
--- a/doc/guides/rel_notes/release_16_04.rst
+++ b/doc/guides/rel_notes/release_16_04.rst
@@ -39,6 +39,10 @@ This section should contain new features added in this release. Sample format:
 
   Enabled virtio 1.0 support for virtio pmd driver.
 
+* **Added support for the Snow3g UEA2 cipher operation on Intel Quick Assist devices.**
+
+   Enabled support for the Snow3g wireless algorithm on Intel Quick Assist devices.
+   Cipher-only and hash-only operations are supported in addition to algorithm chaining.
 
 Resolved Issues
 ---------------
diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index b73a5d0..b47dbc2 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -125,5 +125,6 @@ void qat_alg_ablkcipher_init_dec(struct qat_alg_ablkcipher_cd *cd,
 					unsigned int keylen);
 
 int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
 
 #endif
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index bef444b..dd27476 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -376,7 +376,8 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 
 	PMD_INIT_FUNC_TRACE();
 
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER &&
+		cdesc->qat_hash_alg != ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
 		cipher =
 		    (struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
 				sizeof(struct icp_qat_hw_auth_algo_blk));
@@ -409,13 +410,20 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 	else
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
 	/* For Snow3G, set key convert and other bits */
 	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
 					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
-		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+		if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER)  {
+			ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_RET_AUTH_RES);
+			ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+		}
 	}
 
 	cipher->aes.cipher_config.val =
@@ -431,7 +439,6 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 	/* Request template setup */
 	qat_alg_init_common_hdr(header);
 	header->service_cmd_id = cdesc->qat_cmd;
-
 	ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
 					ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
 	/* Configure the common header protocol flags */
@@ -447,6 +454,10 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 		cipher_cd_ctrl->cipher_state_sz =
 			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ >> 3;
 		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+		if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER)  {
+		ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_DIGEST_IN_BUFFER);
+		}
 	} else {
 		cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
 		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
@@ -492,6 +503,7 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 {
 	struct icp_qat_hw_cipher_algo_blk *cipher;
 	struct icp_qat_hw_auth_algo_blk *hash;
+	struct icp_qat_hw_cipher_algo_blk *cipherconfig;
 	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
 	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
 	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
@@ -510,7 +522,8 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 
 	PMD_INIT_FUNC_TRACE();
 
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER &&
+		cdesc->qat_hash_alg != ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
 		hash = (struct icp_qat_hw_auth_algo_blk *)&cdesc->cd;
 		cipher =
 		(struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
@@ -549,11 +562,13 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 	else
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
 	cipher->aes.cipher_config.val =
 	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
 					cdesc->qat_cipher_alg, key_convert,
 					cdesc->qat_dir);
-	memcpy(cipher->aes.key, authkey, authkeylen);
 
 	hash->sha.inner_setup.auth_config.reserved = 0;
 	hash->sha.inner_setup.auth_config.config =
@@ -561,6 +576,22 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 				cdesc->qat_hash_alg, digestsize);
 	hash->sha.inner_setup.auth_counter.counter =
 		rte_bswap32(qat_hash_get_block_size(cdesc->qat_hash_alg));
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)  {
+		hash->sha.inner_setup.auth_counter.counter = 0;
+		hash->sha.outer_setup.auth_config.reserved = 0;
+		cipherconfig = (struct icp_qat_hw_cipher_algo_blk *)
+				((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk)
+				+ 16);
+		cipherconfig->aes.cipher_config.val =
+		ICP_QAT_HW_CIPHER_CONFIG_BUILD(ICP_QAT_HW_CIPHER_ECB_MODE,
+			ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2,
+			ICP_QAT_HW_CIPHER_KEY_CONVERT,
+			ICP_QAT_HW_CIPHER_ENCRYPT);
+		memcpy(cipherconfig->aes.key, authkey, authkeylen);
+		memset(cipherconfig->aes.key + authkeylen, 0,
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ);
+	}
 
 	/* Do precomputes */
 	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_AES_XCBC_MAC) {
@@ -587,6 +618,9 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 					ICP_QAT_HW_GALOIS_H_SZ]) =
 			rte_bswap32(add_auth_data_length);
 		proto = ICP_QAT_FW_LA_GCM_PROTO;
+	} else if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)  {
+		proto = ICP_QAT_FW_LA_SNOW_3G_PROTO;
+		state1_size = qat_hash_get_state1_size(cdesc->qat_hash_alg);
 	} else {
 		if (qat_alg_do_precomputes(cdesc->qat_hash_alg,
 			authkey, authkeylen, (uint8_t *)(hash->sha.state1),
@@ -606,10 +640,25 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
 	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
 
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_AUTH)  {
+		ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
+		ICP_QAT_FW_LA_CIPH_IV_FLD_FLAG_SET(header->serv_specif_flags,
+			ICP_QAT_FW_CIPH_IV_64BIT_PTR);
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	}
+
 	/* Cipher CD config setup */
-	cipher_cd_ctrl->cipher_key_sz = authkeylen >> 3;
-	cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
-	cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	if (cdesc->qat_cmd != ICP_QAT_FW_LA_CMD_AUTH) {
+		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset>>3;
+	} else {
+		cipher_cd_ctrl->cipher_state_sz = 0;
+		cipher_cd_ctrl->cipher_cfg_offset = 0;
+	}
 
 	/* Auth CD config setup */
 	hash_cd_ctrl->hash_cfg_offset = hash_offset >> 3;
@@ -644,6 +693,13 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 		hash_cd_ctrl->inner_state1_sz = ICP_QAT_HW_GALOIS_128_STATE1_SZ;
 		memset(hash->sha.state1, 0, ICP_QAT_HW_GALOIS_128_STATE1_SZ);
 		break;
+	case ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2:
+		hash_cd_ctrl->inner_state2_sz =
+				ICP_QAT_HW_SNOW_3G_UIA2_STATE2_SZ;
+		hash_cd_ctrl->inner_state1_sz =
+				ICP_QAT_HW_SNOW_3G_UIA2_STATE1_SZ;
+		memset(hash->sha.state1, 0, ICP_QAT_HW_SNOW_3G_UIA2_STATE1_SZ);
+		break;
 	default:
 		PMD_DRV_LOG(ERR, "invalid HASH alg %u", cdesc->qat_hash_alg);
 		return -EFAULT;
@@ -753,3 +809,15 @@ int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
 	}
 	return 0;
 }
+
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
+{
+	switch (key_len) {
+	case ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ:
+		*alg = ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2;
+		break;
+	default:
+		return -EINVAL;
+	}
+	return 0;
+}
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index 9fe48cb..a2c2daf 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -169,6 +169,14 @@ qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
 		}
 		session->qat_mode = ICP_QAT_HW_CIPHER_CTR_MODE;
 		break;
+	case RTE_CRYPTO_CIPHER_SNOW3G_UEA2:
+		if (qat_alg_validate_snow3g_key(cipher_xform->key.length,
+					&session->qat_cipher_alg) != 0) {
+			PMD_DRV_LOG(ERR, "Invalid SNOW3G cipher key size");
+			goto error_out;
+		}
+		session->qat_mode = ICP_QAT_HW_CIPHER_ECB_MODE;
+		break;
 	case RTE_CRYPTO_CIPHER_NULL:
 	case RTE_CRYPTO_CIPHER_3DES_ECB:
 	case RTE_CRYPTO_CIPHER_3DES_CBC:
@@ -303,6 +311,8 @@ qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
 	case RTE_CRYPTO_AUTH_AES_CCM:
 	case RTE_CRYPTO_AUTH_KASUMI_F9:
 	case RTE_CRYPTO_AUTH_SNOW3G_UIA2:
+		session->qat_hash_alg = ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2;
+		break;
 	case RTE_CRYPTO_AUTH_AES_CMAC:
 	case RTE_CRYPTO_AUTH_AES_CBC_MAC:
 	case RTE_CRYPTO_AUTH_ZUC_EIA3:
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread
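
For reference, a minimal application-side sketch of creating a Snow3G UEA2 cipher-only session once this patch is in place, mirroring create_snow3g_cipher_session() in patch 3/3. The helper name, device id handling and return type are assumptions against the 16.04-era cryptodev session API; the snippet is illustrative, not part of the patch.

/* Illustrative sketch: cipher-only SNOW 3G UEA2 session creation.
 * The two-argument rte_cryptodev_sym_session_create(dev_id, xform) form
 * matches its use in the patch 3/3 tests; any other key length than
 * ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ is rejected by qat_alg_validate_snow3g_key().
 */
#include <string.h>
#include <rte_cryptodev.h>

static struct rte_cryptodev_sym_session *
snow3g_uea2_cipher_session(uint8_t dev_id, const uint8_t *key, uint8_t key_len)
{
	struct rte_crypto_sym_xform cipher_xform;

	memset(&cipher_xform, 0, sizeof(cipher_xform));
	cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
	cipher_xform.next = NULL;	/* cipher only, no auth stage */
	cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
	cipher_xform.cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
	cipher_xform.cipher.key.data = (uint8_t *)key;
	cipher_xform.cipher.key.length = key_len;

	/* The PMD copies the key into the content descriptor during
	 * session creation, so the caller's key buffer need not persist.
	 */
	return rte_cryptodev_sym_session_create(dev_id, &cipher_xform);
}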

* [dpdk-dev] [PATCH v2 3/3] app/test: add Snow3G tests
  2016-02-23 14:02 ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Deepak Kumar JAIN
  2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
  2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 2/3] qat: add support for Snow3G Deepak Kumar JAIN
@ 2016-02-23 14:02   ` Deepak Kumar JAIN
  2016-03-16  5:15   ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Cao, Min
  3 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-02-23 14:02 UTC (permalink / raw)
  To: dev

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 4 files changed, 1831 insertions(+), 3 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c
index 29e4b29..1983184 100644
--- a/app/test/test_cryptodev.c
+++ b/app/test/test_cryptodev.c
@@ -42,7 +42,8 @@
 
 #include "test.h"
 #include "test_cryptodev.h"
-
+#include "test_cryptodev_snow3g_test_vectors.h"
+#include "test_cryptodev_snow3g_hash_test_vectors.h"
 static enum rte_cryptodev_type gbl_cryptodev_type;
 
 struct crypto_testsuite_params {
@@ -68,6 +69,9 @@ struct crypto_unittest_params {
 	uint8_t *digest;
 };
 
+#define ALIGN_POW2_ROUNDUP(num, align) \
+	(((num) + (align) - 1) & ~((align) - 1))
+
 /*
  * Forward declarations.
  */
@@ -1748,6 +1752,997 @@ test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify(void)
 	return TEST_SUCCESS;
 }
 
+/* ***** Snow3G Tests ***** */
+static int
+create_snow3g_hash_session(uint8_t dev_id,
+	const uint8_t *key, const uint8_t key_len,
+	const uint8_t aad_len, const uint8_t auth_len,
+	enum rte_crypto_auth_operation op)
+{
+	uint8_t hash_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(hash_key, key, key_len);
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.next = NULL;
+
+	ut_params->auth_xform.auth.op = op;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	ut_params->auth_xform.auth.key.data = hash_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->auth_xform);
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+static int
+create_snow3g_cipher_session(uint8_t dev_id,
+			enum rte_crypto_cipher_operation op,
+			const uint8_t *key, const uint8_t key_len)
+{
+	uint8_t cipher_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(cipher_key, key, key_len);
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = NULL;
+
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = op;
+	ut_params->cipher_xform.cipher.key.data = cipher_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session */
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+						&ut_params->
+						cipher_xform);
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+
+static int
+create_snow3g_cipher_operation(const uint8_t *iv, const unsigned iv_len,
+			const unsigned data_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+	unsigned iv_pad_len = 0;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+				RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+				"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(ut_params->ibuf
+			, iv_pad_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = iv_pad_len;
+	return 0;
+}
+
+
+static int
+create_snow3g_cipher_auth_session(uint8_t dev_id,
+		enum rte_crypto_cipher_operation cipher_op,
+		enum rte_crypto_auth_operation auth_op,
+		const uint8_t *key, const uint8_t key_len,
+		const uint8_t aad_len, const uint8_t auth_len)
+{
+	uint8_t cipher_auth_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(cipher_auth_key, key, key_len);
+
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.next = NULL;
+
+	ut_params->auth_xform.auth.op = auth_op;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	/* Hash key = cipher key */
+	ut_params->auth_xform.auth.key.data = cipher_auth_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = &ut_params->auth_xform;
+
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = cipher_op;
+	ut_params->cipher_xform.cipher.key.data = cipher_auth_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session*/
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->cipher_xform);
+
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+
+static int
+create_snow3g_auth_cipher_session(uint8_t dev_id,
+		enum rte_crypto_cipher_operation cipher_op,
+		enum rte_crypto_auth_operation auth_op,
+		const uint8_t *key, const uint8_t key_len,
+		const uint8_t aad_len, const uint8_t auth_len)
+	{
+	uint8_t auth_cipher_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(auth_cipher_key, key, key_len);
+
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.auth.op = auth_op;
+	ut_params->auth_xform.next = &ut_params->cipher_xform;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	ut_params->auth_xform.auth.key.data = auth_cipher_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = NULL;
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = cipher_op;
+	ut_params->cipher_xform.cipher.key.data = auth_cipher_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session*/
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->auth_xform);
+
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+
+	return 0;
+}
+
+static int
+create_snow3g_hash_operation(const uint8_t *auth_tag,
+		const unsigned auth_tag_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len,
+		enum rte_crypto_auth_operation op)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned aad_buffer_len;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+		"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* aad */
+	/*
+	* Always allocate the aad up to the block size.
+	* The cryptodev API calls out -
+	*  - the array must be big enough to hold the AAD, plus any
+	*   space to round this up to the nearest multiple of the
+	*   block size (16 bytes).
+	*/
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+	sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend(
+			ut_params->ibuf, aad_buffer_len);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+					"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+			ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+					ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+				"no room to append auth tag");
+	ut_params->digest = sym_op->auth.digest.data;
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len + aad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+	if (op == RTE_CRYPTO_AUTH_OP_GENERATE)
+		memset(sym_op->auth.digest.data, 0, auth_tag_len);
+	else
+		rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "digest:",
+		sym_op->auth.digest.data,
+		sym_op->auth.digest.length);
+#endif
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len;
+
+	return 0;
+}
+
+static int
+create_snow3g_cipher_hash_operation(const uint8_t *auth_tag,
+		const unsigned auth_tag_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len,
+		enum rte_crypto_auth_operation op,
+		const uint8_t *iv, const unsigned iv_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned iv_pad_len = 0;
+	unsigned aad_buffer_len;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+			"Failed to allocate pktmbuf offload");
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(
+		ut_params->ibuf, iv_pad_len);
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = iv_pad_len;
+
+	/* aad */
+	/*
+	* Always allocate the aad up to the block size.
+	* The cryptodev API calls out -
+	*  - the array must be big enough to hold the AAD, plus any
+	*   space to round this up to the nearest multiple of the
+	*   block size (16 bytes).
+	*/
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+
+	sym_op->auth.aad.data =
+			(uint8_t *)rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+			"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+			ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+			ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+			"no room to append auth tag");
+	ut_params->digest = sym_op->auth.digest.data;
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len + aad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+	if (op == RTE_CRYPTO_AUTH_OP_GENERATE)
+		memset(sym_op->auth.digest.data, 0, auth_tag_len);
+	else
+		rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len);
+
+	#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "digest:",
+		sym_op->auth.digest.data,
+		sym_op->auth.digest.length);
+	#endif
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len;
+
+	return 0;
+}
+
+static int
+create_snow3g_auth_cipher_operation(const unsigned auth_tag_len,
+		const uint8_t *iv, const unsigned iv_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned iv_pad_len = 0;
+	unsigned aad_buffer_len = 0;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+			"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+			ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+			"no room to append auth tag");
+
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+
+	memset(sym_op->auth.digest.data, 0, auth_tag_len);
+
+	#ifdef RTE_APP_TEST_DEBUG
+		rte_hexdump(stdout, "digest:",
+			sym_op->auth.digest.data,
+			sym_op->auth.digest.length);
+	#endif
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(
+		ut_params->ibuf, iv_pad_len);
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+
+	/* aad */
+	/*
+	* Always allocate the aad up to the block size.
+	* The cryptodev API calls out -
+	*  - the array must be big enough to hold the AAD, plus any
+	*   space to round this up to the nearest multiple of the
+	*   block size (16 bytes).
+	*/
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+
+	sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend(
+	ut_params->ibuf, aad_buffer_len);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+				"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+				ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len;
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len + iv_pad_len;
+
+	return 0;
+}
+
+static int
+test_snow3g_authentication(const struct snow3g_hash_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+	unsigned plaintext_pad_len;
+	uint8_t *plaintext;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_hash_session(ts_params->valid_devs[0],
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len,
+			RTE_CRYPTO_AUTH_OP_GENERATE);
+	if (retval < 0)
+		return retval;
+
+	/* alloc mbuf and set payload */
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple of */
+	/* the algorithms block size */
+	plaintext_pad_len = tdata->plaintext.len;
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+				plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_hash_operation(NULL, tdata->digest.len,
+			tdata->aad.data, tdata->aad.len, tdata->plaintext.len,
+			plaintext_pad_len, RTE_CRYPTO_AUTH_OP_GENERATE);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+				ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+			+ plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+	ut_params->digest,
+	tdata->digest.data,
+	DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+	"Snow3G Generated auth tag not as expected");
+
+	return 0;
+}
+
+static int
+test_snow3g_authentication_verify(const struct snow3g_hash_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+	unsigned plaintext_pad_len;
+	uint8_t *plaintext;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_hash_session(ts_params->valid_devs[0],
+				tdata->key.data, tdata->key.len,
+				tdata->aad.len, tdata->digest.len,
+				RTE_CRYPTO_AUTH_OP_VERIFY);
+	if (retval < 0)
+		return retval;
+	/* alloc mbuf and set payload */
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple */
+	/* of the algorithms block size */
+	plaintext_pad_len = tdata->plaintext.len;
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+					plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_hash_operation(tdata->digest.data,
+			tdata->digest.len,
+			tdata->aad.data, tdata->aad.len,
+			tdata->plaintext.len, plaintext_pad_len,
+			RTE_CRYPTO_AUTH_OP_VERIFY);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+				ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	if (ut_params->op->status == RTE_CRYPTO_OP_STATUS_SUCCESS)
+		return 0;
+	else
+		return -1;
+
+	return 0;
+}
+
+
+static int
+test_snow3g_hash_generate_test_case_1(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_1);
+}
+
+static int
+test_snow3g_hash_generate_test_case_2(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_2);
+}
+
+static int
+test_snow3g_hash_generate_test_case_3(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_3);
+}
+
+static int
+test_snow3g_hash_verify_test_case_1(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_1);
+
+}
+
+static int
+test_snow3g_hash_verify_test_case_2(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_2);
+}
+
+static int
+test_snow3g_hash_verify_test_case_3(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_3);
+}
+
+static int
+test_snow3g_encryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithms block size
+	 */
+	plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16);
+
+	plaintext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->plaintext.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		ciphertext = plaintext;
+
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ciphertext,
+		tdata->ciphertext.data,
+		tdata->ciphertext.len,
+		"Snow3G Ciphertext data not as expected");
+	return 0;
+}
+
+static int test_snow3g_decryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t ciphertext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_DECRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithms block size
+	 */
+	ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16);
+
+	ciphertext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						ciphertext_pad_len);
+	memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->ciphertext.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		plaintext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		plaintext = ciphertext;
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(plaintext,
+				tdata->plaintext.data,
+				tdata->plaintext.len,
+				"Snow3G Plaintext data not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_authenticated_encryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_auth_session(ts_params->valid_devs[0],
+			RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+			RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len);
+	if (retval < 0)
+		return retval;
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple */
+	/*  of the algorithms block size */
+	plaintext_pad_len = tdata->plaintext.len;
+
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+			plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_hash_operation(tdata->digest.data,
+			tdata->digest.len, tdata->aad.data,
+			tdata->aad.len, tdata->plaintext.len,
+			plaintext_pad_len, RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->iv.data, tdata->iv.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+			ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		ciphertext = plaintext;
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8-lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+			ciphertext,
+			tdata->ciphertext.data,
+			tdata->ciphertext.len,
+			"Snow3G Ciphertext data not as expected");
+
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+	    + plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+			ut_params->digest,
+			tdata->digest.data,
+			DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+			"Snow3G Generated auth tag not as expected");
+	return 0;
+}
+static int
+test_snow3g_encrypted_authentication(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_auth_cipher_session(ts_params->valid_devs[0],
+			RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+			RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple */
+	/* of the algorithms block size */
+	plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 8);
+
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+			plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_auth_cipher_operation(
+		tdata->digest.len,
+		tdata->iv.data, tdata->iv.len,
+		tdata->aad.data, tdata->aad.len,
+		tdata->plaintext.len, plaintext_pad_len
+	);
+
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+			ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->aad.len + tdata->iv.len;
+	else
+		ciphertext = plaintext;
+
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8-lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+			+ plaintext_pad_len + tdata->aad.len + tdata->iv.len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ciphertext,
+		tdata->ciphertext.data,
+		tdata->ciphertext.len,
+		"Snow3G Ciphertext data not as expected");
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ut_params->digest,
+		tdata->digest.data,
+		DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+		"Snow3G Generated auth tag not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_encryption_test_case_1(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_encryption_test_case_2(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_encryption_test_case_3(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_encryption_test_case_4(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_encryption_test_case_5(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_5);
+}
+
+static int
+test_snow3g_decryption_test_case_1(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_decryption_test_case_2(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_decryption_test_case_3(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_decryption_test_case_4(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_decryption_test_case_5(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_5);
+}
+
+static int
+test_snow3g_authenticated_encryption_test_case_1(void)
+{
+	return test_snow3g_authenticated_encryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_encrypted_authentication_test_case_1(void)
+{
+	return test_snow3g_encrypted_authentication(&snow3g_test_case_6);
+}
 
 /* ***** AES-GCM Tests ***** */
 
@@ -1983,9 +2978,47 @@ static struct unit_test_suite cryptodev_qat_testsuite  = {
 				test_AES_CBC_HMAC_AES_XCBC_encrypt_digest),
 		TEST_CASE_ST(ut_setup, ut_teardown,
 				test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify),
-
 		TEST_CASE_ST(ut_setup, ut_teardown, test_stats),
+		/** Snow3G encrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_5),
+
 
+		/** Snow3G decrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_5),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_authenticated_encryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encrypted_authentication_test_case_1),
 		TEST_CASES_END() /**< NULL terminate unit test array */
 	}
 };
diff --git a/app/test/test_cryptodev.h b/app/test/test_cryptodev.h
index c84ba42..b69649a 100644
--- a/app/test/test_cryptodev.h
+++ b/app/test/test_cryptodev.h
@@ -1,7 +1,7 @@
 /*-
  *   BSD LICENSE
  *
- *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *   Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
  *
  *   Redistribution and use in source and binary forms, with or without
  *   modification, are permitted provided that the following conditions
@@ -58,6 +58,7 @@
 #define DIGEST_BYTE_LENGTH_SHA384		(BYTE_LENGTH(384))
 #define DIGEST_BYTE_LENGTH_SHA512		(BYTE_LENGTH(512))
 #define DIGEST_BYTE_LENGTH_AES_XCBC		(BYTE_LENGTH(96))
+#define DIGEST_BYTE_LENGTH_SNOW3G_UIA2		(BYTE_LENGTH(32))
 #define AES_XCBC_MAC_KEY_SZ			(16)
 
 #define TRUNCATED_DIGEST_BYTE_LENGTH_SHA1		(12)
diff --git a/app/test/test_cryptodev_snow3g_hash_test_vectors.h b/app/test/test_cryptodev_snow3g_hash_test_vectors.h
new file mode 100644
index 0000000..f4fa36d
--- /dev/null
+++ b/app/test/test_cryptodev_snow3g_hash_test_vectors.h
@@ -0,0 +1,415 @@
+/*-
+ *   BSD LICENSE
+ *
+ *   Copyright(c) 2016 Intel Corporation. All rights reserved.
+ *
+ *   Redistribution and use in source and binary forms, with or without
+ *   modification, are permitted provided that the following conditions
+ *   are met:
+ *
+ *	 * Redistributions of source code must retain the above copyright
+ *	   notice, this list of conditions and the following disclaimer.
+ *	 * Redistributions in binary form must reproduce the above copyright
+ *	   notice, this list of conditions and the following disclaimer in
+ *	   the documentation and/or other materials provided with the
+ *	   distribution.
+ *	 * Neither the name of Intel Corporation nor the names of its
+ *	   contributors may be used to endorse or promote products derived
+ *	   from this software without specific prior written permission.
+ *
+ *   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ *   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ *   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ *   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ *   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ *   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ *   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ *   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ *   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ *   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ *   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#ifndef TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_
+#define TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_
+
+struct snow3g_hash_test_data {
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} key;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} aad;
+
+	struct {
+		uint8_t data[2056];
+		unsigned len;
+	} plaintext;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} digest;
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_1 = {
+	.key = {
+		.data = {
+			0xC7, 0x36, 0xC6, 0xAA, 0xB2, 0x2B, 0xFF, 0xF9,
+			0x1E, 0x26, 0x98, 0xD2, 0xE2, 0x2A, 0xD5, 0x7E
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xD0, 0xA7, 0xD4, 0x63, 0xDF, 0x9F, 0xB2, 0xB2,
+			0x78, 0x83, 0x3F, 0xA0, 0x2E, 0x23, 0x5A, 0xA1,
+			0x72, 0xBD, 0x97, 0x0C, 0x14, 0x73, 0xE1, 0x29,
+			0x07, 0xFB, 0x64, 0x8B, 0x65, 0x99, 0xAA, 0xA0,
+			0xB2, 0x4A, 0x03, 0x86, 0x65, 0x42, 0x2B, 0x20,
+			0xA4, 0x99, 0x27, 0x6A, 0x50, 0x42, 0x70, 0x09
+		},
+		.len = 48
+	},
+	.digest = {
+		.data = {0x38, 0xB5, 0x54, 0xC0 },
+		.len  = 4
+	}
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_2 = {
+	.key = {
+		.data = {
+			0xF4, 0xEB, 0xEC, 0x69, 0xE7, 0x3E, 0xAF, 0x2E,
+			0xB2, 0xCF, 0x6A, 0xF4, 0xB3, 0x12, 0x0F, 0xFD
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x29, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0x77, 0x37,
+			0xA9, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0xF7, 0x37
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x10, 0xBF, 0xFF, 0x83, 0x9E, 0x0C, 0x71, 0x65,
+			0x8D, 0xBB, 0x2D, 0x17, 0x07, 0xE1, 0x45, 0x72,
+			0x4F, 0x41, 0xC1, 0x6F, 0x48, 0xBF, 0x40, 0x3C,
+			0x3B, 0x18, 0xE3, 0x8F, 0xD5, 0xD1, 0x66, 0x3B,
+			0x6F, 0x6D, 0x90, 0x01, 0x93, 0xE3, 0xCE, 0xA8,
+			0xBB, 0x4F, 0x1B, 0x4F, 0x5B, 0xE8, 0x22, 0x03,
+			0x22, 0x32, 0xA7, 0x8D, 0x7D, 0x75, 0x23, 0x8D,
+			0x5E, 0x6D, 0xAE, 0xCD, 0x3B, 0x43, 0x22, 0xCF,
+			0x59, 0xBC, 0x7E, 0xA8, 0x4A, 0xB1, 0x88, 0x11,
+			0xB5, 0xBF, 0xB7, 0xBC, 0x55, 0x3F, 0x4F, 0xE4,
+			0x44, 0x78, 0xCE, 0x28, 0x7A, 0x14, 0x87, 0x99,
+			0x90, 0xD1, 0x8D, 0x12, 0xCA, 0x79, 0xD2, 0xC8,
+			0x55, 0x14, 0x90, 0x21, 0xCD, 0x5C, 0xE8, 0xCA,
+			0x03, 0x71, 0xCA, 0x04, 0xFC, 0xCE, 0x14, 0x3E,
+			0x3D, 0x7C, 0xFE, 0xE9, 0x45, 0x85, 0xB5, 0x88,
+			0x5C, 0xAC, 0x46, 0x06, 0x8B
+		},
+	.len = 125
+	},
+	.digest = {
+		.data = {0x06, 0x17, 0x45, 0xAE},
+		.len  = 4
+	}
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_3 = {
+	.key = {
+		.data = {
+			0xB3, 0x12, 0x0F, 0xFD, 0xB2, 0xCF, 0x6A, 0xF4,
+			0xE7, 0x3E, 0xAF, 0x2E, 0xF4, 0xEB, 0xEC, 0x69
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x29, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0x77, 0x37,
+			0xA9, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0xF7, 0x37
+		},
+	.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+			0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01,
+			0xE0, 0x95, 0x80, 0x45, 0xF3, 0xA0, 0xBB, 0xA4,
+			0xE3, 0x96, 0x83, 0x46, 0xF0, 0xA3, 0xB8, 0xA7,
+			0xC0, 0x2A, 0x01, 0x8A, 0xE6, 0x40, 0x76, 0x52,
+			0x26, 0xB9, 0x87, 0xC9, 0x13, 0xE6, 0xCB, 0xF0,
+			0x83, 0x57, 0x00, 0x16, 0xCF, 0x83, 0xEF, 0xBC,
+			0x61, 0xC0, 0x82, 0x51, 0x3E, 0x21, 0x56, 0x1A,
+			0x42, 0x7C, 0x00, 0x9D, 0x28, 0xC2, 0x98, 0xEF,
+			0xAC, 0xE7, 0x8E, 0xD6, 0xD5, 0x6C, 0x2D, 0x45,
+			0x05, 0xAD, 0x03, 0x2E, 0x9C, 0x04, 0xDC, 0x60,
+			0xE7, 0x3A, 0x81, 0x69, 0x6D, 0xA6, 0x65, 0xC6,
+			0xC4, 0x86, 0x03, 0xA5, 0x7B, 0x45, 0xAB, 0x33,
+			0x22, 0x15, 0x85, 0xE6, 0x8E, 0xE3, 0x16, 0x91,
+			0x87, 0xFB, 0x02, 0x39, 0x52, 0x86, 0x32, 0xDD,
+			0x65, 0x6C, 0x80, 0x7E, 0xA3, 0x24, 0x8B, 0x7B,
+			0x46, 0xD0, 0x02, 0xB2, 0xB5, 0xC7, 0x45, 0x8E,
+			0xB8, 0x5B, 0x9C, 0xE9, 0x58, 0x79, 0xE0, 0x34,
+			0x08, 0x59, 0x05, 0x5E, 0x3B, 0x0A, 0xBB, 0xC3,
+			0xEA, 0xCE, 0x87, 0x19, 0xCA, 0xA8, 0x02, 0x65,
+			0xC9, 0x72, 0x05, 0xD5, 0xDC, 0x4B, 0xCC, 0x90,
+			0x2F, 0xE1, 0x83, 0x96, 0x29, 0xED, 0x71, 0x32,
+			0x8A, 0x0F, 0x04, 0x49, 0xF5, 0x88, 0x55, 0x7E,
+			0x68, 0x98, 0x86, 0x0E, 0x04, 0x2A, 0xEC, 0xD8,
+			0x4B, 0x24, 0x04, 0xC2, 0x12, 0xC9, 0x22, 0x2D,
+			0xA5, 0xBF, 0x8A, 0x89, 0xEF, 0x67, 0x97, 0x87,
+			0x0C, 0xF5, 0x07, 0x71, 0xA6, 0x0F, 0x66, 0xA2,
+			0xEE, 0x62, 0x85, 0x36, 0x57, 0xAD, 0xDF, 0x04,
+			0xCD, 0xDE, 0x07, 0xFA, 0x41, 0x4E, 0x11, 0xF1,
+			0x2B, 0x4D, 0x81, 0xB9, 0xB4, 0xE8, 0xAC, 0x53,
+			0x8E, 0xA3, 0x06, 0x66, 0x68, 0x8D, 0x88, 0x1F,
+			0x6C, 0x34, 0x84, 0x21, 0x99, 0x2F, 0x31, 0xB9,
+			0x4F, 0x88, 0x06, 0xED, 0x8F, 0xCC, 0xFF, 0x4C,
+			0x91, 0x23, 0xB8, 0x96, 0x42, 0x52, 0x7A, 0xD6,
+			0x13, 0xB1, 0x09, 0xBF, 0x75, 0x16, 0x74, 0x85,
+			0xF1, 0x26, 0x8B, 0xF8, 0x84, 0xB4, 0xCD, 0x23,
+			0xD2, 0x9A, 0x09, 0x34, 0x92, 0x57, 0x03, 0xD6,
+			0x34, 0x09, 0x8F, 0x77, 0x67, 0xF1, 0xBE, 0x74,
+			0x91, 0xE7, 0x08, 0xA8, 0xBB, 0x94, 0x9A, 0x38,
+			0x73, 0x70, 0x8A, 0xEF, 0x4A, 0x36, 0x23, 0x9E,
+			0x50, 0xCC, 0x08, 0x23, 0x5C, 0xD5, 0xED, 0x6B,
+			0xBE, 0x57, 0x86, 0x68, 0xA1, 0x7B, 0x58, 0xC1,
+			0x17, 0x1D, 0x0B, 0x90, 0xE8, 0x13, 0xA9, 0xE4,
+			0xF5, 0x8A, 0x89, 0xD7, 0x19, 0xB1, 0x10, 0x42,
+			0xD6, 0x36, 0x0B, 0x1B, 0x0F, 0x52, 0xDE, 0xB7,
+			0x30, 0xA5, 0x8D, 0x58, 0xFA, 0xF4, 0x63, 0x15,
+			0x95, 0x4B, 0x0A, 0x87, 0x26, 0x91, 0x47, 0x59,
+			0x77, 0xDC, 0x88, 0xC0, 0xD7, 0x33, 0xFE, 0xFF,
+			0x54, 0x60, 0x0A, 0x0C, 0xC1, 0xD0, 0x30, 0x0A,
+			0xAA, 0xEB, 0x94, 0x57, 0x2C, 0x6E, 0x95, 0xB0,
+			0x1A, 0xE9, 0x0D, 0xE0, 0x4F, 0x1D, 0xCE, 0x47,
+			0xF8, 0x7E, 0x8F, 0xA7, 0xBE, 0xBF, 0x77, 0xE1,
+			0xDB, 0xC2, 0x0D, 0x6B, 0xA8, 0x5C, 0xB9, 0x14,
+			0x3D, 0x51, 0x8B, 0x28, 0x5D, 0xFA, 0x04, 0xB6,
+			0x98, 0xBF, 0x0C, 0xF7, 0x81, 0x9F, 0x20, 0xFA,
+			0x7A, 0x28, 0x8E, 0xB0, 0x70, 0x3D, 0x99, 0x5C,
+			0x59, 0x94, 0x0C, 0x7C, 0x66, 0xDE, 0x57, 0xA9,
+			0xB7, 0x0F, 0x82, 0x37, 0x9B, 0x70, 0xE2, 0x03,
+			0x1E, 0x45, 0x0F, 0xCF, 0xD2, 0x18, 0x13, 0x26,
+			0xFC, 0xD2, 0x8D, 0x88, 0x23, 0xBA, 0xAA, 0x80,
+			0xDF, 0x6E, 0x0F, 0x44, 0x35, 0x59, 0x64, 0x75,
+			0x39, 0xFD, 0x89, 0x07, 0xC0, 0xFF, 0xD9, 0xD7,
+			0x9C, 0x13, 0x0E, 0xD8, 0x1C, 0x9A, 0xFD, 0x9B,
+			0x7E, 0x84, 0x8C, 0x9F, 0xED, 0x38, 0x44, 0x3D,
+			0x5D, 0x38, 0x0E, 0x53, 0xFB, 0xDB, 0x8A, 0xC8,
+			0xC3, 0xD3, 0xF0, 0x68, 0x76, 0x05, 0x4F, 0x12,
+			0x24, 0x61, 0x10, 0x7D, 0xE9, 0x2F, 0xEA, 0x09,
+			0xC6, 0xF6, 0x92, 0x3A, 0x18, 0x8D, 0x53, 0xAF,
+			0xE5, 0x4A, 0x10, 0xF6, 0x0E, 0x6E, 0x9D, 0x5A,
+			0x03, 0xD9, 0x96, 0xB5, 0xFB, 0xC8, 0x20, 0xF8,
+			0xA6, 0x37, 0x11, 0x6A, 0x27, 0xAD, 0x04, 0xB4,
+			0x44, 0xA0, 0x93, 0x2D, 0xD6, 0x0F, 0xBD, 0x12,
+			0x67, 0x1C, 0x11, 0xE1, 0xC0, 0xEC, 0x73, 0xE7,
+			0x89, 0x87, 0x9F, 0xAA, 0x3D, 0x42, 0xC6, 0x4D,
+			0x20, 0xCD, 0x12, 0x52, 0x74, 0x2A, 0x37, 0x68,
+			0xC2, 0x5A, 0x90, 0x15, 0x85, 0x88, 0x8E, 0xCE,
+			0xE1, 0xE6, 0x12, 0xD9, 0x93, 0x6B, 0x40, 0x3B,
+			0x07, 0x75, 0x94, 0x9A, 0x66, 0xCD, 0xFD, 0x99,
+			0xA2, 0x9B, 0x13, 0x45, 0xBA, 0xA8, 0xD9, 0xD5,
+			0x40, 0x0C, 0x91, 0x02, 0x4B, 0x0A, 0x60, 0x73,
+			0x63, 0xB0, 0x13, 0xCE, 0x5D, 0xE9, 0xAE, 0x86,
+			0x9D, 0x3B, 0x8D, 0x95, 0xB0, 0x57, 0x0B, 0x3C,
+			0x2D, 0x39, 0x14, 0x22, 0xD3, 0x24, 0x50, 0xCB,
+			0xCF, 0xAE, 0x96, 0x65, 0x22, 0x86, 0xE9, 0x6D,
+			0xEC, 0x12, 0x14, 0xA9, 0x34, 0x65, 0x27, 0x98,
+			0x0A, 0x81, 0x92, 0xEA, 0xC1, 0xC3, 0x9A, 0x3A,
+			0xAF, 0x6F, 0x15, 0x35, 0x1D, 0xA6, 0xBE, 0x76,
+			0x4D, 0xF8, 0x97, 0x72, 0xEC, 0x04, 0x07, 0xD0,
+			0x6E, 0x44, 0x15, 0xBE, 0xFA, 0xE7, 0xC9, 0x25,
+			0x80, 0xDF, 0x9B, 0xF5, 0x07, 0x49, 0x7C, 0x8F,
+			0x29, 0x95, 0x16, 0x0D, 0x4E, 0x21, 0x8D, 0xAA,
+			0xCB, 0x02, 0x94, 0x4A, 0xBF, 0x83, 0x34, 0x0C,
+			0xE8, 0xBE, 0x16, 0x86, 0xA9, 0x60, 0xFA, 0xF9,
+			0x0E, 0x2D, 0x90, 0xC5, 0x5C, 0xC6, 0x47, 0x5B,
+			0xAB, 0xC3, 0x17, 0x1A, 0x80, 0xA3, 0x63, 0x17,
+			0x49, 0x54, 0x95, 0x5D, 0x71, 0x01, 0xDA, 0xB1,
+			0x6A, 0xE8, 0x17, 0x91, 0x67, 0xE2, 0x14, 0x44,
+			0xB4, 0x43, 0xA9, 0xEA, 0xAA, 0x7C, 0x91, 0xDE,
+			0x36, 0xD1, 0x18, 0xC3, 0x9D, 0x38, 0x9F, 0x8D,
+			0xD4, 0x46, 0x9A, 0x84, 0x6C, 0x9A, 0x26, 0x2B,
+			0xF7, 0xFA, 0x18, 0x48, 0x7A, 0x79, 0xE8, 0xDE,
+			0x11, 0x69, 0x9E, 0x0B, 0x8F, 0xDF, 0x55, 0x7C,
+			0xB4, 0x87, 0x19, 0xD4, 0x53, 0xBA, 0x71, 0x30,
+			0x56, 0x10, 0x9B, 0x93, 0xA2, 0x18, 0xC8, 0x96,
+			0x75, 0xAC, 0x19, 0x5F, 0xB4, 0xFB, 0x06, 0x63,
+			0x9B, 0x37, 0x97, 0x14, 0x49, 0x55, 0xB3, 0xC9,
+			0x32, 0x7D, 0x1A, 0xEC, 0x00, 0x3D, 0x42, 0xEC,
+			0xD0, 0xEA, 0x98, 0xAB, 0xF1, 0x9F, 0xFB, 0x4A,
+			0xF3, 0x56, 0x1A, 0x67, 0xE7, 0x7C, 0x35, 0xBF,
+			0x15, 0xC5, 0x9C, 0x24, 0x12, 0xDA, 0x88, 0x1D,
+			0xB0, 0x2B, 0x1B, 0xFB, 0xCE, 0xBF, 0xAC, 0x51,
+			0x52, 0xBC, 0x99, 0xBC, 0x3F, 0x1D, 0x15, 0xF7,
+			0x71, 0x00, 0x1B, 0x70, 0x29, 0xFE, 0xDB, 0x02,
+			0x8F, 0x8B, 0x85, 0x2B, 0xC4, 0x40, 0x7E, 0xB8,
+			0x3F, 0x89, 0x1C, 0x9C, 0xA7, 0x33, 0x25, 0x4F,
+			0xDD, 0x1E, 0x9E, 0xDB, 0x56, 0x91, 0x9C, 0xE9,
+			0xFE, 0xA2, 0x1C, 0x17, 0x40, 0x72, 0x52, 0x1C,
+			0x18, 0x31, 0x9A, 0x54, 0xB5, 0xD4, 0xEF, 0xBE,
+			0xBD, 0xDF, 0x1D, 0x8B, 0x69, 0xB1, 0xCB, 0xF2,
+			0x5F, 0x48, 0x9F, 0xCC, 0x98, 0x13, 0x72, 0x54,
+			0x7C, 0xF4, 0x1D, 0x00, 0x8E, 0xF0, 0xBC, 0xA1,
+			0x92, 0x6F, 0x93, 0x4B, 0x73, 0x5E, 0x09, 0x0B,
+			0x3B, 0x25, 0x1E, 0xB3, 0x3A, 0x36, 0xF8, 0x2E,
+			0xD9, 0xB2, 0x9C, 0xF4, 0xCB, 0x94, 0x41, 0x88,
+			0xFA, 0x0E, 0x1E, 0x38, 0xDD, 0x77, 0x8F, 0x7D,
+			0x1C, 0x9D, 0x98, 0x7B, 0x28, 0xD1, 0x32, 0xDF,
+			0xB9, 0x73, 0x1F, 0xA4, 0xF4, 0xB4, 0x16, 0x93,
+			0x5B, 0xE4, 0x9D, 0xE3, 0x05, 0x16, 0xAF, 0x35,
+			0x78, 0x58, 0x1F, 0x2F, 0x13, 0xF5, 0x61, 0xC0,
+			0x66, 0x33, 0x61, 0x94, 0x1E, 0xAB, 0x24, 0x9A,
+			0x4B, 0xC1, 0x23, 0xF8, 0xD1, 0x5C, 0xD7, 0x11,
+			0xA9, 0x56, 0xA1, 0xBF, 0x20, 0xFE, 0x6E, 0xB7,
+			0x8A, 0xEA, 0x23, 0x73, 0x36, 0x1D, 0xA0, 0x42,
+			0x6C, 0x79, 0xA5, 0x30, 0xC3, 0xBB, 0x1D, 0xE0,
+			0xC9, 0x97, 0x22, 0xEF, 0x1F, 0xDE, 0x39, 0xAC,
+			0x2B, 0x00, 0xA0, 0xA8, 0xEE, 0x7C, 0x80, 0x0A,
+			0x08, 0xBC, 0x22, 0x64, 0xF8, 0x9F, 0x4E, 0xFF,
+			0xE6, 0x27, 0xAC, 0x2F, 0x05, 0x31, 0xFB, 0x55,
+			0x4F, 0x6D, 0x21, 0xD7, 0x4C, 0x59, 0x0A, 0x70,
+			0xAD, 0xFA, 0xA3, 0x90, 0xBD, 0xFB, 0xB3, 0xD6,
+			0x8E, 0x46, 0x21, 0x5C, 0xAB, 0x18, 0x7D, 0x23,
+			0x68, 0xD5, 0xA7, 0x1F, 0x5E, 0xBE, 0xC0, 0x81,
+			0xCD, 0x3B, 0x20, 0xC0, 0x82, 0xDB, 0xE4, 0xCD,
+			0x2F, 0xAC, 0xA2, 0x87, 0x73, 0x79, 0x5D, 0x6B,
+			0x0C, 0x10, 0x20, 0x4B, 0x65, 0x9A, 0x93, 0x9E,
+			0xF2, 0x9B, 0xBE, 0x10, 0x88, 0x24, 0x36, 0x24,
+			0x42, 0x99, 0x27, 0xA7, 0xEB, 0x57, 0x6D, 0xD3,
+			0xA0, 0x0E, 0xA5, 0xE0, 0x1A, 0xF5, 0xD4, 0x75,
+			0x83, 0xB2, 0x27, 0x2C, 0x0C, 0x16, 0x1A, 0x80,
+			0x65, 0x21, 0xA1, 0x6F, 0xF9, 0xB0, 0xA7, 0x22,
+			0xC0, 0xCF, 0x26, 0xB0, 0x25, 0xD5, 0x83, 0x6E,
+			0x22, 0x58, 0xA4, 0xF7, 0xD4, 0x77, 0x3A, 0xC8,
+			0x01, 0xE4, 0x26, 0x3B, 0xC2, 0x94, 0xF4, 0x3D,
+			0xEF, 0x7F, 0xA8, 0x70, 0x3F, 0x3A, 0x41, 0x97,
+			0x46, 0x35, 0x25, 0x88, 0x76, 0x52, 0xB0, 0xB2,
+			0xA4, 0xA2, 0xA7, 0xCF, 0x87, 0xF0, 0x09, 0x14,
+			0x87, 0x1E, 0x25, 0x03, 0x91, 0x13, 0xC7, 0xE1,
+			0x61, 0x8D, 0xA3, 0x40, 0x64, 0xB5, 0x7A, 0x43,
+			0xC4, 0x63, 0x24, 0x9F, 0xB8, 0xD0, 0x5E, 0x0F,
+			0x26, 0xF4, 0xA6, 0xD8, 0x49, 0x72, 0xE7, 0xA9,
+			0x05, 0x48, 0x24, 0x14, 0x5F, 0x91, 0x29, 0x5C,
+			0xDB, 0xE3, 0x9A, 0x6F, 0x92, 0x0F, 0xAC, 0xC6,
+			0x59, 0x71, 0x2B, 0x46, 0xA5, 0x4B, 0xA2, 0x95,
+			0xBB, 0xE6, 0xA9, 0x01, 0x54, 0xE9, 0x1B, 0x33,
+			0x98, 0x5A, 0x2B, 0xCD, 0x42, 0x0A, 0xD5, 0xC6,
+			0x7E, 0xC9, 0xAD, 0x8E, 0xB7, 0xAC, 0x68, 0x64,
+			0xDB, 0x27, 0x2A, 0x51, 0x6B, 0xC9, 0x4C, 0x28,
+			0x39, 0xB0, 0xA8, 0x16, 0x9A, 0x6B, 0xF5, 0x8E,
+			0x1A, 0x0C, 0x2A, 0xDA, 0x8C, 0x88, 0x3B, 0x7B,
+			0xF4, 0x97, 0xA4, 0x91, 0x71, 0x26, 0x8E, 0xD1,
+			0x5D, 0xDD, 0x29, 0x69, 0x38, 0x4E, 0x7F, 0xF4,
+			0xBF, 0x4A, 0xAB, 0x2E, 0xC9, 0xEC, 0xC6, 0x52,
+			0x9C, 0xF6, 0x29, 0xE2, 0xDF, 0x0F, 0x08, 0xA7,
+			0x7A, 0x65, 0xAF, 0xA1, 0x2A, 0xA9, 0xB5, 0x05,
+			0xDF, 0x8B, 0x28, 0x7E, 0xF6, 0xCC, 0x91, 0x49,
+			0x3D, 0x1C, 0xAA, 0x39, 0x07, 0x6E, 0x28, 0xEF,
+			0x1E, 0xA0, 0x28, 0xF5, 0x11, 0x8D, 0xE6, 0x1A,
+			0xE0, 0x2B, 0xB6, 0xAE, 0xFC, 0x33, 0x43, 0xA0,
+			0x50, 0x29, 0x2F, 0x19, 0x9F, 0x40, 0x18, 0x57,
+			0xB2, 0xBE, 0xAD, 0x5E, 0x6E, 0xE2, 0xA1, 0xF1,
+			0x91, 0x02, 0x2F, 0x92, 0x78, 0x01, 0x6F, 0x04,
+			0x77, 0x91, 0xA9, 0xD1, 0x8D, 0xA7, 0xD2, 0xA6,
+			0xD2, 0x7F, 0x2E, 0x0E, 0x51, 0xC2, 0xF6, 0xEA,
+			0x30, 0xE8, 0xAC, 0x49, 0xA0, 0x60, 0x4F, 0x4C,
+			0x13, 0x54, 0x2E, 0x85, 0xB6, 0x83, 0x81, 0xB9,
+			0xFD, 0xCF, 0xA0, 0xCE, 0x4B, 0x2D, 0x34, 0x13,
+			0x54, 0x85, 0x2D, 0x36, 0x02, 0x45, 0xC5, 0x36,
+			0xB6, 0x12, 0xAF, 0x71, 0xF3, 0xE7, 0x7C, 0x90,
+			0x95, 0xAE, 0x2D, 0xBD, 0xE5, 0x04, 0xB2, 0x65,
+			0x73, 0x3D, 0xAB, 0xFE, 0x10, 0xA2, 0x0F, 0xC7,
+			0xD6, 0xD3, 0x2C, 0x21, 0xCC, 0xC7, 0x2B, 0x8B,
+			0x34, 0x44, 0xAE, 0x66, 0x3D, 0x65, 0x92, 0x2D,
+			0x17, 0xF8, 0x2C, 0xAA, 0x2B, 0x86, 0x5C, 0xD8,
+			0x89, 0x13, 0xD2, 0x91, 0xA6, 0x58, 0x99, 0x02,
+			0x6E, 0xA1, 0x32, 0x84, 0x39, 0x72, 0x3C, 0x19,
+			0x8C, 0x36, 0xB0, 0xC3, 0xC8, 0xD0, 0x85, 0xBF,
+			0xAF, 0x8A, 0x32, 0x0F, 0xDE, 0x33, 0x4B, 0x4A,
+			0x49, 0x19, 0xB4, 0x4C, 0x2B, 0x95, 0xF6, 0xE8,
+			0xEC, 0xF7, 0x33, 0x93, 0xF7, 0xF0, 0xD2, 0xA4,
+			0x0E, 0x60, 0xB1, 0xD4, 0x06, 0x52, 0x6B, 0x02,
+			0x2D, 0xDC, 0x33, 0x18, 0x10, 0xB1, 0xA5, 0xF7,
+			0xC3, 0x47, 0xBD, 0x53, 0xED, 0x1F, 0x10, 0x5D,
+			0x6A, 0x0D, 0x30, 0xAB, 0xA4, 0x77, 0xE1, 0x78,
+			0x88, 0x9A, 0xB2, 0xEC, 0x55, 0xD5, 0x58, 0xDE,
+			0xAB, 0x26, 0x30, 0x20, 0x43, 0x36, 0x96, 0x2B,
+			0x4D, 0xB5, 0xB6, 0x63, 0xB6, 0x90, 0x2B, 0x89,
+			0xE8, 0x5B, 0x31, 0xBC, 0x6A, 0xF5, 0x0F, 0xC5,
+			0x0A, 0xCC, 0xB3, 0xFB, 0x9B, 0x57, 0xB6, 0x63,
+			0x29, 0x70, 0x31, 0x37, 0x8D, 0xB4, 0x78, 0x96,
+			0xD7, 0xFB, 0xAF, 0x6C, 0x60, 0x0A, 0xDD, 0x2C,
+			0x67, 0xF9, 0x36, 0xDB, 0x03, 0x79, 0x86, 0xDB,
+			0x85, 0x6E, 0xB4, 0x9C, 0xF2, 0xDB, 0x3F, 0x7D,
+			0xA6, 0xD2, 0x36, 0x50, 0xE4, 0x38, 0xF1, 0x88,
+			0x40, 0x41, 0xB0, 0x13, 0x11, 0x9E, 0x4C, 0x2A,
+			0xE5, 0xAF, 0x37, 0xCC, 0xCD, 0xFB, 0x68, 0x66,
+			0x07, 0x38, 0xB5, 0x8B, 0x3C, 0x59, 0xD1, 0xC0,
+			0x24, 0x84, 0x37, 0x47, 0x2A, 0xBA, 0x1F, 0x35,
+			0xCA, 0x1F, 0xB9, 0x0C, 0xD7, 0x14, 0xAA, 0x9F,
+			0x63, 0x55, 0x34, 0xF4, 0x9E, 0x7C, 0x5B, 0xBA,
+			0x81, 0xC2, 0xB6, 0xB3, 0x6F, 0xDE, 0xE2, 0x1C,
+			0xA2, 0x7E, 0x34, 0x7F, 0x79, 0x3D, 0x2C, 0xE9,
+			0x44, 0xED, 0xB2, 0x3C, 0x8C, 0x9B, 0x91, 0x4B,
+			0xE1, 0x03, 0x35, 0xE3, 0x50, 0xFE, 0xB5, 0x07,
+			0x03, 0x94, 0xB7, 0xA4, 0xA1, 0x5C, 0x0C, 0xA1,
+			0x20, 0x28, 0x35, 0x68, 0xB7, 0xBF, 0xC2, 0x54,
+			0xFE, 0x83, 0x8B, 0x13, 0x7A, 0x21, 0x47, 0xCE,
+			0x7C, 0x11, 0x3A, 0x3A, 0x4D, 0x65, 0x49, 0x9D,
+			0x9E, 0x86, 0xB8, 0x7D, 0xBC, 0xC7, 0xF0, 0x3B,
+			0xBD, 0x3A, 0x3A, 0xB1, 0xAA, 0x24, 0x3E, 0xCE,
+			0x5B, 0xA9, 0xBC, 0xF2, 0x5F, 0x82, 0x83, 0x6C,
+			0xFE, 0x47, 0x3B, 0x2D, 0x83, 0xE7, 0xA7, 0x20,
+			0x1C, 0xD0, 0xB9, 0x6A, 0x72, 0x45, 0x1E, 0x86,
+			0x3F, 0x6C, 0x3B, 0xA6, 0x64, 0xA6, 0xD0, 0x73,
+			0xD1, 0xF7, 0xB5, 0xED, 0x99, 0x08, 0x65, 0xD9,
+			0x78, 0xBD, 0x38, 0x15, 0xD0, 0x60, 0x94, 0xFC,
+			0x9A, 0x2A, 0xBA, 0x52, 0x21, 0xC2, 0x2D, 0x5A,
+			0xB9, 0x96, 0x38, 0x9E, 0x37, 0x21, 0xE3, 0xAF,
+			0x5F, 0x05, 0xBE, 0xDD, 0xC2, 0x87, 0x5E, 0x0D,
+			0xFA, 0xEB, 0x39, 0x02, 0x1E, 0xE2, 0x7A, 0x41,
+			0x18, 0x7C, 0xBB, 0x45, 0xEF, 0x40, 0xC3, 0xE7,
+			0x3B, 0xC0, 0x39, 0x89, 0xF9, 0xA3, 0x0D, 0x12,
+			0xC5, 0x4B, 0xA7, 0xD2, 0x14, 0x1D, 0xA8, 0xA8,
+			0x75, 0x49, 0x3E, 0x65, 0x77, 0x6E, 0xF3, 0x5F,
+			0x97, 0xDE, 0xBC, 0x22, 0x86, 0xCC, 0x4A, 0xF9,
+			0xB4, 0x62, 0x3E, 0xEE, 0x90, 0x2F, 0x84, 0x0C,
+			0x52, 0xF1, 0xB8, 0xAD, 0x65, 0x89, 0x39, 0xAE,
+			0xF7, 0x1F, 0x3F, 0x72, 0xB9, 0xEC, 0x1D, 0xE2,
+			0x15, 0x88, 0xBD, 0x35, 0x48, 0x4E, 0xA4, 0x44,
+			0x36, 0x34, 0x3F, 0xF9, 0x5E, 0xAD, 0x6A, 0xB1,
+			0xD8, 0xAF, 0xB1, 0xB2, 0xA3, 0x03, 0xDF, 0x1B,
+			0x71, 0xE5, 0x3C, 0x4A, 0xEA, 0x6B, 0x2E, 0x3E,
+			0x93, 0x72, 0xBE, 0x0D, 0x1B, 0xC9, 0x97, 0x98,
+			0xB0, 0xCE, 0x3C, 0xC1, 0x0D, 0x2A, 0x59, 0x6D,
+			0x56, 0x5D, 0xBA, 0x82, 0xF8, 0x8C, 0xE4, 0xCF,
+			0xF3, 0xB3, 0x3D, 0x5D, 0x24, 0xE9, 0xC0, 0x83,
+			0x11, 0x24, 0xBF, 0x1A, 0xD5, 0x4B, 0x79, 0x25,
+			0x32, 0x98, 0x3D, 0xD6, 0xC3, 0xA8, 0xB7, 0xD0
+		},
+	.len = 2056
+	},
+	.digest = {
+		.data = {0x17, 0x9F, 0x2F, 0xA6},
+		.len  = 4
+	}
+};
+
+#endif /* TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_ */
diff --git a/app/test/test_cryptodev_snow3g_test_vectors.h b/app/test/test_cryptodev_snow3g_test_vectors.h
new file mode 100644
index 0000000..403406d
--- /dev/null
+++ b/app/test/test_cryptodev_snow3g_test_vectors.h
@@ -0,0 +1,379 @@
+/*-
+ *   BSD LICENSE
+ *
+ *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *
+ *   Redistribution and use in source and binary forms, with or without
+ *   modification, are permitted provided that the following conditions
+ *   are met:
+ *
+ *   * Redistributions of source code must retain the above copyright
+ *     notice, this list of conditions and the following disclaimer.
+ *   * Redistributions in binary form must reproduce the above copyright
+ *     notice, this list of conditions and the following disclaimer in
+ *     the documentation and/or other materials provided with the
+ *     distribution.
+ *   * Neither the name of Intel Corporation nor the names of its
+ *     contributors may be used to endorse or promote products derived
+ *     from this software without specific prior written permission.
+ *
+ *   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ *   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ *   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ *   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ *   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ *   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ *   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ *   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ *   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ *   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ *   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#ifndef TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+#define TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+
+struct snow3g_test_data {
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} key;
+
+	struct {
+		uint8_t data[64] __rte_aligned(16);
+		unsigned len;
+	} iv;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} plaintext;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} ciphertext;
+
+	struct {
+		unsigned len;
+	} validDataLenInBits;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} aad;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} digest;
+};
+
+struct snow3g_test_data snow3g_test_case_1 = {
+	.key = {
+		.data = {
+			0x2B, 0xD6, 0x45, 0x9F, 0x82, 0xC5, 0xB3, 0x00,
+			0x95, 0x2C, 0x49, 0x10, 0x48, 0x81, 0xFF, 0x48
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x7E, 0xC6, 0x12, 0x72, 0x74, 0x3B, 0xF1, 0x61,
+			0x47, 0x26, 0x44, 0x6A, 0x6C, 0x38, 0xCE, 0xD1,
+			0x66, 0xF6, 0xCA, 0x76, 0xEB, 0x54, 0x30, 0x04,
+			0x42, 0x86, 0x34, 0x6C, 0xEF, 0x13, 0x0F, 0x92,
+			0x92, 0x2B, 0x03, 0x45, 0x0D, 0x3A, 0x99, 0x75,
+			0xE5, 0xBD, 0x2E, 0xA0, 0xEB, 0x55, 0xAD, 0x8E,
+			0x1B, 0x19, 0x9E, 0x3E, 0xC4, 0x31, 0x60, 0x20,
+			0xE9, 0xA1, 0xB2, 0x85, 0xE7, 0x62, 0x79, 0x53,
+			0x59, 0xB7, 0xBD, 0xFD, 0x39, 0xBE, 0xF4, 0xB2,
+			0x48, 0x45, 0x83, 0xD5, 0xAF, 0xE0, 0x82, 0xAE,
+			0xE6, 0x38, 0xBF, 0x5F, 0xD5, 0xA6, 0x06, 0x19,
+			0x39, 0x01, 0xA0, 0x8F, 0x4A, 0xB4, 0x1A, 0xAB,
+			0x9B, 0x13, 0x48, 0x80
+		},
+		.len = 100
+	},
+	.ciphertext = {
+		.data = {
+			0x8C, 0xEB, 0xA6, 0x29, 0x43, 0xDC, 0xED, 0x3A,
+			0x09, 0x90, 0xB0, 0x6E, 0xA1, 0xB0, 0xA2, 0xC4,
+			0xFB, 0x3C, 0xED, 0xC7, 0x1B, 0x36, 0x9F, 0x42,
+			0xBA, 0x64, 0xC1, 0xEB, 0x66, 0x65, 0xE7, 0x2A,
+			0xA1, 0xC9, 0xBB, 0x0D, 0xEA, 0xA2, 0x0F, 0xE8,
+			0x60, 0x58, 0xB8, 0xBA, 0xEE, 0x2C, 0x2E, 0x7F,
+			0x0B, 0xEC, 0xCE, 0x48, 0xB5, 0x29, 0x32, 0xA5,
+			0x3C, 0x9D, 0x5F, 0x93, 0x1A, 0x3A, 0x7C, 0x53,
+			0x22, 0x59, 0xAF, 0x43, 0x25, 0xE2, 0xA6, 0x5E,
+			0x30, 0x84, 0xAD, 0x5F, 0x6A, 0x51, 0x3B, 0x7B,
+			0xDD, 0xC1, 0xB6, 0x5F, 0x0A, 0xA0, 0xD9, 0x7A,
+			0x05, 0x3D, 0xB5, 0x5A, 0x88, 0xC4, 0xC4, 0xF9,
+			0x60, 0x5E, 0x41, 0x40
+		},
+		.len = 100
+	},
+	.validDataLenInBits = {
+		.len = 798
+	},
+	.aad = {
+		.data = {
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_2 = {
+	.key = {
+		.data = {
+			0xEF, 0xA8, 0xB2, 0x22, 0x9E, 0x72, 0x0C, 0x2A,
+			0x7C, 0x36, 0xEA, 0x55, 0xE9, 0x60, 0x56, 0x95
+		},
+		.len = 16
+	},
+	.iv = {
+	       .data = {
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+	       .len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x10, 0x11, 0x12, 0x31, 0xE0, 0x60, 0x25, 0x3A,
+			0x43, 0xFD, 0x3F, 0x57, 0xE3, 0x76, 0x07, 0xAB,
+			0x28, 0x27, 0xB5, 0x99, 0xB6, 0xB1, 0xBB, 0xDA,
+			0x37, 0xA8, 0xAB, 0xCC, 0x5A, 0x8C, 0x55, 0x0D,
+			0x1B, 0xFB, 0x2F, 0x49, 0x46, 0x24, 0xFB, 0x50,
+			0x36, 0x7F, 0xA3, 0x6C, 0xE3, 0xBC, 0x68, 0xF1,
+			0x1C, 0xF9, 0x3B, 0x15, 0x10, 0x37, 0x6B, 0x02,
+			0x13, 0x0F, 0x81, 0x2A, 0x9F, 0xA1, 0x69, 0xD8
+		},
+		.len = 64
+	},
+	.ciphertext = {
+		.data = {
+				0xE0, 0xDA, 0x15, 0xCA, 0x8E, 0x25, 0x54, 0xF5,
+				0xE5, 0x6C, 0x94, 0x68, 0xDC, 0x6C, 0x7C, 0x12,
+				0x9C, 0x56, 0x8A, 0xA5, 0x03, 0x23, 0x17, 0xE0,
+				0x4E, 0x07, 0x29, 0x64, 0x6C, 0xAB, 0xEF, 0xA6,
+				0x89, 0x86, 0x4C, 0x41, 0x0F, 0x24, 0xF9, 0x19,
+				0xE6, 0x1E, 0x3D, 0xFD, 0xFA, 0xD7, 0x7E, 0x56,
+				0x0D, 0xB0, 0xA9, 0xCD, 0x36, 0xC3, 0x4A, 0xE4,
+				0x18, 0x14, 0x90, 0xB2, 0x9F, 0x5F, 0xA2, 0xFC
+		},
+		.len = 64
+	},
+	.validDataLenInBits = {
+		.len = 510
+	},
+	.aad = {
+		.data = {
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_3 = {
+	.key = {
+		.data = {
+			 0x5A, 0xCB, 0x1D, 0x64, 0x4C, 0x0D, 0x51, 0x20,
+			 0x4E, 0xA5, 0xF1, 0x45, 0x10, 0x10, 0xD8, 0x52
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xAD, 0x9C, 0x44, 0x1F, 0x89, 0x0B, 0x38, 0xC4,
+			0x57, 0xA4, 0x9D, 0x42, 0x14, 0x07, 0xE8
+		},
+		.len = 15
+	},
+	.ciphertext = {
+		.data = {
+			0xBA, 0x0F, 0x31, 0x30, 0x03, 0x34, 0xC5, 0x6B,
+			0x52, 0xA7, 0x49, 0x7C, 0xBA, 0xC0, 0x46
+		},
+		.len = 15
+	},
+	.validDataLenInBits = {
+		.len = 120
+	},
+	.aad = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.digest = {
+		.data = {0xE8, 0x60, 0x5A, 0x3E},
+		.len  = 4
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_4 = {
+	.key = {
+		.data = {
+			0xD3, 0xC5, 0xD5, 0x92, 0x32, 0x7F, 0xB1, 0x1C,
+			0x40, 0x35, 0xC6, 0x68, 0x0A, 0xF8, 0xC6, 0xD1
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00,
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB, 0x1A,
+			0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D, 0x80,
+			0x8C, 0xE3, 0x3E, 0x2C, 0xC3, 0xC0, 0xB5, 0xFC,
+			0x1F, 0x3D, 0xE8, 0xA6, 0xDC, 0x66, 0xB1, 0xF0
+		},
+		.len = 32
+	},
+	.ciphertext = {
+		.data = {
+			0x98, 0x9B, 0x71, 0x9C, 0xDC, 0x33, 0xCE, 0xB7,
+			0xCF, 0x27, 0x6A, 0x52, 0x82, 0x7C, 0xEF, 0x94,
+			0xA5, 0x6C, 0x40, 0xC0, 0xAB, 0x9D, 0x81, 0xF7,
+			0xA2, 0xA9, 0xBA, 0xC6, 0x0E, 0x11, 0xC4, 0xB0
+		},
+		.len = 32
+	},
+	.validDataLenInBits = {
+		.len = 253
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_5 = {
+	.key = {
+		.data = {
+			0x60, 0x90, 0xEA, 0xE0, 0x4C, 0x83, 0x70, 0x6E,
+			0xEC, 0xBF, 0x65, 0x2B, 0xE8, 0xE3, 0x65, 0x66
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x40, 0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB,
+			0x42, 0x86, 0xB2, 0x99, 0x78, 0x3D, 0xAF, 0x44,
+			0x2C, 0x09, 0x9F, 0x7A, 0xB0, 0xF5, 0x8D, 0x5C,
+			0x8E, 0x46, 0xB1, 0x04, 0xF0, 0x8F, 0x01, 0xB4,
+			0x1A, 0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D,
+			0x36, 0xBD, 0x1A, 0x3D, 0x90, 0xDC, 0x3A, 0x41,
+			0xB4, 0x6D, 0x51, 0x67, 0x2A, 0xC4, 0xC9, 0x66,
+			0x3A, 0x2B, 0xE0, 0x63, 0xDA, 0x4B, 0xC8, 0xD2,
+			0x80, 0x8C, 0xE3, 0x3E, 0x2C, 0xCC, 0xBF, 0xC6,
+			0x34, 0xE1, 0xB2, 0x59, 0x06, 0x08, 0x76, 0xA0,
+			0xFB, 0xB5, 0xA4, 0x37, 0xEB, 0xCC, 0x8D, 0x31,
+			0xC1, 0x9E, 0x44, 0x54, 0x31, 0x87, 0x45, 0xE3,
+			0x98, 0x76, 0x45, 0x98, 0x7A, 0x98, 0x6F, 0x2C,
+			0xB0
+		},
+		.len = 105
+	},
+	.ciphertext = {
+		.data = {
+			0x58, 0x92, 0xBB, 0xA8, 0x8B, 0xBB, 0xCA, 0xAE,
+			0xAE, 0x76, 0x9A, 0xA0, 0x6B, 0x68, 0x3D, 0x3A,
+			0x17, 0xCC, 0x04, 0xA3, 0x69, 0x88, 0x16, 0x97,
+			0x43, 0x5E, 0x44, 0xFE, 0xD5, 0xFF, 0x9A, 0xF5,
+			0x7B, 0x9E, 0x89, 0x0D, 0x4D, 0x5C, 0x64, 0x70,
+			0x98, 0x85, 0xD4, 0x8A, 0xE4, 0x06, 0x90, 0xEC,
+			0x04, 0x3B, 0xAA, 0xE9, 0x70, 0x57, 0x96, 0xE4,
+			0xA9, 0xFF, 0x5A, 0x4B, 0x8D, 0x8B, 0x36, 0xD7,
+			0xF3, 0xFE, 0x57, 0xCC, 0x6C, 0xFD, 0x6C, 0xD0,
+			0x05, 0xCD, 0x38, 0x52, 0xA8, 0x5E, 0x94, 0xCE,
+			0x6B, 0xCD, 0x90, 0xD0, 0xD0, 0x78, 0x39, 0xCE,
+			0x09, 0x73, 0x35, 0x44, 0xCA, 0x8E, 0x35, 0x08,
+			0x43, 0x24, 0x85, 0x50, 0x92, 0x2A, 0xC1, 0x28,
+			0x18
+		},
+		.len = 105
+	},
+	.validDataLenInBits = {
+		.len = 837
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_6 = {
+	.key = {
+		.data = {
+			0xC7, 0x36, 0xC6, 0xAA, 0xB2, 0x2B, 0xFF, 0xF9,
+			0x1E, 0x26, 0x98, 0xD2, 0xE2, 0x2A, 0xD5, 0x7E
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.aad = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xD0, 0xA7, 0xD4, 0x63, 0xDF, 0x9F, 0xB2, 0xB2,
+			0x78, 0x83, 0x3F, 0xA0, 0x2E, 0x23, 0x5A, 0xA1,
+			0x72, 0xBD, 0x97, 0x0C, 0x14, 0x73, 0xE1, 0x29,
+			0x07, 0xFB, 0x64, 0x8B, 0x65, 0x99, 0xAA, 0xA0,
+			0xB2, 0x4A, 0x03, 0x86, 0x65, 0x42, 0x2B, 0x20,
+			0xA4, 0x99, 0x27, 0x6A, 0x50, 0x42, 0x70, 0x09
+		},
+		.len = 48
+	},
+	.ciphertext = {
+	   .data = {
+			0x95, 0x2E, 0x5A, 0xE1, 0x50, 0xB8, 0x59, 0x2A,
+			0x9B, 0xA0, 0x38, 0xA9, 0x8E, 0x2F, 0xED, 0xAB,
+			0xFD, 0xC8, 0x3B, 0x47, 0x46, 0x0B, 0x50, 0x16,
+			0xEC, 0x88, 0x45, 0xB6, 0x05, 0xC7, 0x54, 0xF8,
+			0xBD, 0x91, 0xAA, 0xB6, 0xA4, 0xDC, 0x64, 0xB4,
+			0xCB, 0xEB, 0x97, 0x06, 0x4C, 0xF7, 0x02, 0x3D
+		},
+		.len = 48
+	},
+	.digest = {
+		.data = {0x38, 0xB5, 0x54, 0xC0 },
+		.len  = 4
+	},
+	.validDataLenInBits = {
+		.len = 384
+	}
+};
+
+#endif /* TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_ */
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread
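
The UEA2 tests above compare ciphertext at bit granularity: the cipher is
defined over validDataLenInBits, so any bits of the last byte beyond that
length are undefined and are masked off before the buffer comparison. A
self-contained sketch of that masking arithmetic (the helper name is
illustrative and not part of the patch):

	#include <stdint.h>
	#include <stdio.h>

	/* Mask that keeps only the valid bits of the last byte. */
	static uint8_t last_byte_mask(unsigned int valid_len_in_bits)
	{
		unsigned int valid_bits = valid_len_in_bits % 8;

		if (valid_bits == 0)
			valid_bits = 8;
		return (uint8_t)(0xFF << (8 - valid_bits));
	}

	int main(void)
	{
		/* snow3g_test_case_1: 798 bits = 99 bytes + 6 bits -> 0xFC */
		printf("mask = 0x%02X\n", last_byte_mask(798));
		return 0;
	}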

* [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist Devices
  2016-01-28 17:46 [dpdk-dev] [PATCH 0/3] Snow3G UEA2 support for Intel Quick Assist Devices Deepak Kumar JAIN
                   ` (3 preceding siblings ...)
  2016-02-23 14:02 ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Deepak Kumar JAIN
@ 2016-03-03 13:01 ` Deepak Kumar JAIN
  2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
                     ` (5 more replies)
  4 siblings, 6 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-03 13:01 UTC (permalink / raw)
  To: dev

 This patchset contains fixes and refactoring for the Snow3G (UEA2 and
 UIA2) wireless algorithms for Intel Quick Assist devices.

 The QAT PMD previously supported only cipher/hash alg-chaining for AES/SHA.
 The code has been refactored to also support cipher-only and hash-only
 (Snow3G only) functionality along with alg-chaining.

 Changes from v2:

 1) Rebased on the patchset mentioned below.

 This patchset depends on the cryptodev API changes:
 http://dpdk.org/ml/archives/dev/2016-February/034212.html

Deepak Kumar JAIN (3):
  crypto: add cipher/auth only support
  qat: add support for Snow3G
  app/test: add Snow3G tests

 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 doc/guides/cryptodevs/qat.rst                      |    8 +-
 doc/guides/rel_notes/release_16_04.rst             |    6 +
 drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  280 +++++-
 drivers/crypto/qat/qat_crypto.c                    |  149 ++-
 drivers/crypto/qat/qat_crypto.h                    |   10 +
 10 files changed, 2231 insertions(+), 75 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h
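
With cipher-only commands now accepted, an application can create a Snow3G
UEA2 session from a single, unchained cipher transform. A minimal sketch of
that transform setup, assuming the 16.04-era rte_crypto_sym_xform layout and
the RTE_CRYPTO_CIPHER_SNOW3G_UEA2 enum added alongside this series (the
helper and the key buffer are illustrative, not taken from the patchset):

	#include <string.h>
	#include <rte_crypto_sym.h>

	/* Cipher-only (UEA2) transform: xform->next stays NULL, which the
	 * QAT PMD previously rejected and now maps to ICP_QAT_FW_LA_CMD_CIPHER.
	 */
	static void build_snow3g_uea2_xform(struct rte_crypto_sym_xform *xform,
			uint8_t *key, unsigned int key_len)
	{
		memset(xform, 0, sizeof(*xform));
		xform->type = RTE_CRYPTO_SYM_XFORM_CIPHER;
		xform->next = NULL;
		xform->cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
		xform->cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
		xform->cipher.key.data = key;
		xform->cipher.key.length = key_len;	/* 16 bytes for UEA2 */
	}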

-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread

* [dpdk-dev] [PATCH v3 1/3] crypto: add cipher/auth only support
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
@ 2016-03-03 13:01   ` Deepak Kumar JAIN
  2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 2/3] qat: add support for Snow3G Deepak Kumar JAIN
                     ` (4 subsequent siblings)
  5 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-03 13:01 UTC (permalink / raw)
  To: dev

Refactored the existing functionality into a modular
form to support cipher-only and auth-only operation.

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 drivers/crypto/qat/qat_adf/qat_algs.h            |  18 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 210 ++++++++++++++++++++---
 drivers/crypto/qat/qat_crypto.c                  | 137 +++++++++++----
 drivers/crypto/qat/qat_crypto.h                  |  10 ++
 4 files changed, 308 insertions(+), 67 deletions(-)

diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index 76c08c0..b73a5d0 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -104,11 +104,15 @@ struct qat_alg_ablkcipher_cd {
 
 int qat_get_inter_state_size(enum icp_qat_hw_auth_algo qat_hash_alg);
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cd,
-					uint8_t *enckey, uint32_t enckeylen,
-					uint8_t *authkey, uint32_t authkeylen,
-					uint32_t add_auth_data_length,
-					uint32_t digestsize);
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cd,
+						uint8_t *enckey,
+						uint32_t enckeylen);
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize);
 
 void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header);
 
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index ceaffb7..bef444b 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -359,15 +359,139 @@ void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header)
 					   ICP_QAT_FW_LA_NO_UPDATE_STATE);
 }
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
-			uint8_t *cipherkey, uint32_t cipherkeylen,
-			uint8_t *authkey, uint32_t authkeylen,
-			uint32_t add_auth_data_length,
-			uint32_t digestsize)
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
+						uint8_t *cipherkey,
+						uint32_t cipherkeylen)
 {
-	struct qat_alg_cd *content_desc = &cdesc->cd;
-	struct icp_qat_hw_cipher_algo_blk *cipher = &content_desc->cipher;
-	struct icp_qat_hw_auth_algo_blk *hash = &content_desc->hash;
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
+	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
+	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
+	void *ptr = &req_tmpl->cd_ctrl;
+	struct icp_qat_fw_cipher_cd_ctrl_hdr *cipher_cd_ctrl = ptr;
+	struct icp_qat_fw_auth_cd_ctrl_hdr *hash_cd_ctrl = ptr;
+	enum icp_qat_hw_cipher_convert key_convert;
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
+	uint16_t cipher_offset = 0;
+
+	PMD_INIT_FUNC_TRACE();
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		cipher =
+		    (struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk));
+		cipher_offset = sizeof(struct icp_qat_hw_auth_algo_blk);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		cipher_offset = 0;
+	}
+	/* CD setup */
+	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	} else {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_CMP_AUTH_RES);
+	}
+
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	/* For Snow3G, set key convert and other bits */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	}
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
+	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
+
+	proto = ICP_QAT_FW_LA_PROTO_GET(header->serv_specif_flags);
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2)
+		proto = ICP_QAT_FW_LA_SNOW_3G_PROTO;
+
+	/* Request template setup */
+	qat_alg_init_common_hdr(header);
+	header->service_cmd_id = cdesc->qat_cmd;
+
+	ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
+	/* Configure the common header protocol flags */
+	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
+	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
+
+	/* Cipher CD config setup */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		cipher_cd_ctrl->cipher_key_sz =
+			(ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ +
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ) >> 3;
+		cipher_cd_ctrl->cipher_state_sz =
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	} else {
+		cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
+		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	}
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else {
+		PMD_DRV_LOG(ERR, "invalid param, only authenticated "
+			    "encryption supported");
+		return -EFAULT;
+	}
+	return 0;
+}
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize)
+{
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_hw_auth_algo_blk *hash;
 	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
 	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
 	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
@@ -379,31 +503,57 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 		((char *)&req_tmpl->serv_specif_rqpars +
 		sizeof(struct icp_qat_fw_la_cipher_req_params));
 	enum icp_qat_hw_cipher_convert key_convert;
-	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO; /* no CCM/GCM/Snow3G */
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
 	uint16_t state1_size = 0;
 	uint16_t state2_size = 0;
+	uint16_t cipher_offset = 0, hash_offset = 0;
 
 	PMD_INIT_FUNC_TRACE();
 
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		hash = (struct icp_qat_hw_auth_algo_blk *)&cdesc->cd;
+		cipher =
+		(struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk));
+		hash_offset = 0;
+		cipher_offset = ((char *)hash - (char *)cipher);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		hash = (struct icp_qat_hw_auth_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_cipher_algo_blk));
+		cipher_offset = 0;
+		hash_offset = ((char *)hash - (char *)cipher);
+	}
+
 	/* CD setup */
 	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
-		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
 	} else {
-		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				   ICP_QAT_FW_LA_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_CMP_AUTH_RES);
 	}
 
-	cipher->aes.cipher_config.val = ICP_QAT_HW_CIPHER_CONFIG_BUILD(
-			cdesc->qat_mode, cdesc->qat_cipher_alg, key_convert,
-			cdesc->qat_dir);
-	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
+	memcpy(cipher->aes.key, authkey, authkeylen);
 
 	hash->sha.inner_setup.auth_config.reserved = 0;
 	hash->sha.inner_setup.auth_config.config =
@@ -423,7 +573,7 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 	} else if ((cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_128) ||
 		(cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_64)) {
 		if (qat_alg_do_precomputes(cdesc->qat_hash_alg,
-			cipherkey, cipherkeylen, (uint8_t *)(hash->sha.state1 +
+			authkey, authkeylen, (uint8_t *)(hash->sha.state1 +
 			ICP_QAT_HW_GALOIS_128_STATE1_SZ), &state2_size)) {
 			PMD_DRV_LOG(ERR, "(GCM)precompute failed");
 			return -EFAULT;
@@ -454,15 +604,15 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 	/* Configure the common header protocol flags */
 	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
 	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
-	cd_pars->u.s.content_desc_params_sz = sizeof(struct qat_alg_cd) >> 3;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
 
 	/* Cipher CD config setup */
-	cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
+	cipher_cd_ctrl->cipher_key_sz = authkeylen >> 3;
 	cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
-	cipher_cd_ctrl->cipher_cfg_offset = 0;
+	cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
 
 	/* Auth CD config setup */
-	hash_cd_ctrl->hash_cfg_offset = ((char *)hash - (char *)cipher) >> 3;
+	hash_cd_ctrl->hash_cfg_offset = hash_offset >> 3;
 	hash_cd_ctrl->hash_flags = ICP_QAT_FW_AUTH_HDR_FLAG_NO_NESTED;
 	hash_cd_ctrl->inner_res_sz = digestsize;
 	hash_cd_ctrl->final_sz = digestsize;
@@ -505,8 +655,12 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 					>> 3);
 	auth_param->auth_res_sz = digestsize;
 
-
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_AUTH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
 		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
 				ICP_QAT_FW_SLICE_CIPHER);
 		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index 38dc956..ad06e85 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -90,16 +90,16 @@ void qat_crypto_sym_clear_session(struct rte_cryptodev *dev,
 static int
 qat_get_cmd_id(const struct rte_crypto_sym_xform *xform)
 {
-	if (xform->next == NULL)
-		return -1;
-
 	/* Cipher Only */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_CIPHER; */
+		return ICP_QAT_FW_LA_CMD_CIPHER;
 
 	/* Authentication Only */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_AUTH; */
+		return ICP_QAT_FW_LA_CMD_AUTH;
+
+	if (xform->next == NULL)
+		return -1;
 
 	/* Cipher then Authenticate */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER &&
@@ -139,31 +139,16 @@ qat_get_cipher_xform(struct rte_crypto_sym_xform *xform)
 
 	return NULL;
 }
-
-
 void *
-qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
 		struct rte_crypto_sym_xform *xform, void *session_private)
 {
 	struct qat_pmd_private *internals = dev->data->dev_private;
 
 	struct qat_session *session = session_private;
 
-	struct rte_crypto_auth_xform *auth_xform = NULL;
 	struct rte_crypto_cipher_xform *cipher_xform = NULL;
 
-	int qat_cmd_id;
-
-	PMD_INIT_FUNC_TRACE();
-
-	/* Get requested QAT command id */
-	qat_cmd_id = qat_get_cmd_id(xform);
-	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
-		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
-		goto error_out;
-	}
-	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
-
 	/* Get cipher xform from crypto xform chain */
 	cipher_xform = qat_get_cipher_xform(xform);
 
@@ -205,8 +190,87 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 	else
 		session->qat_dir = ICP_QAT_HW_CIPHER_DECRYPT;
 
+	if (qat_alg_aead_session_create_content_desc_cipher(session,
+						cipher_xform->key.data,
+						cipher_xform->key.length))
+		goto error_out;
+
+	return session;
+
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
+}
+
+
+void *
+qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
+		struct rte_crypto_sym_xform *xform, void *session_private)
+{
+	struct qat_pmd_private *internals = dev->data->dev_private;
+
+	struct qat_session *session = session_private;
+
+	int qat_cmd_id;
+
+	PMD_INIT_FUNC_TRACE();
+
+	/* Get requested QAT command id */
+	qat_cmd_id = qat_get_cmd_id(xform);
+	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
+		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
+		goto error_out;
+	}
+	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
+	switch (session->qat_cmd) {
+	case ICP_QAT_FW_LA_CMD_CIPHER:
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_AUTH:
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_CIPHER_HASH:
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_HASH_CIPHER:
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_TRNG_GET_RANDOM:
+	case ICP_QAT_FW_LA_CMD_TRNG_TEST:
+	case ICP_QAT_FW_LA_CMD_SSL3_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_1_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_2_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_MGF1:
+	case ICP_QAT_FW_LA_CMD_AUTH_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_CIPHER_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_DELIMITER:
+	PMD_DRV_LOG(ERR, "Unsupported Service %u",
+		session->qat_cmd);
+		goto error_out;
+	default:
+	PMD_DRV_LOG(ERR, "Unsupported Service %u",
+		session->qat_cmd);
+		goto error_out;
+	}
+	return session;
+
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
+}
+
+struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_sym_xform *xform,
+				struct qat_session *session_private)
+{
 
-	/* Get authentication xform from Crypto xform chain */
+	struct qat_pmd_private *internals = dev->data->dev_private;
+	struct qat_session *session = session_private;
+	struct rte_crypto_auth_xform *auth_xform = NULL;
+	struct rte_crypto_cipher_xform *cipher_xform = NULL;
 	auth_xform = qat_get_auth_xform(xform);
 
 	switch (auth_xform->algo) {
@@ -250,17 +314,26 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 				auth_xform->algo);
 		goto error_out;
 	}
+	cipher_xform = qat_get_cipher_xform(xform);
 
-	if (qat_alg_aead_session_create_content_desc(session,
-		cipher_xform->key.data,
-		cipher_xform->key.length,
-		auth_xform->key.data,
-		auth_xform->key.length,
-		auth_xform->add_auth_data_length,
-		auth_xform->digest_length))
-		goto error_out;
-
-	return (struct rte_crypto_sym_session *)session;
+	if ((session->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_128) ||
+			(session->qat_hash_alg ==
+				ICP_QAT_HW_AUTH_ALGO_GALOIS_64))  {
+		if (qat_alg_aead_session_create_content_desc_auth(session,
+				cipher_xform->key.data,
+				cipher_xform->key.length,
+				auth_xform->add_auth_data_length,
+				auth_xform->digest_length))
+			goto error_out;
+	} else {
+		if (qat_alg_aead_session_create_content_desc_auth(session,
+				auth_xform->key.data,
+				auth_xform->key.length,
+				auth_xform->add_auth_data_length,
+				auth_xform->digest_length))
+			goto error_out;
+	}
+	return session;
 
 error_out:
 	rte_mempool_put(internals->sess_mp, session);
diff --git a/drivers/crypto/qat/qat_crypto.h b/drivers/crypto/qat/qat_crypto.h
index 9323383..0afe74e 100644
--- a/drivers/crypto/qat/qat_crypto.h
+++ b/drivers/crypto/qat/qat_crypto.h
@@ -111,6 +111,16 @@ extern void *
 qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 		struct rte_crypto_sym_xform *xform, void *session_private);
 
+struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_sym_xform *xform,
+				struct qat_session *session_private);
+
+void *
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
+		struct rte_crypto_sym_xform *xform, void *session_private);
+
+
 extern void
 qat_crypto_sym_clear_session(struct rte_cryptodev *dev, void *session);
 
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread
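
A short usage note on the refactor above (illustrative only, not part of the patch): with the dispatch on session->qat_cmd, a single-element xform chain selects the new cipher-only or auth-only helper, while a two-element chain still runs both helpers in the order implied by the chain. Below is a minimal application-side sketch of the three shapes, using the same cryptodev calls that the tests later in this series use; dev_id, the key buffers and the key/digest/AAD lengths shown are placeholders.

#include <string.h>
#include <rte_cryptodev.h>

/* Sketch only: dev_id, cipher_key and hash_key are placeholders. */
static void *
make_snow3g_session(uint8_t dev_id, uint8_t *cipher_key, uint8_t *hash_key,
		int chain_cipher_then_auth)
{
	struct rte_crypto_sym_xform cipher_xform, auth_xform;

	memset(&cipher_xform, 0, sizeof(cipher_xform));
	memset(&auth_xform, 0, sizeof(auth_xform));

	/* A lone CIPHER xform (.next == NULL) maps to ICP_QAT_FW_LA_CMD_CIPHER,
	 * so only qat_crypto_sym_configure_session_cipher() runs. */
	cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
	cipher_xform.next = NULL;
	cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
	cipher_xform.cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
	cipher_xform.cipher.key.data = cipher_key;
	cipher_xform.cipher.key.length = 16;

	/* A lone AUTH xform maps to ICP_QAT_FW_LA_CMD_AUTH, so only
	 * qat_crypto_sym_configure_session_auth() runs. */
	auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
	auth_xform.next = NULL;
	auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
	auth_xform.auth.op = RTE_CRYPTO_AUTH_OP_GENERATE;
	auth_xform.auth.key.data = hash_key;
	auth_xform.auth.key.length = 16;
	auth_xform.auth.digest_length = 4;
	auth_xform.auth.add_auth_data_length = 16;

	/* Chaining cipher -> auth maps to ICP_QAT_FW_LA_CMD_CIPHER_HASH and both
	 * helpers run; auth -> cipher would give ICP_QAT_FW_LA_CMD_HASH_CIPHER. */
	if (chain_cipher_then_auth)
		cipher_xform.next = &auth_xform;

	return rte_cryptodev_sym_session_create(dev_id, &cipher_xform);
}
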

* [dpdk-dev] [PATCH v3 2/3] qat: add support for Snow3G
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
  2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
@ 2016-03-03 13:01   ` Deepak Kumar JAIN
  2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
                     ` (3 subsequent siblings)
  5 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-03 13:01 UTC (permalink / raw)
  To: dev

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 doc/guides/cryptodevs/qat.rst                    |  8 ++-
 doc/guides/rel_notes/release_16_04.rst           |  6 ++
 drivers/crypto/qat/qat_adf/qat_algs.h            |  1 +
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 86 +++++++++++++++++++++---
 drivers/crypto/qat/qat_crypto.c                  | 12 +++-
 5 files changed, 100 insertions(+), 13 deletions(-)

diff --git a/doc/guides/cryptodevs/qat.rst b/doc/guides/cryptodevs/qat.rst
index 23402b4..af52047 100644
--- a/doc/guides/cryptodevs/qat.rst
+++ b/doc/guides/cryptodevs/qat.rst
@@ -1,5 +1,5 @@
 ..  BSD LICENSE
-    Copyright(c) 2015 Intel Corporation. All rights reserved.
+    Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
 
     Redistribution and use in source and binary forms, with or without
     modification, are permitted provided that the following conditions
@@ -47,6 +47,7 @@ Cipher algorithms:
 * ``RTE_CRYPTO_SYM_CIPHER_AES128_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES192_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES256_CBC``
+* ``RTE_CRYPTO_SYM_CIPHER_SNOW3G_UEA2``
 
 Hash algorithms:
 
@@ -54,14 +55,15 @@ Hash algorithms:
 * ``RTE_CRYPTO_AUTH_SHA256_HMAC``
 * ``RTE_CRYPTO_AUTH_SHA512_HMAC``
 * ``RTE_CRYPTO_AUTH_AES_XCBC_MAC``
+* ``RTE_CRYPTO_AUTH_SNOW3G_UIA2``
 
 
 Limitations
 -----------
 
 * Chained mbufs are not supported.
-* Hash only is not supported.
-* Cipher only is not supported.
+* Hash only is not supported, except for Snow3G UIA2.
+* Cipher only is not supported, except for Snow3G UEA2.
 * Only in-place is currently supported (destination address is the same as source address).
 * Only supports the session-oriented API implementation (session-less APIs are not supported).
 * Not performance tuned.
diff --git a/doc/guides/rel_notes/release_16_04.rst b/doc/guides/rel_notes/release_16_04.rst
index 64e913d..d8ead62 100644
--- a/doc/guides/rel_notes/release_16_04.rst
+++ b/doc/guides/rel_notes/release_16_04.rst
@@ -35,6 +35,12 @@ This section should contain new features added in this release. Sample format:
 
   Refer to the previous release notes for examples.
 
+* **Added Snow3G (UEA2 and UIA2) support for Intel Quick Assist devices.**
+
+  Enabled support for the Snow3G wireless algorithms on Intel Quick Assist
+  devices. Cipher-only and hash-only operations are supported, as well as
+  algorithm-chaining operations.
+
 * **Enabled bulk allocation of mbufs.**
 
   A new function ``rte_pktmbuf_alloc_bulk()`` has been added to allow the user
diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index b73a5d0..b47dbc2 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -125,5 +125,6 @@ void qat_alg_ablkcipher_init_dec(struct qat_alg_ablkcipher_cd *cd,
 					unsigned int keylen);
 
 int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
 
 #endif
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index bef444b..dd27476 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -376,7 +376,8 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 
 	PMD_INIT_FUNC_TRACE();
 
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER &&
+		cdesc->qat_hash_alg != ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
 		cipher =
 		    (struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
 				sizeof(struct icp_qat_hw_auth_algo_blk));
@@ -409,13 +410,20 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 	else
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
 	/* For Snow3G, set key convert and other bits */
 	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
 					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
-		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+		if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER)  {
+			ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_RET_AUTH_RES);
+			ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+		}
 	}
 
 	cipher->aes.cipher_config.val =
@@ -431,7 +439,6 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 	/* Request template setup */
 	qat_alg_init_common_hdr(header);
 	header->service_cmd_id = cdesc->qat_cmd;
-
 	ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
 					ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
 	/* Configure the common header protocol flags */
@@ -447,6 +454,10 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 		cipher_cd_ctrl->cipher_state_sz =
 			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ >> 3;
 		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+		if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+			ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_DIGEST_IN_BUFFER);
+		}
 	} else {
 		cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
 		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
@@ -492,6 +503,7 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 {
 	struct icp_qat_hw_cipher_algo_blk *cipher;
 	struct icp_qat_hw_auth_algo_blk *hash;
+	struct icp_qat_hw_cipher_algo_blk *cipherconfig;
 	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
 	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
 	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
@@ -510,7 +522,8 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 
 	PMD_INIT_FUNC_TRACE();
 
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER &&
+		cdesc->qat_hash_alg != ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
 		hash = (struct icp_qat_hw_auth_algo_blk *)&cdesc->cd;
 		cipher =
 		(struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
@@ -549,11 +562,13 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 	else
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
 	cipher->aes.cipher_config.val =
 	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
 					cdesc->qat_cipher_alg, key_convert,
 					cdesc->qat_dir);
-	memcpy(cipher->aes.key, authkey, authkeylen);
 
 	hash->sha.inner_setup.auth_config.reserved = 0;
 	hash->sha.inner_setup.auth_config.config =
@@ -561,6 +576,22 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 				cdesc->qat_hash_alg, digestsize);
 	hash->sha.inner_setup.auth_counter.counter =
 		rte_bswap32(qat_hash_get_block_size(cdesc->qat_hash_alg));
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)  {
+		hash->sha.inner_setup.auth_counter.counter = 0;
+		hash->sha.outer_setup.auth_config.reserved = 0;
+		cipherconfig = (struct icp_qat_hw_cipher_algo_blk *)
+				((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk)
+				+ 16);
+		cipherconfig->aes.cipher_config.val =
+		ICP_QAT_HW_CIPHER_CONFIG_BUILD(ICP_QAT_HW_CIPHER_ECB_MODE,
+			ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2,
+			ICP_QAT_HW_CIPHER_KEY_CONVERT,
+			ICP_QAT_HW_CIPHER_ENCRYPT);
+		memcpy(cipherconfig->aes.key, authkey, authkeylen);
+		memset(cipherconfig->aes.key + authkeylen, 0,
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ);
+	}
 
 	/* Do precomputes */
 	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_AES_XCBC_MAC) {
@@ -587,6 +618,9 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 					ICP_QAT_HW_GALOIS_H_SZ]) =
 			rte_bswap32(add_auth_data_length);
 		proto = ICP_QAT_FW_LA_GCM_PROTO;
+	} else if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)  {
+		proto = ICP_QAT_FW_LA_SNOW_3G_PROTO;
+		state1_size = qat_hash_get_state1_size(cdesc->qat_hash_alg);
 	} else {
 		if (qat_alg_do_precomputes(cdesc->qat_hash_alg,
 			authkey, authkeylen, (uint8_t *)(hash->sha.state1),
@@ -606,10 +640,25 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
 	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
 
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_AUTH)  {
+		ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
+		ICP_QAT_FW_LA_CIPH_IV_FLD_FLAG_SET(header->serv_specif_flags,
+			ICP_QAT_FW_CIPH_IV_64BIT_PTR);
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	}
+
 	/* Cipher CD config setup */
-	cipher_cd_ctrl->cipher_key_sz = authkeylen >> 3;
-	cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
-	cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	if (cdesc->qat_cmd != ICP_QAT_FW_LA_CMD_AUTH) {
+		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	} else {
+		cipher_cd_ctrl->cipher_state_sz = 0;
+		cipher_cd_ctrl->cipher_cfg_offset = 0;
+	}
 
 	/* Auth CD config setup */
 	hash_cd_ctrl->hash_cfg_offset = hash_offset >> 3;
@@ -644,6 +693,13 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 		hash_cd_ctrl->inner_state1_sz = ICP_QAT_HW_GALOIS_128_STATE1_SZ;
 		memset(hash->sha.state1, 0, ICP_QAT_HW_GALOIS_128_STATE1_SZ);
 		break;
+	case ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2:
+		hash_cd_ctrl->inner_state2_sz =
+				ICP_QAT_HW_SNOW_3G_UIA2_STATE2_SZ;
+		hash_cd_ctrl->inner_state1_sz =
+				ICP_QAT_HW_SNOW_3G_UIA2_STATE1_SZ;
+		memset(hash->sha.state1, 0, ICP_QAT_HW_SNOW_3G_UIA2_STATE1_SZ);
+		break;
 	default:
 		PMD_DRV_LOG(ERR, "invalid HASH alg %u", cdesc->qat_hash_alg);
 		return -EFAULT;
@@ -753,3 +809,15 @@ int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
 	}
 	return 0;
 }
+
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
+{
+	switch (key_len) {
+	case ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ:
+		*alg = ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2;
+		break;
+	default:
+		return -EINVAL;
+	}
+	return 0;
+}
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index ad06e85..cb16aae 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -169,6 +169,14 @@ qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
 		}
 		session->qat_mode = ICP_QAT_HW_CIPHER_CTR_MODE;
 		break;
+	case RTE_CRYPTO_CIPHER_SNOW3G_UEA2:
+		if (qat_alg_validate_snow3g_key(cipher_xform->key.length,
+					&session->qat_cipher_alg) != 0) {
+			PMD_DRV_LOG(ERR, "Invalid SNOW3G cipher key size");
+			goto error_out;
+		}
+		session->qat_mode = ICP_QAT_HW_CIPHER_ECB_MODE;
+		break;
 	case RTE_CRYPTO_CIPHER_NULL:
 	case RTE_CRYPTO_CIPHER_3DES_ECB:
 	case RTE_CRYPTO_CIPHER_3DES_CBC:
@@ -290,6 +298,9 @@ qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
 	case RTE_CRYPTO_AUTH_AES_GMAC:
 		session->qat_hash_alg = ICP_QAT_HW_AUTH_ALGO_GALOIS_128;
 		break;
+	case RTE_CRYPTO_AUTH_SNOW3G_UIA2:
+		session->qat_hash_alg = ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2;
+		break;
 	case RTE_CRYPTO_AUTH_NULL:
 	case RTE_CRYPTO_AUTH_SHA1:
 	case RTE_CRYPTO_AUTH_SHA256:
@@ -302,7 +313,6 @@ qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
 	case RTE_CRYPTO_AUTH_MD5_HMAC:
 	case RTE_CRYPTO_AUTH_AES_CCM:
 	case RTE_CRYPTO_AUTH_KASUMI_F9:
-	case RTE_CRYPTO_AUTH_SNOW3G_UIA2:
 	case RTE_CRYPTO_AUTH_AES_CMAC:
 	case RTE_CRYPTO_AUTH_AES_CBC_MAC:
 	case RTE_CRYPTO_AUTH_ZUC_EIA3:
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread
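
To illustrate the cipher-only path documented above (in-place only, session-based), here is a sketch of how a Snow3G UEA2 operation is assembled before enqueue, following the same steps as the tests added in the next patch. The queue pair id, operation mempool, mbuf and 16-byte IV are placeholders, and the final call is the standard cryptodev enqueue burst; this is an illustration, not part of the patch.

#include <string.h>
#include <rte_crypto.h>
#include <rte_cryptodev.h>
#include <rte_mbuf.h>

/* Sketch only: sess is a Snow3G UEA2 session created as in the previous
 * patch, m already holds data_len bytes of payload and iv points to a
 * 16-byte Snow3G IV. */
static int
enqueue_snow3g_cipher_op(uint8_t dev_id, uint16_t qp_id,
		struct rte_mempool *op_mpool,
		struct rte_cryptodev_sym_session *sess,
		struct rte_mbuf *m, const uint8_t *iv, uint32_t data_len)
{
	const unsigned iv_pad_len = 16;	/* IV region padded to 16 bytes */
	struct rte_crypto_op *op;
	struct rte_crypto_sym_op *sym_op;
	uint8_t *iv_ptr;

	op = rte_crypto_op_alloc(op_mpool, RTE_CRYPTO_OP_TYPE_SYMMETRIC);
	if (op == NULL)
		return -1;

	rte_crypto_op_attach_sym_session(op, sess);
	sym_op = op->sym;
	sym_op->m_src = m;	/* in-place: no separate destination mbuf */

	/* Carry the IV in the mbuf headroom, in front of the payload. */
	iv_ptr = (uint8_t *)rte_pktmbuf_prepend(m, iv_pad_len);
	if (iv_ptr == NULL)
		return -1;
	memset(iv_ptr, 0, iv_pad_len);
	memcpy(iv_ptr, iv, 16);
	sym_op->cipher.iv.data = iv_ptr;
	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(m);
	sym_op->cipher.iv.length = iv_pad_len;

	/* The payload to be ciphered starts right after the IV. */
	sym_op->cipher.data.offset = iv_pad_len;
	sym_op->cipher.data.length = data_len;

	return rte_cryptodev_enqueue_burst(dev_id, qp_id, &op, 1) == 1 ? 0 : -1;
}
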

* [dpdk-dev] [PATCH v3 3/3] app/test: add Snow3G tests
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
  2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
  2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 2/3] qat: add support for Snow3G Deepak Kumar JAIN
@ 2016-03-03 13:01   ` Deepak Kumar JAIN
  2016-03-07 13:55   ` [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist Devices De Lara Guarch, Pablo
                     ` (2 subsequent siblings)
  5 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-03 13:01 UTC (permalink / raw)
  To: dev

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 4 files changed, 1831 insertions(+), 3 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c
index acba98a..a37018c 100644
--- a/app/test/test_cryptodev.c
+++ b/app/test/test_cryptodev.c
@@ -42,7 +42,8 @@
 
 #include "test.h"
 #include "test_cryptodev.h"
-
+#include "test_cryptodev_snow3g_test_vectors.h"
+#include "test_cryptodev_snow3g_hash_test_vectors.h"
 static enum rte_cryptodev_type gbl_cryptodev_type;
 
 struct crypto_testsuite_params {
@@ -68,6 +69,9 @@ struct crypto_unittest_params {
 	uint8_t *digest;
 };
 
+#define ALIGN_POW2_ROUNDUP(num, align) \
+	(((num) + (align) - 1) & ~((align) - 1))
+
 /*
  * Forward declarations.
  */
@@ -1748,6 +1752,997 @@ test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify(void)
 	return TEST_SUCCESS;
 }
 
+/* ***** Snow3G Tests ***** */
+static int
+create_snow3g_hash_session(uint8_t dev_id,
+	const uint8_t *key, const uint8_t key_len,
+	const uint8_t aad_len, const uint8_t auth_len,
+	enum rte_crypto_auth_operation op)
+{
+	uint8_t hash_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(hash_key, key, key_len);
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.next = NULL;
+
+	ut_params->auth_xform.auth.op = op;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	ut_params->auth_xform.auth.key.data = hash_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->auth_xform);
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+static int
+create_snow3g_cipher_session(uint8_t dev_id,
+			enum rte_crypto_cipher_operation op,
+			const uint8_t *key, const uint8_t key_len)
+{
+	uint8_t cipher_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(cipher_key, key, key_len);
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = NULL;
+
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = op;
+	ut_params->cipher_xform.cipher.key.data = cipher_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session */
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+						&ut_params->
+						cipher_xform);
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+
+static int
+create_snow3g_cipher_operation(const uint8_t *iv, const unsigned iv_len,
+			const unsigned data_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+	unsigned iv_pad_len = 0;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+				RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+				"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(ut_params->ibuf
+			, iv_pad_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = iv_pad_len;
+	return 0;
+}
+
+
+static int
+create_snow3g_cipher_auth_session(uint8_t dev_id,
+		enum rte_crypto_cipher_operation cipher_op,
+		enum rte_crypto_auth_operation auth_op,
+		const uint8_t *key, const uint8_t key_len,
+		const uint8_t aad_len, const uint8_t auth_len)
+{
+	uint8_t cipher_auth_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(cipher_auth_key, key, key_len);
+
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.next = NULL;
+
+	ut_params->auth_xform.auth.op = auth_op;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	/* Hash key = cipher key */
+	ut_params->auth_xform.auth.key.data = cipher_auth_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = &ut_params->auth_xform;
+
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = cipher_op;
+	ut_params->cipher_xform.cipher.key.data = cipher_auth_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session*/
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->cipher_xform);
+
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+
+static int
+create_snow3g_auth_cipher_session(uint8_t dev_id,
+		enum rte_crypto_cipher_operation cipher_op,
+		enum rte_crypto_auth_operation auth_op,
+		const uint8_t *key, const uint8_t key_len,
+		const uint8_t aad_len, const uint8_t auth_len)
+{
+	uint8_t auth_cipher_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(auth_cipher_key, key, key_len);
+
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.auth.op = auth_op;
+	ut_params->auth_xform.next = &ut_params->cipher_xform;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	ut_params->auth_xform.auth.key.data = auth_cipher_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = NULL;
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = cipher_op;
+	ut_params->cipher_xform.cipher.key.data = auth_cipher_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session*/
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->auth_xform);
+
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+
+	return 0;
+}
+
+static int
+create_snow3g_hash_operation(const uint8_t *auth_tag,
+		const unsigned auth_tag_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len,
+		enum rte_crypto_auth_operation op)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned aad_buffer_len;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+		"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* aad */
+	/*
+	* Always allocate the aad up to the block size.
+	* The cryptodev API calls out -
+	*  - the array must be big enough to hold the AAD, plus any
+	*   space to round this up to the nearest multiple of the
+	*   block size (16 bytes).
+	*/
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+	sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend(
+			ut_params->ibuf, aad_buffer_len);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+					"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+			ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+					ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+				"no room to append auth tag");
+	ut_params->digest = sym_op->auth.digest.data;
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len + aad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+	if (op == RTE_CRYPTO_AUTH_OP_GENERATE)
+		memset(sym_op->auth.digest.data, 0, auth_tag_len);
+	else
+		rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "digest:",
+		sym_op->auth.digest.data,
+		sym_op->auth.digest.length);
+#endif
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len;
+
+	return 0;
+}
+
+static int
+create_snow3g_cipher_hash_operation(const uint8_t *auth_tag,
+		const unsigned auth_tag_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len,
+		enum rte_crypto_auth_operation op,
+		const uint8_t *iv, const unsigned iv_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned iv_pad_len = 0;
+	unsigned aad_buffer_len;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+			"Failed to allocate pktmbuf offload");
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(
+		ut_params->ibuf, iv_pad_len);
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = iv_pad_len;
+
+	/* aad */
+	/*
+	* Always allocate the aad up to the block size.
+	* The cryptodev API calls out -
+	*  - the array must be big enough to hold the AAD, plus any
+	*   space to round this up to the nearest multiple of the
+	*   block size (16 bytes).
+	*/
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+
+	sym_op->auth.aad.data =
+			(uint8_t *)rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+			"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+			ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+			ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+			"no room to append auth tag");
+	ut_params->digest = sym_op->auth.digest.data;
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len + aad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+	if (op == RTE_CRYPTO_AUTH_OP_GENERATE)
+		memset(sym_op->auth.digest.data, 0, auth_tag_len);
+	else
+		rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len);
+
+	#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "digest:",
+		sym_op->auth.digest.data,
+		sym_op->auth.digest.length);
+	#endif
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len;
+
+	return 0;
+}
+
+static int
+create_snow3g_auth_cipher_operation(const unsigned auth_tag_len,
+		const uint8_t *iv, const unsigned iv_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned iv_pad_len = 0;
+	unsigned aad_buffer_len = 0;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+			"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+			ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+			"no room to append auth tag");
+
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+
+	memset(sym_op->auth.digest.data, 0, auth_tag_len);
+
+	#ifdef RTE_APP_TEST_DEBUG
+		rte_hexdump(stdout, "digest:",
+			sym_op->auth.digest.data,
+			sym_op->auth.digest.length);
+	#endif
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(
+		ut_params->ibuf, iv_pad_len);
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+
+	/* aad */
+	/*
+	* Always allocate the aad up to the block size.
+	* The cryptodev API calls out -
+	*  - the array must be big enough to hold the AAD, plus any
+	*   space to round this up to the nearest multiple of the
+	*   block size (16 bytes).
+	*/
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+
+	sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend(
+	ut_params->ibuf, aad_buffer_len);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+				"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+				ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len;
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len + iv_pad_len;
+
+	return 0;
+}
+
+static int
+test_snow3g_authentication(const struct snow3g_hash_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+	unsigned plaintext_pad_len;
+	uint8_t *plaintext;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_hash_session(ts_params->valid_devs[0],
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len,
+			RTE_CRYPTO_AUTH_OP_GENERATE);
+	if (retval < 0)
+		return retval;
+
+	/* alloc mbuf and set payload */
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple */
+	/* of the algorithm's block size */
+	plaintext_pad_len = tdata->plaintext.len;
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+				plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_hash_operation(NULL, tdata->digest.len,
+			tdata->aad.data, tdata->aad.len, tdata->plaintext.len,
+			plaintext_pad_len, RTE_CRYPTO_AUTH_OP_GENERATE);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+				ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+			+ plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+	ut_params->digest,
+	tdata->digest.data,
+	DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+	"Snow3G Generated auth tag not as expected");
+
+	return 0;
+}
+
+static int
+test_snow3g_authentication_verify(const struct snow3g_hash_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+	unsigned plaintext_pad_len;
+	uint8_t *plaintext;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_hash_session(ts_params->valid_devs[0],
+				tdata->key.data, tdata->key.len,
+				tdata->aad.len, tdata->digest.len,
+				RTE_CRYPTO_AUTH_OP_VERIFY);
+	if (retval < 0)
+		return retval;
+	/* alloc mbuf and set payload */
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple */
+	/* of the algorithm's block size */
+	plaintext_pad_len = tdata->plaintext.len;
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+					plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_hash_operation(tdata->digest.data,
+			tdata->digest.len,
+			tdata->aad.data, tdata->aad.len,
+			tdata->plaintext.len, plaintext_pad_len,
+			RTE_CRYPTO_AUTH_OP_VERIFY);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+				ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	if (ut_params->op->status == RTE_CRYPTO_OP_STATUS_SUCCESS)
+		return 0;
+	else
+		return -1;
+
+	return 0;
+}
+
+
+static int
+test_snow3g_hash_generate_test_case_1(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_1);
+}
+
+static int
+test_snow3g_hash_generate_test_case_2(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_2);
+}
+
+static int
+test_snow3g_hash_generate_test_case_3(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_3);
+}
+
+static int
+test_snow3g_hash_verify_test_case_1(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_1);
+
+}
+
+static int
+test_snow3g_hash_verify_test_case_2(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_2);
+}
+
+static int
+test_snow3g_hash_verify_test_case_3(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_3);
+}
+
+static int
+test_snow3g_encryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithm's block size
+	 */
+	plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16);
+
+	plaintext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->plaintext.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		ciphertext = plaintext;
+
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ciphertext,
+		tdata->ciphertext.data,
+		tdata->ciphertext.len,
+		"Snow3G Ciphertext data not as expected");
+	return 0;
+}
+
+static int test_snow3g_decryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t ciphertext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_DECRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithm's block size
+	 */
+	ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16);
+
+	ciphertext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						ciphertext_pad_len);
+	memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->ciphertext.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		plaintext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		plaintext = ciphertext;
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(plaintext,
+				tdata->plaintext.data,
+				tdata->plaintext.len,
+				"Snow3G Plaintext data not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_authenticated_encryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_auth_session(ts_params->valid_devs[0],
+			RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+			RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len);
+	if (retval < 0)
+		return retval;
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple */
+	/* of the algorithm's block size */
+	plaintext_pad_len = tdata->plaintext.len;
+
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+			plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_hash_operation(tdata->digest.data,
+			tdata->digest.len, tdata->aad.data,
+			tdata->aad.len, tdata->plaintext.len,
+			plaintext_pad_len, RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->iv.data, tdata->iv.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+			ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		ciphertext = plaintext;
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8-lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+			ciphertext,
+			tdata->ciphertext.data,
+			tdata->ciphertext.len,
+			"Snow3G Ciphertext data not as expected");
+
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+	    + plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+			ut_params->digest,
+			tdata->digest.data,
+			DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+			"Snow3G Generated auth tag not as expected");
+	return 0;
+}
+static int
+test_snow3g_encrypted_authentication(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	uint8_t plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_auth_cipher_session(ts_params->valid_devs[0],
+			RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+			RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/* Append data which is padded to a multiple */
+	/* of the algorithm's block size */
+	plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 8);
+
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+			plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_auth_cipher_operation(
+		tdata->digest.len,
+		tdata->iv.data, tdata->iv.len,
+		tdata->aad.data, tdata->aad.len,
+		tdata->plaintext.len, plaintext_pad_len
+	);
+
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+			ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->aad.len + tdata->iv.len;
+	else
+		ciphertext = plaintext;
+
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8-lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+			+ plaintext_pad_len + tdata->aad.len + tdata->iv.len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ciphertext,
+		tdata->ciphertext.data,
+		tdata->ciphertext.len,
+		"Snow3G Ciphertext data not as expected");
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ut_params->digest,
+		tdata->digest.data,
+		DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+		"Snow3G Generated auth tag not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_encryption_test_case_1(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_encryption_test_case_2(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_encryption_test_case_3(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_encryption_test_case_4(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_encryption_test_case_5(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_5);
+}
+
+static int
+test_snow3g_decryption_test_case_1(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_decryption_test_case_2(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_decryption_test_case_3(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_decryption_test_case_4(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_decryption_test_case_5(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_5);
+}
+static int
+test_snow3g_authenticated_encryption_test_case_1(void)
+{
+	return test_snow3g_authenticated_encryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_encrypted_authentication_test_case_1(void)
+{
+	return test_snow3g_encrypted_authentication(&snow3g_test_case_6);
+}
 
 /* ***** AES-GCM Tests ***** */
 
@@ -1999,9 +2994,47 @@ static struct unit_test_suite cryptodev_qat_testsuite  = {
 				test_AES_CBC_HMAC_AES_XCBC_encrypt_digest),
 		TEST_CASE_ST(ut_setup, ut_teardown,
 				test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify),
-
 		TEST_CASE_ST(ut_setup, ut_teardown, test_stats),
+		/** Snow3G encrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_5),
+
 
+		/** Snow3G decrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_5),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_authenticated_encryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encrypted_authentication_test_case_1),
 		TEST_CASES_END() /**< NULL terminate unit test array */
 	}
 };
diff --git a/app/test/test_cryptodev.h b/app/test/test_cryptodev.h
index c84ba42..b69649a 100644
--- a/app/test/test_cryptodev.h
+++ b/app/test/test_cryptodev.h
@@ -1,7 +1,7 @@
 /*-
  *   BSD LICENSE
  *
- *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *   Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
  *
  *   Redistribution and use in source and binary forms, with or without
  *   modification, are permitted provided that the following conditions
@@ -58,6 +58,7 @@
 #define DIGEST_BYTE_LENGTH_SHA384		(BYTE_LENGTH(384))
 #define DIGEST_BYTE_LENGTH_SHA512		(BYTE_LENGTH(512))
 #define DIGEST_BYTE_LENGTH_AES_XCBC		(BYTE_LENGTH(96))
+#define DIGEST_BYTE_LENGTH_SNOW3G_UIA2		(BYTE_LENGTH(32))
 #define AES_XCBC_MAC_KEY_SZ			(16)
 
 #define TRUNCATED_DIGEST_BYTE_LENGTH_SHA1		(12)
diff --git a/app/test/test_cryptodev_snow3g_hash_test_vectors.h b/app/test/test_cryptodev_snow3g_hash_test_vectors.h
new file mode 100644
index 0000000..f4fa36d
--- /dev/null
+++ b/app/test/test_cryptodev_snow3g_hash_test_vectors.h
@@ -0,0 +1,415 @@
+/*-
+ *   BSD LICENSE
+ *
+ *   Copyright(c) 2016 Intel Corporation. All rights reserved.
+ *
+ *   Redistribution and use in source and binary forms, with or without
+ *   modification, are permitted provided that the following conditions
+ *   are met:
+ *
+ *	 * Redistributions of source code must retain the above copyright
+ *	   notice, this list of conditions and the following disclaimer.
+ *	 * Redistributions in binary form must reproduce the above copyright
+ *	   notice, this list of conditions and the following disclaimer in
+ *	   the documentation and/or other materials provided with the
+ *	   distribution.
+ *	 * Neither the name of Intel Corporation nor the names of its
+ *	   contributors may be used to endorse or promote products derived
+ *	   from this software without specific prior written permission.
+ *
+ *   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ *   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ *   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ *   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ *   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ *   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ *   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ *   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ *   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ *   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ *   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#ifndef TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_
+#define TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_
+
+struct snow3g_hash_test_data {
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} key;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} aad;
+
+	struct {
+		uint8_t data[2056];
+		unsigned len;
+	} plaintext;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} digest;
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_1 = {
+	.key = {
+		.data = {
+			0xC7, 0x36, 0xC6, 0xAA, 0xB2, 0x2B, 0xFF, 0xF9,
+			0x1E, 0x26, 0x98, 0xD2, 0xE2, 0x2A, 0xD5, 0x7E
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xD0, 0xA7, 0xD4, 0x63, 0xDF, 0x9F, 0xB2, 0xB2,
+			0x78, 0x83, 0x3F, 0xA0, 0x2E, 0x23, 0x5A, 0xA1,
+			0x72, 0xBD, 0x97, 0x0C, 0x14, 0x73, 0xE1, 0x29,
+			0x07, 0xFB, 0x64, 0x8B, 0x65, 0x99, 0xAA, 0xA0,
+			0xB2, 0x4A, 0x03, 0x86, 0x65, 0x42, 0x2B, 0x20,
+			0xA4, 0x99, 0x27, 0x6A, 0x50, 0x42, 0x70, 0x09
+		},
+		.len = 48
+	},
+	.digest = {
+		.data = {0x38, 0xB5, 0x54, 0xC0 },
+		.len  = 4
+	}
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_2 = {
+	.key = {
+		.data = {
+			0xF4, 0xEB, 0xEC, 0x69, 0xE7, 0x3E, 0xAF, 0x2E,
+			0xB2, 0xCF, 0x6A, 0xF4, 0xB3, 0x12, 0x0F, 0xFD
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x29, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0x77, 0x37,
+			0xA9, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0xF7, 0x37
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x10, 0xBF, 0xFF, 0x83, 0x9E, 0x0C, 0x71, 0x65,
+			0x8D, 0xBB, 0x2D, 0x17, 0x07, 0xE1, 0x45, 0x72,
+			0x4F, 0x41, 0xC1, 0x6F, 0x48, 0xBF, 0x40, 0x3C,
+			0x3B, 0x18, 0xE3, 0x8F, 0xD5, 0xD1, 0x66, 0x3B,
+			0x6F, 0x6D, 0x90, 0x01, 0x93, 0xE3, 0xCE, 0xA8,
+			0xBB, 0x4F, 0x1B, 0x4F, 0x5B, 0xE8, 0x22, 0x03,
+			0x22, 0x32, 0xA7, 0x8D, 0x7D, 0x75, 0x23, 0x8D,
+			0x5E, 0x6D, 0xAE, 0xCD, 0x3B, 0x43, 0x22, 0xCF,
+			0x59, 0xBC, 0x7E, 0xA8, 0x4A, 0xB1, 0x88, 0x11,
+			0xB5, 0xBF, 0xB7, 0xBC, 0x55, 0x3F, 0x4F, 0xE4,
+			0x44, 0x78, 0xCE, 0x28, 0x7A, 0x14, 0x87, 0x99,
+			0x90, 0xD1, 0x8D, 0x12, 0xCA, 0x79, 0xD2, 0xC8,
+			0x55, 0x14, 0x90, 0x21, 0xCD, 0x5C, 0xE8, 0xCA,
+			0x03, 0x71, 0xCA, 0x04, 0xFC, 0xCE, 0x14, 0x3E,
+			0x3D, 0x7C, 0xFE, 0xE9, 0x45, 0x85, 0xB5, 0x88,
+			0x5C, 0xAC, 0x46, 0x06, 0x8B
+		},
+	.len = 125
+	},
+	.digest = {
+		.data = {0x06, 0x17, 0x45, 0xAE},
+		.len  = 4
+	}
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_3 = {
+	.key = {
+		.data = {
+			0xB3, 0x12, 0x0F, 0xFD, 0xB2, 0xCF, 0x6A, 0xF4,
+			0xE7, 0x3E, 0xAF, 0x2E, 0xF4, 0xEB, 0xEC, 0x69
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x29, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0x77, 0x37,
+			0xA9, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0xF7, 0x37
+		},
+	.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+			0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01,
+			0xE0, 0x95, 0x80, 0x45, 0xF3, 0xA0, 0xBB, 0xA4,
+			0xE3, 0x96, 0x83, 0x46, 0xF0, 0xA3, 0xB8, 0xA7,
+			0xC0, 0x2A, 0x01, 0x8A, 0xE6, 0x40, 0x76, 0x52,
+			0x26, 0xB9, 0x87, 0xC9, 0x13, 0xE6, 0xCB, 0xF0,
+			0x83, 0x57, 0x00, 0x16, 0xCF, 0x83, 0xEF, 0xBC,
+			0x61, 0xC0, 0x82, 0x51, 0x3E, 0x21, 0x56, 0x1A,
+			0x42, 0x7C, 0x00, 0x9D, 0x28, 0xC2, 0x98, 0xEF,
+			0xAC, 0xE7, 0x8E, 0xD6, 0xD5, 0x6C, 0x2D, 0x45,
+			0x05, 0xAD, 0x03, 0x2E, 0x9C, 0x04, 0xDC, 0x60,
+			0xE7, 0x3A, 0x81, 0x69, 0x6D, 0xA6, 0x65, 0xC6,
+			0xC4, 0x86, 0x03, 0xA5, 0x7B, 0x45, 0xAB, 0x33,
+			0x22, 0x15, 0x85, 0xE6, 0x8E, 0xE3, 0x16, 0x91,
+			0x87, 0xFB, 0x02, 0x39, 0x52, 0x86, 0x32, 0xDD,
+			0x65, 0x6C, 0x80, 0x7E, 0xA3, 0x24, 0x8B, 0x7B,
+			0x46, 0xD0, 0x02, 0xB2, 0xB5, 0xC7, 0x45, 0x8E,
+			0xB8, 0x5B, 0x9C, 0xE9, 0x58, 0x79, 0xE0, 0x34,
+			0x08, 0x59, 0x05, 0x5E, 0x3B, 0x0A, 0xBB, 0xC3,
+			0xEA, 0xCE, 0x87, 0x19, 0xCA, 0xA8, 0x02, 0x65,
+			0xC9, 0x72, 0x05, 0xD5, 0xDC, 0x4B, 0xCC, 0x90,
+			0x2F, 0xE1, 0x83, 0x96, 0x29, 0xED, 0x71, 0x32,
+			0x8A, 0x0F, 0x04, 0x49, 0xF5, 0x88, 0x55, 0x7E,
+			0x68, 0x98, 0x86, 0x0E, 0x04, 0x2A, 0xEC, 0xD8,
+			0x4B, 0x24, 0x04, 0xC2, 0x12, 0xC9, 0x22, 0x2D,
+			0xA5, 0xBF, 0x8A, 0x89, 0xEF, 0x67, 0x97, 0x87,
+			0x0C, 0xF5, 0x07, 0x71, 0xA6, 0x0F, 0x66, 0xA2,
+			0xEE, 0x62, 0x85, 0x36, 0x57, 0xAD, 0xDF, 0x04,
+			0xCD, 0xDE, 0x07, 0xFA, 0x41, 0x4E, 0x11, 0xF1,
+			0x2B, 0x4D, 0x81, 0xB9, 0xB4, 0xE8, 0xAC, 0x53,
+			0x8E, 0xA3, 0x06, 0x66, 0x68, 0x8D, 0x88, 0x1F,
+			0x6C, 0x34, 0x84, 0x21, 0x99, 0x2F, 0x31, 0xB9,
+			0x4F, 0x88, 0x06, 0xED, 0x8F, 0xCC, 0xFF, 0x4C,
+			0x91, 0x23, 0xB8, 0x96, 0x42, 0x52, 0x7A, 0xD6,
+			0x13, 0xB1, 0x09, 0xBF, 0x75, 0x16, 0x74, 0x85,
+			0xF1, 0x26, 0x8B, 0xF8, 0x84, 0xB4, 0xCD, 0x23,
+			0xD2, 0x9A, 0x09, 0x34, 0x92, 0x57, 0x03, 0xD6,
+			0x34, 0x09, 0x8F, 0x77, 0x67, 0xF1, 0xBE, 0x74,
+			0x91, 0xE7, 0x08, 0xA8, 0xBB, 0x94, 0x9A, 0x38,
+			0x73, 0x70, 0x8A, 0xEF, 0x4A, 0x36, 0x23, 0x9E,
+			0x50, 0xCC, 0x08, 0x23, 0x5C, 0xD5, 0xED, 0x6B,
+			0xBE, 0x57, 0x86, 0x68, 0xA1, 0x7B, 0x58, 0xC1,
+			0x17, 0x1D, 0x0B, 0x90, 0xE8, 0x13, 0xA9, 0xE4,
+			0xF5, 0x8A, 0x89, 0xD7, 0x19, 0xB1, 0x10, 0x42,
+			0xD6, 0x36, 0x0B, 0x1B, 0x0F, 0x52, 0xDE, 0xB7,
+			0x30, 0xA5, 0x8D, 0x58, 0xFA, 0xF4, 0x63, 0x15,
+			0x95, 0x4B, 0x0A, 0x87, 0x26, 0x91, 0x47, 0x59,
+			0x77, 0xDC, 0x88, 0xC0, 0xD7, 0x33, 0xFE, 0xFF,
+			0x54, 0x60, 0x0A, 0x0C, 0xC1, 0xD0, 0x30, 0x0A,
+			0xAA, 0xEB, 0x94, 0x57, 0x2C, 0x6E, 0x95, 0xB0,
+			0x1A, 0xE9, 0x0D, 0xE0, 0x4F, 0x1D, 0xCE, 0x47,
+			0xF8, 0x7E, 0x8F, 0xA7, 0xBE, 0xBF, 0x77, 0xE1,
+			0xDB, 0xC2, 0x0D, 0x6B, 0xA8, 0x5C, 0xB9, 0x14,
+			0x3D, 0x51, 0x8B, 0x28, 0x5D, 0xFA, 0x04, 0xB6,
+			0x98, 0xBF, 0x0C, 0xF7, 0x81, 0x9F, 0x20, 0xFA,
+			0x7A, 0x28, 0x8E, 0xB0, 0x70, 0x3D, 0x99, 0x5C,
+			0x59, 0x94, 0x0C, 0x7C, 0x66, 0xDE, 0x57, 0xA9,
+			0xB7, 0x0F, 0x82, 0x37, 0x9B, 0x70, 0xE2, 0x03,
+			0x1E, 0x45, 0x0F, 0xCF, 0xD2, 0x18, 0x13, 0x26,
+			0xFC, 0xD2, 0x8D, 0x88, 0x23, 0xBA, 0xAA, 0x80,
+			0xDF, 0x6E, 0x0F, 0x44, 0x35, 0x59, 0x64, 0x75,
+			0x39, 0xFD, 0x89, 0x07, 0xC0, 0xFF, 0xD9, 0xD7,
+			0x9C, 0x13, 0x0E, 0xD8, 0x1C, 0x9A, 0xFD, 0x9B,
+			0x7E, 0x84, 0x8C, 0x9F, 0xED, 0x38, 0x44, 0x3D,
+			0x5D, 0x38, 0x0E, 0x53, 0xFB, 0xDB, 0x8A, 0xC8,
+			0xC3, 0xD3, 0xF0, 0x68, 0x76, 0x05, 0x4F, 0x12,
+			0x24, 0x61, 0x10, 0x7D, 0xE9, 0x2F, 0xEA, 0x09,
+			0xC6, 0xF6, 0x92, 0x3A, 0x18, 0x8D, 0x53, 0xAF,
+			0xE5, 0x4A, 0x10, 0xF6, 0x0E, 0x6E, 0x9D, 0x5A,
+			0x03, 0xD9, 0x96, 0xB5, 0xFB, 0xC8, 0x20, 0xF8,
+			0xA6, 0x37, 0x11, 0x6A, 0x27, 0xAD, 0x04, 0xB4,
+			0x44, 0xA0, 0x93, 0x2D, 0xD6, 0x0F, 0xBD, 0x12,
+			0x67, 0x1C, 0x11, 0xE1, 0xC0, 0xEC, 0x73, 0xE7,
+			0x89, 0x87, 0x9F, 0xAA, 0x3D, 0x42, 0xC6, 0x4D,
+			0x20, 0xCD, 0x12, 0x52, 0x74, 0x2A, 0x37, 0x68,
+			0xC2, 0x5A, 0x90, 0x15, 0x85, 0x88, 0x8E, 0xCE,
+			0xE1, 0xE6, 0x12, 0xD9, 0x93, 0x6B, 0x40, 0x3B,
+			0x07, 0x75, 0x94, 0x9A, 0x66, 0xCD, 0xFD, 0x99,
+			0xA2, 0x9B, 0x13, 0x45, 0xBA, 0xA8, 0xD9, 0xD5,
+			0x40, 0x0C, 0x91, 0x02, 0x4B, 0x0A, 0x60, 0x73,
+			0x63, 0xB0, 0x13, 0xCE, 0x5D, 0xE9, 0xAE, 0x86,
+			0x9D, 0x3B, 0x8D, 0x95, 0xB0, 0x57, 0x0B, 0x3C,
+			0x2D, 0x39, 0x14, 0x22, 0xD3, 0x24, 0x50, 0xCB,
+			0xCF, 0xAE, 0x96, 0x65, 0x22, 0x86, 0xE9, 0x6D,
+			0xEC, 0x12, 0x14, 0xA9, 0x34, 0x65, 0x27, 0x98,
+			0x0A, 0x81, 0x92, 0xEA, 0xC1, 0xC3, 0x9A, 0x3A,
+			0xAF, 0x6F, 0x15, 0x35, 0x1D, 0xA6, 0xBE, 0x76,
+			0x4D, 0xF8, 0x97, 0x72, 0xEC, 0x04, 0x07, 0xD0,
+			0x6E, 0x44, 0x15, 0xBE, 0xFA, 0xE7, 0xC9, 0x25,
+			0x80, 0xDF, 0x9B, 0xF5, 0x07, 0x49, 0x7C, 0x8F,
+			0x29, 0x95, 0x16, 0x0D, 0x4E, 0x21, 0x8D, 0xAA,
+			0xCB, 0x02, 0x94, 0x4A, 0xBF, 0x83, 0x34, 0x0C,
+			0xE8, 0xBE, 0x16, 0x86, 0xA9, 0x60, 0xFA, 0xF9,
+			0x0E, 0x2D, 0x90, 0xC5, 0x5C, 0xC6, 0x47, 0x5B,
+			0xAB, 0xC3, 0x17, 0x1A, 0x80, 0xA3, 0x63, 0x17,
+			0x49, 0x54, 0x95, 0x5D, 0x71, 0x01, 0xDA, 0xB1,
+			0x6A, 0xE8, 0x17, 0x91, 0x67, 0xE2, 0x14, 0x44,
+			0xB4, 0x43, 0xA9, 0xEA, 0xAA, 0x7C, 0x91, 0xDE,
+			0x36, 0xD1, 0x18, 0xC3, 0x9D, 0x38, 0x9F, 0x8D,
+			0xD4, 0x46, 0x9A, 0x84, 0x6C, 0x9A, 0x26, 0x2B,
+			0xF7, 0xFA, 0x18, 0x48, 0x7A, 0x79, 0xE8, 0xDE,
+			0x11, 0x69, 0x9E, 0x0B, 0x8F, 0xDF, 0x55, 0x7C,
+			0xB4, 0x87, 0x19, 0xD4, 0x53, 0xBA, 0x71, 0x30,
+			0x56, 0x10, 0x9B, 0x93, 0xA2, 0x18, 0xC8, 0x96,
+			0x75, 0xAC, 0x19, 0x5F, 0xB4, 0xFB, 0x06, 0x63,
+			0x9B, 0x37, 0x97, 0x14, 0x49, 0x55, 0xB3, 0xC9,
+			0x32, 0x7D, 0x1A, 0xEC, 0x00, 0x3D, 0x42, 0xEC,
+			0xD0, 0xEA, 0x98, 0xAB, 0xF1, 0x9F, 0xFB, 0x4A,
+			0xF3, 0x56, 0x1A, 0x67, 0xE7, 0x7C, 0x35, 0xBF,
+			0x15, 0xC5, 0x9C, 0x24, 0x12, 0xDA, 0x88, 0x1D,
+			0xB0, 0x2B, 0x1B, 0xFB, 0xCE, 0xBF, 0xAC, 0x51,
+			0x52, 0xBC, 0x99, 0xBC, 0x3F, 0x1D, 0x15, 0xF7,
+			0x71, 0x00, 0x1B, 0x70, 0x29, 0xFE, 0xDB, 0x02,
+			0x8F, 0x8B, 0x85, 0x2B, 0xC4, 0x40, 0x7E, 0xB8,
+			0x3F, 0x89, 0x1C, 0x9C, 0xA7, 0x33, 0x25, 0x4F,
+			0xDD, 0x1E, 0x9E, 0xDB, 0x56, 0x91, 0x9C, 0xE9,
+			0xFE, 0xA2, 0x1C, 0x17, 0x40, 0x72, 0x52, 0x1C,
+			0x18, 0x31, 0x9A, 0x54, 0xB5, 0xD4, 0xEF, 0xBE,
+			0xBD, 0xDF, 0x1D, 0x8B, 0x69, 0xB1, 0xCB, 0xF2,
+			0x5F, 0x48, 0x9F, 0xCC, 0x98, 0x13, 0x72, 0x54,
+			0x7C, 0xF4, 0x1D, 0x00, 0x8E, 0xF0, 0xBC, 0xA1,
+			0x92, 0x6F, 0x93, 0x4B, 0x73, 0x5E, 0x09, 0x0B,
+			0x3B, 0x25, 0x1E, 0xB3, 0x3A, 0x36, 0xF8, 0x2E,
+			0xD9, 0xB2, 0x9C, 0xF4, 0xCB, 0x94, 0x41, 0x88,
+			0xFA, 0x0E, 0x1E, 0x38, 0xDD, 0x77, 0x8F, 0x7D,
+			0x1C, 0x9D, 0x98, 0x7B, 0x28, 0xD1, 0x32, 0xDF,
+			0xB9, 0x73, 0x1F, 0xA4, 0xF4, 0xB4, 0x16, 0x93,
+			0x5B, 0xE4, 0x9D, 0xE3, 0x05, 0x16, 0xAF, 0x35,
+			0x78, 0x58, 0x1F, 0x2F, 0x13, 0xF5, 0x61, 0xC0,
+			0x66, 0x33, 0x61, 0x94, 0x1E, 0xAB, 0x24, 0x9A,
+			0x4B, 0xC1, 0x23, 0xF8, 0xD1, 0x5C, 0xD7, 0x11,
+			0xA9, 0x56, 0xA1, 0xBF, 0x20, 0xFE, 0x6E, 0xB7,
+			0x8A, 0xEA, 0x23, 0x73, 0x36, 0x1D, 0xA0, 0x42,
+			0x6C, 0x79, 0xA5, 0x30, 0xC3, 0xBB, 0x1D, 0xE0,
+			0xC9, 0x97, 0x22, 0xEF, 0x1F, 0xDE, 0x39, 0xAC,
+			0x2B, 0x00, 0xA0, 0xA8, 0xEE, 0x7C, 0x80, 0x0A,
+			0x08, 0xBC, 0x22, 0x64, 0xF8, 0x9F, 0x4E, 0xFF,
+			0xE6, 0x27, 0xAC, 0x2F, 0x05, 0x31, 0xFB, 0x55,
+			0x4F, 0x6D, 0x21, 0xD7, 0x4C, 0x59, 0x0A, 0x70,
+			0xAD, 0xFA, 0xA3, 0x90, 0xBD, 0xFB, 0xB3, 0xD6,
+			0x8E, 0x46, 0x21, 0x5C, 0xAB, 0x18, 0x7D, 0x23,
+			0x68, 0xD5, 0xA7, 0x1F, 0x5E, 0xBE, 0xC0, 0x81,
+			0xCD, 0x3B, 0x20, 0xC0, 0x82, 0xDB, 0xE4, 0xCD,
+			0x2F, 0xAC, 0xA2, 0x87, 0x73, 0x79, 0x5D, 0x6B,
+			0x0C, 0x10, 0x20, 0x4B, 0x65, 0x9A, 0x93, 0x9E,
+			0xF2, 0x9B, 0xBE, 0x10, 0x88, 0x24, 0x36, 0x24,
+			0x42, 0x99, 0x27, 0xA7, 0xEB, 0x57, 0x6D, 0xD3,
+			0xA0, 0x0E, 0xA5, 0xE0, 0x1A, 0xF5, 0xD4, 0x75,
+			0x83, 0xB2, 0x27, 0x2C, 0x0C, 0x16, 0x1A, 0x80,
+			0x65, 0x21, 0xA1, 0x6F, 0xF9, 0xB0, 0xA7, 0x22,
+			0xC0, 0xCF, 0x26, 0xB0, 0x25, 0xD5, 0x83, 0x6E,
+			0x22, 0x58, 0xA4, 0xF7, 0xD4, 0x77, 0x3A, 0xC8,
+			0x01, 0xE4, 0x26, 0x3B, 0xC2, 0x94, 0xF4, 0x3D,
+			0xEF, 0x7F, 0xA8, 0x70, 0x3F, 0x3A, 0x41, 0x97,
+			0x46, 0x35, 0x25, 0x88, 0x76, 0x52, 0xB0, 0xB2,
+			0xA4, 0xA2, 0xA7, 0xCF, 0x87, 0xF0, 0x09, 0x14,
+			0x87, 0x1E, 0x25, 0x03, 0x91, 0x13, 0xC7, 0xE1,
+			0x61, 0x8D, 0xA3, 0x40, 0x64, 0xB5, 0x7A, 0x43,
+			0xC4, 0x63, 0x24, 0x9F, 0xB8, 0xD0, 0x5E, 0x0F,
+			0x26, 0xF4, 0xA6, 0xD8, 0x49, 0x72, 0xE7, 0xA9,
+			0x05, 0x48, 0x24, 0x14, 0x5F, 0x91, 0x29, 0x5C,
+			0xDB, 0xE3, 0x9A, 0x6F, 0x92, 0x0F, 0xAC, 0xC6,
+			0x59, 0x71, 0x2B, 0x46, 0xA5, 0x4B, 0xA2, 0x95,
+			0xBB, 0xE6, 0xA9, 0x01, 0x54, 0xE9, 0x1B, 0x33,
+			0x98, 0x5A, 0x2B, 0xCD, 0x42, 0x0A, 0xD5, 0xC6,
+			0x7E, 0xC9, 0xAD, 0x8E, 0xB7, 0xAC, 0x68, 0x64,
+			0xDB, 0x27, 0x2A, 0x51, 0x6B, 0xC9, 0x4C, 0x28,
+			0x39, 0xB0, 0xA8, 0x16, 0x9A, 0x6B, 0xF5, 0x8E,
+			0x1A, 0x0C, 0x2A, 0xDA, 0x8C, 0x88, 0x3B, 0x7B,
+			0xF4, 0x97, 0xA4, 0x91, 0x71, 0x26, 0x8E, 0xD1,
+			0x5D, 0xDD, 0x29, 0x69, 0x38, 0x4E, 0x7F, 0xF4,
+			0xBF, 0x4A, 0xAB, 0x2E, 0xC9, 0xEC, 0xC6, 0x52,
+			0x9C, 0xF6, 0x29, 0xE2, 0xDF, 0x0F, 0x08, 0xA7,
+			0x7A, 0x65, 0xAF, 0xA1, 0x2A, 0xA9, 0xB5, 0x05,
+			0xDF, 0x8B, 0x28, 0x7E, 0xF6, 0xCC, 0x91, 0x49,
+			0x3D, 0x1C, 0xAA, 0x39, 0x07, 0x6E, 0x28, 0xEF,
+			0x1E, 0xA0, 0x28, 0xF5, 0x11, 0x8D, 0xE6, 0x1A,
+			0xE0, 0x2B, 0xB6, 0xAE, 0xFC, 0x33, 0x43, 0xA0,
+			0x50, 0x29, 0x2F, 0x19, 0x9F, 0x40, 0x18, 0x57,
+			0xB2, 0xBE, 0xAD, 0x5E, 0x6E, 0xE2, 0xA1, 0xF1,
+			0x91, 0x02, 0x2F, 0x92, 0x78, 0x01, 0x6F, 0x04,
+			0x77, 0x91, 0xA9, 0xD1, 0x8D, 0xA7, 0xD2, 0xA6,
+			0xD2, 0x7F, 0x2E, 0x0E, 0x51, 0xC2, 0xF6, 0xEA,
+			0x30, 0xE8, 0xAC, 0x49, 0xA0, 0x60, 0x4F, 0x4C,
+			0x13, 0x54, 0x2E, 0x85, 0xB6, 0x83, 0x81, 0xB9,
+			0xFD, 0xCF, 0xA0, 0xCE, 0x4B, 0x2D, 0x34, 0x13,
+			0x54, 0x85, 0x2D, 0x36, 0x02, 0x45, 0xC5, 0x36,
+			0xB6, 0x12, 0xAF, 0x71, 0xF3, 0xE7, 0x7C, 0x90,
+			0x95, 0xAE, 0x2D, 0xBD, 0xE5, 0x04, 0xB2, 0x65,
+			0x73, 0x3D, 0xAB, 0xFE, 0x10, 0xA2, 0x0F, 0xC7,
+			0xD6, 0xD3, 0x2C, 0x21, 0xCC, 0xC7, 0x2B, 0x8B,
+			0x34, 0x44, 0xAE, 0x66, 0x3D, 0x65, 0x92, 0x2D,
+			0x17, 0xF8, 0x2C, 0xAA, 0x2B, 0x86, 0x5C, 0xD8,
+			0x89, 0x13, 0xD2, 0x91, 0xA6, 0x58, 0x99, 0x02,
+			0x6E, 0xA1, 0x32, 0x84, 0x39, 0x72, 0x3C, 0x19,
+			0x8C, 0x36, 0xB0, 0xC3, 0xC8, 0xD0, 0x85, 0xBF,
+			0xAF, 0x8A, 0x32, 0x0F, 0xDE, 0x33, 0x4B, 0x4A,
+			0x49, 0x19, 0xB4, 0x4C, 0x2B, 0x95, 0xF6, 0xE8,
+			0xEC, 0xF7, 0x33, 0x93, 0xF7, 0xF0, 0xD2, 0xA4,
+			0x0E, 0x60, 0xB1, 0xD4, 0x06, 0x52, 0x6B, 0x02,
+			0x2D, 0xDC, 0x33, 0x18, 0x10, 0xB1, 0xA5, 0xF7,
+			0xC3, 0x47, 0xBD, 0x53, 0xED, 0x1F, 0x10, 0x5D,
+			0x6A, 0x0D, 0x30, 0xAB, 0xA4, 0x77, 0xE1, 0x78,
+			0x88, 0x9A, 0xB2, 0xEC, 0x55, 0xD5, 0x58, 0xDE,
+			0xAB, 0x26, 0x30, 0x20, 0x43, 0x36, 0x96, 0x2B,
+			0x4D, 0xB5, 0xB6, 0x63, 0xB6, 0x90, 0x2B, 0x89,
+			0xE8, 0x5B, 0x31, 0xBC, 0x6A, 0xF5, 0x0F, 0xC5,
+			0x0A, 0xCC, 0xB3, 0xFB, 0x9B, 0x57, 0xB6, 0x63,
+			0x29, 0x70, 0x31, 0x37, 0x8D, 0xB4, 0x78, 0x96,
+			0xD7, 0xFB, 0xAF, 0x6C, 0x60, 0x0A, 0xDD, 0x2C,
+			0x67, 0xF9, 0x36, 0xDB, 0x03, 0x79, 0x86, 0xDB,
+			0x85, 0x6E, 0xB4, 0x9C, 0xF2, 0xDB, 0x3F, 0x7D,
+			0xA6, 0xD2, 0x36, 0x50, 0xE4, 0x38, 0xF1, 0x88,
+			0x40, 0x41, 0xB0, 0x13, 0x11, 0x9E, 0x4C, 0x2A,
+			0xE5, 0xAF, 0x37, 0xCC, 0xCD, 0xFB, 0x68, 0x66,
+			0x07, 0x38, 0xB5, 0x8B, 0x3C, 0x59, 0xD1, 0xC0,
+			0x24, 0x84, 0x37, 0x47, 0x2A, 0xBA, 0x1F, 0x35,
+			0xCA, 0x1F, 0xB9, 0x0C, 0xD7, 0x14, 0xAA, 0x9F,
+			0x63, 0x55, 0x34, 0xF4, 0x9E, 0x7C, 0x5B, 0xBA,
+			0x81, 0xC2, 0xB6, 0xB3, 0x6F, 0xDE, 0xE2, 0x1C,
+			0xA2, 0x7E, 0x34, 0x7F, 0x79, 0x3D, 0x2C, 0xE9,
+			0x44, 0xED, 0xB2, 0x3C, 0x8C, 0x9B, 0x91, 0x4B,
+			0xE1, 0x03, 0x35, 0xE3, 0x50, 0xFE, 0xB5, 0x07,
+			0x03, 0x94, 0xB7, 0xA4, 0xA1, 0x5C, 0x0C, 0xA1,
+			0x20, 0x28, 0x35, 0x68, 0xB7, 0xBF, 0xC2, 0x54,
+			0xFE, 0x83, 0x8B, 0x13, 0x7A, 0x21, 0x47, 0xCE,
+			0x7C, 0x11, 0x3A, 0x3A, 0x4D, 0x65, 0x49, 0x9D,
+			0x9E, 0x86, 0xB8, 0x7D, 0xBC, 0xC7, 0xF0, 0x3B,
+			0xBD, 0x3A, 0x3A, 0xB1, 0xAA, 0x24, 0x3E, 0xCE,
+			0x5B, 0xA9, 0xBC, 0xF2, 0x5F, 0x82, 0x83, 0x6C,
+			0xFE, 0x47, 0x3B, 0x2D, 0x83, 0xE7, 0xA7, 0x20,
+			0x1C, 0xD0, 0xB9, 0x6A, 0x72, 0x45, 0x1E, 0x86,
+			0x3F, 0x6C, 0x3B, 0xA6, 0x64, 0xA6, 0xD0, 0x73,
+			0xD1, 0xF7, 0xB5, 0xED, 0x99, 0x08, 0x65, 0xD9,
+			0x78, 0xBD, 0x38, 0x15, 0xD0, 0x60, 0x94, 0xFC,
+			0x9A, 0x2A, 0xBA, 0x52, 0x21, 0xC2, 0x2D, 0x5A,
+			0xB9, 0x96, 0x38, 0x9E, 0x37, 0x21, 0xE3, 0xAF,
+			0x5F, 0x05, 0xBE, 0xDD, 0xC2, 0x87, 0x5E, 0x0D,
+			0xFA, 0xEB, 0x39, 0x02, 0x1E, 0xE2, 0x7A, 0x41,
+			0x18, 0x7C, 0xBB, 0x45, 0xEF, 0x40, 0xC3, 0xE7,
+			0x3B, 0xC0, 0x39, 0x89, 0xF9, 0xA3, 0x0D, 0x12,
+			0xC5, 0x4B, 0xA7, 0xD2, 0x14, 0x1D, 0xA8, 0xA8,
+			0x75, 0x49, 0x3E, 0x65, 0x77, 0x6E, 0xF3, 0x5F,
+			0x97, 0xDE, 0xBC, 0x22, 0x86, 0xCC, 0x4A, 0xF9,
+			0xB4, 0x62, 0x3E, 0xEE, 0x90, 0x2F, 0x84, 0x0C,
+			0x52, 0xF1, 0xB8, 0xAD, 0x65, 0x89, 0x39, 0xAE,
+			0xF7, 0x1F, 0x3F, 0x72, 0xB9, 0xEC, 0x1D, 0xE2,
+			0x15, 0x88, 0xBD, 0x35, 0x48, 0x4E, 0xA4, 0x44,
+			0x36, 0x34, 0x3F, 0xF9, 0x5E, 0xAD, 0x6A, 0xB1,
+			0xD8, 0xAF, 0xB1, 0xB2, 0xA3, 0x03, 0xDF, 0x1B,
+			0x71, 0xE5, 0x3C, 0x4A, 0xEA, 0x6B, 0x2E, 0x3E,
+			0x93, 0x72, 0xBE, 0x0D, 0x1B, 0xC9, 0x97, 0x98,
+			0xB0, 0xCE, 0x3C, 0xC1, 0x0D, 0x2A, 0x59, 0x6D,
+			0x56, 0x5D, 0xBA, 0x82, 0xF8, 0x8C, 0xE4, 0xCF,
+			0xF3, 0xB3, 0x3D, 0x5D, 0x24, 0xE9, 0xC0, 0x83,
+			0x11, 0x24, 0xBF, 0x1A, 0xD5, 0x4B, 0x79, 0x25,
+			0x32, 0x98, 0x3D, 0xD6, 0xC3, 0xA8, 0xB7, 0xD0
+		},
+	.len = 2056
+	},
+	.digest = {
+		.data = {0x17, 0x9F, 0x2F, 0xA6},
+		.len  = 4
+	}
+};
+
+#endif /* TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_ */
diff --git a/app/test/test_cryptodev_snow3g_test_vectors.h b/app/test/test_cryptodev_snow3g_test_vectors.h
new file mode 100644
index 0000000..403406d
--- /dev/null
+++ b/app/test/test_cryptodev_snow3g_test_vectors.h
@@ -0,0 +1,379 @@
+/*-
+ *   BSD LICENSE
+ *
+ *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *
+ *   Redistribution and use in source and binary forms, with or without
+ *   modification, are permitted provided that the following conditions
+ *   are met:
+ *
+ *   * Redistributions of source code must retain the above copyright
+ *     notice, this list of conditions and the following disclaimer.
+ *   * Redistributions in binary form must reproduce the above copyright
+ *     notice, this list of conditions and the following disclaimer in
+ *     the documentation and/or other materials provided with the
+ *     distribution.
+ *   * Neither the name of Intel Corporation nor the names of its
+ *     contributors may be used to endorse or promote products derived
+ *     from this software without specific prior written permission.
+ *
+ *   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ *   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ *   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ *   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ *   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ *   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ *   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ *   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ *   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ *   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ *   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#ifndef TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+#define TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+
+struct snow3g_test_data {
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} key;
+
+	struct {
+		uint8_t data[64] __rte_aligned(16);
+		unsigned len;
+	} iv;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} plaintext;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} ciphertext;
+
+	struct {
+		unsigned len;
+	} validDataLenInBits;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} aad;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} digest;
+};
+struct snow3g_test_data snow3g_test_case_1 = {
+	.key = {
+		.data = {
+			0x2B, 0xD6, 0x45, 0x9F, 0x82, 0xC5, 0xB3, 0x00,
+			0x95, 0x2C, 0x49, 0x10, 0x48, 0x81, 0xFF, 0x48
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x7E, 0xC6, 0x12, 0x72, 0x74, 0x3B, 0xF1, 0x61,
+			0x47, 0x26, 0x44, 0x6A, 0x6C, 0x38, 0xCE, 0xD1,
+			0x66, 0xF6, 0xCA, 0x76, 0xEB, 0x54, 0x30, 0x04,
+			0x42, 0x86, 0x34, 0x6C, 0xEF, 0x13, 0x0F, 0x92,
+			0x92, 0x2B, 0x03, 0x45, 0x0D, 0x3A, 0x99, 0x75,
+			0xE5, 0xBD, 0x2E, 0xA0, 0xEB, 0x55, 0xAD, 0x8E,
+			0x1B, 0x19, 0x9E, 0x3E, 0xC4, 0x31, 0x60, 0x20,
+			0xE9, 0xA1, 0xB2, 0x85, 0xE7, 0x62, 0x79, 0x53,
+			0x59, 0xB7, 0xBD, 0xFD, 0x39, 0xBE, 0xF4, 0xB2,
+			0x48, 0x45, 0x83, 0xD5, 0xAF, 0xE0, 0x82, 0xAE,
+			0xE6, 0x38, 0xBF, 0x5F, 0xD5, 0xA6, 0x06, 0x19,
+			0x39, 0x01, 0xA0, 0x8F, 0x4A, 0xB4, 0x1A, 0xAB,
+			0x9B, 0x13, 0x48, 0x80
+		},
+		.len = 100
+	},
+	.ciphertext = {
+		.data = {
+			0x8C, 0xEB, 0xA6, 0x29, 0x43, 0xDC, 0xED, 0x3A,
+			0x09, 0x90, 0xB0, 0x6E, 0xA1, 0xB0, 0xA2, 0xC4,
+			0xFB, 0x3C, 0xED, 0xC7, 0x1B, 0x36, 0x9F, 0x42,
+			0xBA, 0x64, 0xC1, 0xEB, 0x66, 0x65, 0xE7, 0x2A,
+			0xA1, 0xC9, 0xBB, 0x0D, 0xEA, 0xA2, 0x0F, 0xE8,
+			0x60, 0x58, 0xB8, 0xBA, 0xEE, 0x2C, 0x2E, 0x7F,
+			0x0B, 0xEC, 0xCE, 0x48, 0xB5, 0x29, 0x32, 0xA5,
+			0x3C, 0x9D, 0x5F, 0x93, 0x1A, 0x3A, 0x7C, 0x53,
+			0x22, 0x59, 0xAF, 0x43, 0x25, 0xE2, 0xA6, 0x5E,
+			0x30, 0x84, 0xAD, 0x5F, 0x6A, 0x51, 0x3B, 0x7B,
+			0xDD, 0xC1, 0xB6, 0x5F, 0x0A, 0xA0, 0xD9, 0x7A,
+			0x05, 0x3D, 0xB5, 0x5A, 0x88, 0xC4, 0xC4, 0xF9,
+			0x60, 0x5E, 0x41, 0x40
+		},
+		.len = 100
+	},
+	.validDataLenInBits = {
+		.len = 798
+	},
+	.aad = {
+		.data = {
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_2 = {
+	.key = {
+		.data = {
+			0xEF, 0xA8, 0xB2, 0x22, 0x9E, 0x72, 0x0C, 0x2A,
+			0x7C, 0x36, 0xEA, 0x55, 0xE9, 0x60, 0x56, 0x95
+		},
+		.len = 16
+	},
+	.iv = {
+	       .data = {
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+	       .len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x10, 0x11, 0x12, 0x31, 0xE0, 0x60, 0x25, 0x3A,
+			0x43, 0xFD, 0x3F, 0x57, 0xE3, 0x76, 0x07, 0xAB,
+			0x28, 0x27, 0xB5, 0x99, 0xB6, 0xB1, 0xBB, 0xDA,
+			0x37, 0xA8, 0xAB, 0xCC, 0x5A, 0x8C, 0x55, 0x0D,
+			0x1B, 0xFB, 0x2F, 0x49, 0x46, 0x24, 0xFB, 0x50,
+			0x36, 0x7F, 0xA3, 0x6C, 0xE3, 0xBC, 0x68, 0xF1,
+			0x1C, 0xF9, 0x3B, 0x15, 0x10, 0x37, 0x6B, 0x02,
+			0x13, 0x0F, 0x81, 0x2A, 0x9F, 0xA1, 0x69, 0xD8
+		},
+		.len = 64
+	},
+	.ciphertext = {
+		.data = {
+				0xE0, 0xDA, 0x15, 0xCA, 0x8E, 0x25, 0x54, 0xF5,
+				0xE5, 0x6C, 0x94, 0x68, 0xDC, 0x6C, 0x7C, 0x12,
+				0x9C, 0x56, 0x8A, 0xA5, 0x03, 0x23, 0x17, 0xE0,
+				0x4E, 0x07, 0x29, 0x64, 0x6C, 0xAB, 0xEF, 0xA6,
+				0x89, 0x86, 0x4C, 0x41, 0x0F, 0x24, 0xF9, 0x19,
+				0xE6, 0x1E, 0x3D, 0xFD, 0xFA, 0xD7, 0x7E, 0x56,
+				0x0D, 0xB0, 0xA9, 0xCD, 0x36, 0xC3, 0x4A, 0xE4,
+				0x18, 0x14, 0x90, 0xB2, 0x9F, 0x5F, 0xA2, 0xFC
+		},
+		.len = 64
+	},
+	.validDataLenInBits = {
+		.len = 510
+	},
+	.aad = {
+		.data = {
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_3 = {
+	.key = {
+		.data = {
+			 0x5A, 0xCB, 0x1D, 0x64, 0x4C, 0x0D, 0x51, 0x20,
+			 0x4E, 0xA5, 0xF1, 0x45, 0x10, 0x10, 0xD8, 0x52
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xAD, 0x9C, 0x44, 0x1F, 0x89, 0x0B, 0x38, 0xC4,
+			0x57, 0xA4, 0x9D, 0x42, 0x14, 0x07, 0xE8
+		},
+		.len = 15
+	},
+	.ciphertext = {
+		.data = {
+			0xBA, 0x0F, 0x31, 0x30, 0x03, 0x34, 0xC5, 0x6B,
+			0x52, 0xA7, 0x49, 0x7C, 0xBA, 0xC0, 0x46
+		},
+		.len = 15
+	},
+	.validDataLenInBits = {
+		.len = 120
+	},
+	.aad = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.digest = {
+		.data = {0xE8, 0x60, 0x5A, 0x3E},
+		.len  = 4
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_4 = {
+	.key = {
+		.data = {
+			0xD3, 0xC5, 0xD5, 0x92, 0x32, 0x7F, 0xB1, 0x1C,
+			0x40, 0x35, 0xC6, 0x68, 0x0A, 0xF8, 0xC6, 0xD1
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00,
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB, 0x1A,
+			0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D, 0x80,
+			0x8C, 0xE3, 0x3E, 0x2C, 0xC3, 0xC0, 0xB5, 0xFC,
+			0x1F, 0x3D, 0xE8, 0xA6, 0xDC, 0x66, 0xB1, 0xF0
+		},
+		.len = 32
+	},
+	.ciphertext = {
+		.data = {
+			0x98, 0x9B, 0x71, 0x9C, 0xDC, 0x33, 0xCE, 0xB7,
+			0xCF, 0x27, 0x6A, 0x52, 0x82, 0x7C, 0xEF, 0x94,
+			0xA5, 0x6C, 0x40, 0xC0, 0xAB, 0x9D, 0x81, 0xF7,
+			0xA2, 0xA9, 0xBA, 0xC6, 0x0E, 0x11, 0xC4, 0xB0
+		},
+		.len = 32
+	},
+	.validDataLenInBits = {
+		.len = 253
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_5 = {
+	.key = {
+		.data = {
+			0x60, 0x90, 0xEA, 0xE0, 0x4C, 0x83, 0x70, 0x6E,
+			0xEC, 0xBF, 0x65, 0x2B, 0xE8, 0xE3, 0x65, 0x66
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00
+		},
+		.len = 16},
+	.plaintext = {
+		.data = {
+			0x40, 0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB,
+			0x42, 0x86, 0xB2, 0x99, 0x78, 0x3D, 0xAF, 0x44,
+			0x2C, 0x09, 0x9F, 0x7A, 0xB0, 0xF5, 0x8D, 0x5C,
+			0x8E, 0x46, 0xB1, 0x04, 0xF0, 0x8F, 0x01, 0xB4,
+			0x1A, 0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D,
+			0x36, 0xBD, 0x1A, 0x3D, 0x90, 0xDC, 0x3A, 0x41,
+			0xB4, 0x6D, 0x51, 0x67, 0x2A, 0xC4, 0xC9, 0x66,
+			0x3A, 0x2B, 0xE0, 0x63, 0xDA, 0x4B, 0xC8, 0xD2,
+			0x80, 0x8C, 0xE3, 0x3E, 0x2C, 0xCC, 0xBF, 0xC6,
+			0x34, 0xE1, 0xB2, 0x59, 0x06, 0x08, 0x76, 0xA0,
+			0xFB, 0xB5, 0xA4, 0x37, 0xEB, 0xCC, 0x8D, 0x31,
+			0xC1, 0x9E, 0x44, 0x54, 0x31, 0x87, 0x45, 0xE3,
+			0x98, 0x76, 0x45, 0x98, 0x7A, 0x98, 0x6F, 0x2C,
+			0xB0
+		},
+		.len = 105
+	},
+	.ciphertext = {
+		.data = {
+			0x58, 0x92, 0xBB, 0xA8, 0x8B, 0xBB, 0xCA, 0xAE,
+			0xAE, 0x76, 0x9A, 0xA0, 0x6B, 0x68, 0x3D, 0x3A,
+			0x17, 0xCC, 0x04, 0xA3, 0x69, 0x88, 0x16, 0x97,
+			0x43, 0x5E, 0x44, 0xFE, 0xD5, 0xFF, 0x9A, 0xF5,
+			0x7B, 0x9E, 0x89, 0x0D, 0x4D, 0x5C, 0x64, 0x70,
+			0x98, 0x85, 0xD4, 0x8A, 0xE4, 0x06, 0x90, 0xEC,
+			0x04, 0x3B, 0xAA, 0xE9, 0x70, 0x57, 0x96, 0xE4,
+			0xA9, 0xFF, 0x5A, 0x4B, 0x8D, 0x8B, 0x36, 0xD7,
+			0xF3, 0xFE, 0x57, 0xCC, 0x6C, 0xFD, 0x6C, 0xD0,
+			0x05, 0xCD, 0x38, 0x52, 0xA8, 0x5E, 0x94, 0xCE,
+			0x6B, 0xCD, 0x90, 0xD0, 0xD0, 0x78, 0x39, 0xCE,
+			0x09, 0x73, 0x35, 0x44, 0xCA, 0x8E, 0x35, 0x08,
+			0x43, 0x24, 0x85, 0x50, 0x92, 0x2A, 0xC1, 0x28,
+			0x18
+		},
+		.len = 105
+	},
+	.validDataLenInBits = {
+		.len = 837
+	}
+};
+struct snow3g_test_data snow3g_test_case_6 = {
+	.key = {
+		.data = {
+			0xC7, 0x36, 0xC6, 0xAA, 0xB2, 0x2B, 0xFF, 0xF9,
+			0x1E, 0x26, 0x98, 0xD2, 0xE2, 0x2A, 0xD5, 0x7E
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.aad = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xD0, 0xA7, 0xD4, 0x63, 0xDF, 0x9F, 0xB2, 0xB2,
+			0x78, 0x83, 0x3F, 0xA0, 0x2E, 0x23, 0x5A, 0xA1,
+			0x72, 0xBD, 0x97, 0x0C, 0x14, 0x73, 0xE1, 0x29,
+			0x07, 0xFB, 0x64, 0x8B, 0x65, 0x99, 0xAA, 0xA0,
+			0xB2, 0x4A, 0x03, 0x86, 0x65, 0x42, 0x2B, 0x20,
+			0xA4, 0x99, 0x27, 0x6A, 0x50, 0x42, 0x70, 0x09
+		},
+		.len = 48
+	},
+	.ciphertext = {
+	   .data = {
+			0x95, 0x2E, 0x5A, 0xE1, 0x50, 0xB8, 0x59, 0x2A,
+			0x9B, 0xA0, 0x38, 0xA9, 0x8E, 0x2F, 0xED, 0xAB,
+			0xFD, 0xC8, 0x3B, 0x47, 0x46, 0x0B, 0x50, 0x16,
+			0xEC, 0x88, 0x45, 0xB6, 0x05, 0xC7, 0x54, 0xF8,
+			0xBD, 0x91, 0xAA, 0xB6, 0xA4, 0xDC, 0x64, 0xB4,
+			0xCB, 0xEB, 0x97, 0x06, 0x4C, 0xF7, 0x02, 0x3D
+		},
+		.len = 48
+	},
+	.digest = {
+		.data = {0x38, 0xB5, 0x54, 0xC0 },
+		.len  = 4
+	},
+	.validDataLenInBits = {
+		.len = 384
+	}
+};
+
+#endif /* TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_ */
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist Devices
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
                     ` (2 preceding siblings ...)
  2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
@ 2016-03-07 13:55   ` De Lara Guarch, Pablo
  2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
  2016-03-16  5:05   ` [dpdk-dev] [PATCH v3 " Cao, Min
  5 siblings, 0 replies; 22+ messages in thread
From: De Lara Guarch, Pablo @ 2016-03-07 13:55 UTC (permalink / raw)
  To: Jain, Deepak K, dev



> -----Original Message-----
> From: dev [mailto:dev-bounces@dpdk.org] On Behalf Of Deepak Kumar JAIN
> Sent: Thursday, March 03, 2016 1:01 PM
> To: dev@dpdk.org
> Subject: [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist
> Devices
> 
>  This patchset contains fixes and refactoring for Snow3G(UEA2 and
>  UIA2) wireless algorithm for Intel Quick Assist devices.
> 
>  QAT PMD previously supported only cipher/hash alg-chaining for AES/SHA.
>  The code has been refactored to also support cipher-only and hash  only
>  (for Snow3G only) functionality along with alg-chaining.
> 
>  Changes from v2:
> 
>  1) Rebasing based on below mentioned patchset.
> 
>  This patchset depends on
>  cryptodev API changes
>  http://dpdk.org/ml/archives/dev/2016-February/034212.html
> 
> Deepak Kumar JAIN (3):
>   crypto: add cipher/auth only support
>   qat: add support for Snow3G
>   app/test: add Snow3G tests
> 
>  app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
>  app/test/test_cryptodev.h                          |    3 +-
>  app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
>  app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
>  doc/guides/cryptodevs/qat.rst                      |    8 +-
>  doc/guides/rel_notes/release_16_04.rst             |    6 +
>  drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
>  drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  280 +++++-
>  drivers/crypto/qat/qat_crypto.c                    |  149 ++-
>  drivers/crypto/qat/qat_crypto.h                    |   10 +
>  10 files changed, 2231 insertions(+), 75 deletions(-)
>  create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
>  create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h
> 
> --
> 2.1.0

Series-acked-by: Pablo de Lara <pablo.de.lara.guarch@intel.com>

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices
  2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
@ 2016-03-10 16:25     ` De Lara Guarch, Pablo
  2016-03-10 20:37       ` Thomas Monjalon
  2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
                       ` (3 subsequent siblings)
  4 siblings, 1 reply; 22+ messages in thread
From: De Lara Guarch, Pablo @ 2016-03-10 16:25 UTC (permalink / raw)
  To: Jain, Deepak K, dev



> -----Original Message-----
> From: dev [mailto:dev-bounces@dpdk.org] On Behalf Of Deepak Kumar JAIN
> Sent: Thursday, March 10, 2016 5:13 PM
> To: dev@dpdk.org
> Subject: [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist
> Devices
> 
>  This patchset contains fixes and refactoring for Snow3G(UEA2 and
>  UIA2) wireless algorithm for Intel Quick Assist devices.
> 
>  QAT PMD previously supported only cipher/hash alg-chaining for AES/SHA.
>  The code has been refactored to also support cipher-only and hash  only
> (for Snow3G only) functionality along with alg-chaining.
> 
>  Changes from V3:
>  1) Rebase based on below mentioned patchset.
>  2) Fixes test failure which happens only after
>     applying patch 1 only.
> 
>  Changes from v2:
> 
>  1) Rebasing based on below mentioned patchset.
> 
> This patchset depends on
> cryptodev API changes
> http://dpdk.org/ml/archives/dev/2016-March/035451.html
> 
> Deepak Kumar JAIN (3):
>   crypto: add cipher/auth only support
>   qat: add support for Snow3G
>   app/test: add Snow3G tests
> 
>  app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
>  app/test/test_cryptodev.h                          |    3 +-
>  app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
>  app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
>  doc/guides/cryptodevs/qat.rst                      |    8 +-
>  doc/guides/rel_notes/release_16_04.rst             |    6 +
>  drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
>  drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  284 +++++-
>  drivers/crypto/qat/qat_crypto.c                    |  149 ++-
>  drivers/crypto/qat/qat_crypto.h                    |   10 +
>  10 files changed, 2236 insertions(+), 74 deletions(-)
>  create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
>  create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h
> 
> --
> 2.1.0

Series-acked-by: Pablo de Lara <pablo.de.lara.guarch@intel.com>

^ permalink raw reply	[flat|nested] 22+ messages in thread

* [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
                     ` (3 preceding siblings ...)
  2016-03-07 13:55   ` [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist Devices De Lara Guarch, Pablo
@ 2016-03-10 17:12   ` Deepak Kumar JAIN
  2016-03-10 16:25     ` De Lara Guarch, Pablo
                       ` (4 more replies)
  2016-03-16  5:05   ` [dpdk-dev] [PATCH v3 " Cao, Min
  5 siblings, 5 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-10 17:12 UTC (permalink / raw)
  To: dev

 This patchset contains fixes and refactoring for the Snow3G (UEA2 and
 UIA2) wireless algorithms for Intel Quick Assist devices.

 QAT PMD previously supported only cipher/hash alg-chaining for AES/SHA.
 The code has been refactored to also support cipher-only and hash-only
 functionality (for Snow3G only) along with alg-chaining.

 Changes from v3:
 1) Rebased on the patchset mentioned below.
 2) Fixed a test failure that occurred when only
    patch 1 was applied.

 Changes from v2:

 1) Rebased on the patchset mentioned below.

This patchset depends on
cryptodev API changes
http://dpdk.org/ml/archives/dev/2016-March/035451.html
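
For illustration only (not part of this series): with the refactor, a
cipher-only request is just a single-element xform chain, which
qat_get_cmd_id() now maps to ICP_QAT_FW_LA_CMD_CIPHER instead of
rejecting it. A minimal sketch, assuming the Snow3G cipher enum and
xform field spellings from the cryptodev series above, and reusing the
key from the new snow3g_test_case_1 vector:

struct rte_crypto_sym_xform cipher_xform = {
	.type = RTE_CRYPTO_SYM_XFORM_CIPHER,
	.next = NULL,	/* no auth xform chained: cipher-only */
	.cipher = {
		.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2,	/* spelling assumed */
		.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT,
		.key = {
			.data = snow3g_test_case_1.key.data,
			.length = snow3g_test_case_1.key.len,	/* 16 bytes */
		},
	},
};

The session is then created through the usual cryptodev session calls;
the per-packet IV comes from the vector's iv field at enqueue time.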

Deepak Kumar JAIN (3):
  crypto: add cipher/auth only support
  qat: add support for Snow3G
  app/test: add Snow3G tests

 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 doc/guides/cryptodevs/qat.rst                      |    8 +-
 doc/guides/rel_notes/release_16_04.rst             |    6 +
 drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  284 +++++-
 drivers/crypto/qat/qat_crypto.c                    |  149 ++-
 drivers/crypto/qat/qat_crypto.h                    |   10 +
 10 files changed, 2236 insertions(+), 74 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread

* [dpdk-dev] [PATCH v4 1/3] crypto: add cipher/auth only support
  2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
  2016-03-10 16:25     ` De Lara Guarch, Pablo
@ 2016-03-10 17:12     ` Deepak Kumar JAIN
  2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 2/3] qat: add support for Snow3G Deepak Kumar JAIN
                       ` (2 subsequent siblings)
  4 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-10 17:12 UTC (permalink / raw)
  To: dev

Refactored the existing functionality into
modular form to support cipher-only and
auth-only operation.

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 drivers/crypto/qat/qat_adf/qat_algs.h            |  18 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 208 ++++++++++++++++++++---
 drivers/crypto/qat/qat_crypto.c                  | 137 +++++++++++----
 drivers/crypto/qat/qat_crypto.h                  |  10 ++
 4 files changed, 306 insertions(+), 67 deletions(-)
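
Note (not part of the patch, just a reading aid): after this split, an
auth-only chain is accepted too; qat_get_cmd_id() returns
ICP_QAT_FW_LA_CMD_AUTH for a single AUTH xform and only the new
qat_alg_aead_session_create_content_desc_auth() is run for it. A rough
sketch of such an xform, with the digest and AAD lengths taken from the
Snow3G UIA2 test vectors in patch 3/3, the algorithm enum spelling
assumed from the dependent cryptodev series, and uia2_key standing in
for an application-supplied 16-byte key:

struct rte_crypto_sym_xform auth_xform = {
	.type = RTE_CRYPTO_SYM_XFORM_AUTH,
	.next = NULL,	/* hash-only: no cipher xform chained */
	.auth = {
		.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2,	/* spelling assumed */
		.op = RTE_CRYPTO_AUTH_OP_GENERATE,
		.key = { .data = uia2_key, .length = 16 },
		.digest_length = 4,		/* 32-bit MAC-I */
		.add_auth_data_length = 16,	/* IV material carried as AAD */
	},
};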

diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index 76c08c0..b73a5d0 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -104,11 +104,15 @@ struct qat_alg_ablkcipher_cd {
 
 int qat_get_inter_state_size(enum icp_qat_hw_auth_algo qat_hash_alg);
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cd,
-					uint8_t *enckey, uint32_t enckeylen,
-					uint8_t *authkey, uint32_t authkeylen,
-					uint32_t add_auth_data_length,
-					uint32_t digestsize);
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cd,
+						uint8_t *enckey,
+						uint32_t enckeylen);
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize);
 
 void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header);
 
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index ceaffb7..534eda0 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -3,7 +3,7 @@
  *  redistributing this file, you may do so under either license.
  *
  *  GPL LICENSE SUMMARY
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  This program is free software; you can redistribute it and/or modify
  *  it under the terms of version 2 of the GNU General Public License as
  *  published by the Free Software Foundation.
@@ -17,7 +17,7 @@
  *  qat-linux@intel.com
  *
  *  BSD LICENSE
- *  Copyright(c) 2015 Intel Corporation.
+ *  Copyright(c) 2015-2016 Intel Corporation.
  *  Redistribution and use in source and binary forms, with or without
  *  modification, are permitted provided that the following conditions
  *  are met:
@@ -359,15 +359,139 @@ void qat_alg_init_common_hdr(struct icp_qat_fw_comn_req_hdr *header)
 					   ICP_QAT_FW_LA_NO_UPDATE_STATE);
 }
 
-int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
-			uint8_t *cipherkey, uint32_t cipherkeylen,
-			uint8_t *authkey, uint32_t authkeylen,
-			uint32_t add_auth_data_length,
-			uint32_t digestsize)
+int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
+						uint8_t *cipherkey,
+						uint32_t cipherkeylen)
 {
-	struct qat_alg_cd *content_desc = &cdesc->cd;
-	struct icp_qat_hw_cipher_algo_blk *cipher = &content_desc->cipher;
-	struct icp_qat_hw_auth_algo_blk *hash = &content_desc->hash;
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
+	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
+	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
+	void *ptr = &req_tmpl->cd_ctrl;
+	struct icp_qat_fw_cipher_cd_ctrl_hdr *cipher_cd_ctrl = ptr;
+	struct icp_qat_fw_auth_cd_ctrl_hdr *hash_cd_ctrl = ptr;
+	enum icp_qat_hw_cipher_convert key_convert;
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
+	uint16_t cipher_offset = 0;
+
+	PMD_INIT_FUNC_TRACE();
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		cipher =
+		    (struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk));
+		cipher_offset = sizeof(struct icp_qat_hw_auth_algo_blk);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		cipher_offset = 0;
+	}
+	/* CD setup */
+	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	} else {
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_CMP_AUTH_RES);
+	}
+
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	/* For Snow3G, set key convert and other bits */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	}
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
+	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
+
+	proto = ICP_QAT_FW_LA_PROTO_GET(header->serv_specif_flags);
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2)
+		proto = ICP_QAT_FW_LA_SNOW_3G_PROTO;
+
+	/* Request template setup */
+	qat_alg_init_common_hdr(header);
+	header->service_cmd_id = cdesc->qat_cmd;
+
+	ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+					ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
+	/* Configure the common header protocol flags */
+	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
+	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
+
+	/* Cipher CD config setup */
+	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
+		cipher_cd_ctrl->cipher_key_sz =
+			(ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ +
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ) >> 3;
+		cipher_cd_ctrl->cipher_state_sz =
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	} else {
+		cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
+		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	}
+
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_CIPHER);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else {
+		PMD_DRV_LOG(ERR, "invalid param, only authenticated "
+			    "encryption supported");
+		return -EFAULT;
+	}
+	return 0;
+}
+
+int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
+						uint8_t *authkey,
+						uint32_t authkeylen,
+						uint32_t add_auth_data_length,
+						uint32_t digestsize)
+{
+	struct icp_qat_hw_cipher_algo_blk *cipher;
+	struct icp_qat_hw_auth_algo_blk *hash;
 	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
 	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
 	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
@@ -379,31 +503,56 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 		((char *)&req_tmpl->serv_specif_rqpars +
 		sizeof(struct icp_qat_fw_la_cipher_req_params));
 	enum icp_qat_hw_cipher_convert key_convert;
-	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO; /* no CCM/GCM/Snow3G */
+	uint16_t proto = ICP_QAT_FW_LA_NO_PROTO;	/* no CCM/GCM/Snow3G */
 	uint16_t state1_size = 0;
 	uint16_t state2_size = 0;
+	uint16_t cipher_offset = 0, hash_offset = 0;
 
 	PMD_INIT_FUNC_TRACE();
 
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+		hash = (struct icp_qat_hw_auth_algo_blk *)&cdesc->cd;
+		cipher =
+		(struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk));
+		hash_offset = 0;
+		cipher_offset = ((char *)hash - (char *)cipher);
+	} else {
+		cipher = (struct icp_qat_hw_cipher_algo_blk *)&cdesc->cd;
+		hash = (struct icp_qat_hw_auth_algo_blk *)((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_cipher_algo_blk));
+		cipher_offset = 0;
+		hash_offset = ((char *)hash - (char *)cipher);
+	}
+
 	/* CD setup */
 	if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT) {
-		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
 	} else {
-		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
-				ICP_QAT_FW_LA_NO_RET_AUTH_RES);
+					   ICP_QAT_FW_LA_NO_RET_AUTH_RES);
 		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-				   ICP_QAT_FW_LA_CMP_AUTH_RES);
+					   ICP_QAT_FW_LA_CMP_AUTH_RES);
 	}
 
-	cipher->aes.cipher_config.val = ICP_QAT_HW_CIPHER_CONFIG_BUILD(
-			cdesc->qat_mode, cdesc->qat_cipher_alg, key_convert,
-			cdesc->qat_dir);
-	memcpy(cipher->aes.key, cipherkey, cipherkeylen);
+	if (cdesc->qat_mode == ICP_QAT_HW_CIPHER_CTR_MODE) {
+		/* CTR Streaming ciphers are a special case. Decrypt = encrypt
+		 * Overriding default values previously set
+		 */
+		cdesc->qat_dir = ICP_QAT_HW_CIPHER_ENCRYPT;
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	} else if (cdesc->qat_dir == ICP_QAT_HW_CIPHER_ENCRYPT)
+		key_convert = ICP_QAT_HW_CIPHER_NO_CONVERT;
+	else
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
+	cipher->aes.cipher_config.val =
+	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
+					cdesc->qat_cipher_alg, key_convert,
+					cdesc->qat_dir);
 
 	hash->sha.inner_setup.auth_config.reserved = 0;
 	hash->sha.inner_setup.auth_config.config =
@@ -423,7 +572,7 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 	} else if ((cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_128) ||
 		(cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_64)) {
 		if (qat_alg_do_precomputes(cdesc->qat_hash_alg,
-			cipherkey, cipherkeylen, (uint8_t *)(hash->sha.state1 +
+			authkey, authkeylen, (uint8_t *)(hash->sha.state1 +
 			ICP_QAT_HW_GALOIS_128_STATE1_SZ), &state2_size)) {
 			PMD_DRV_LOG(ERR, "(GCM)precompute failed");
 			return -EFAULT;
@@ -454,15 +603,14 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 	/* Configure the common header protocol flags */
 	ICP_QAT_FW_LA_PROTO_SET(header->serv_specif_flags, proto);
 	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
-	cd_pars->u.s.content_desc_params_sz = sizeof(struct qat_alg_cd) >> 3;
+	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
 
 	/* Cipher CD config setup */
-	cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
 	cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
-	cipher_cd_ctrl->cipher_cfg_offset = 0;
+	cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
 
 	/* Auth CD config setup */
-	hash_cd_ctrl->hash_cfg_offset = ((char *)hash - (char *)cipher) >> 3;
+	hash_cd_ctrl->hash_cfg_offset = hash_offset >> 3;
 	hash_cd_ctrl->hash_flags = ICP_QAT_FW_AUTH_HDR_FLAG_NO_NESTED;
 	hash_cd_ctrl->inner_res_sz = digestsize;
 	hash_cd_ctrl->final_sz = digestsize;
@@ -505,8 +653,12 @@ int qat_alg_aead_session_create_content_desc(struct qat_session *cdesc,
 					>> 3);
 	auth_param->auth_res_sz = digestsize;
 
-
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_AUTH) {
+		ICP_QAT_FW_COMN_CURR_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_AUTH);
+		ICP_QAT_FW_COMN_NEXT_ID_SET(hash_cd_ctrl,
+					ICP_QAT_FW_SLICE_DRAM_WR);
+	} else if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_CIPHER_HASH) {
 		ICP_QAT_FW_COMN_CURR_ID_SET(cipher_cd_ctrl,
 				ICP_QAT_FW_SLICE_CIPHER);
 		ICP_QAT_FW_COMN_NEXT_ID_SET(cipher_cd_ctrl,
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index 3533f37..2edae32 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -90,16 +90,16 @@ void qat_crypto_sym_clear_session(struct rte_cryptodev *dev,
 static int
 qat_get_cmd_id(const struct rte_crypto_sym_xform *xform)
 {
-	if (xform->next == NULL)
-		return -1;
-
 	/* Cipher Only */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_CIPHER; */
+		return ICP_QAT_FW_LA_CMD_CIPHER;
 
 	/* Authentication Only */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && xform->next == NULL)
-		return -1; /* return ICP_QAT_FW_LA_CMD_AUTH; */
+		return ICP_QAT_FW_LA_CMD_AUTH;
+
+	if (xform->next == NULL)
+		return -1;
 
 	/* Cipher then Authenticate */
 	if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER &&
@@ -139,31 +139,16 @@ qat_get_cipher_xform(struct rte_crypto_sym_xform *xform)
 
 	return NULL;
 }
-
-
 void *
-qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
 		struct rte_crypto_sym_xform *xform, void *session_private)
 {
 	struct qat_pmd_private *internals = dev->data->dev_private;
 
 	struct qat_session *session = session_private;
 
-	struct rte_crypto_auth_xform *auth_xform = NULL;
 	struct rte_crypto_cipher_xform *cipher_xform = NULL;
 
-	int qat_cmd_id;
-
-	PMD_INIT_FUNC_TRACE();
-
-	/* Get requested QAT command id */
-	qat_cmd_id = qat_get_cmd_id(xform);
-	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
-		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
-		goto error_out;
-	}
-	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
-
 	/* Get cipher xform from crypto xform chain */
 	cipher_xform = qat_get_cipher_xform(xform);
 
@@ -205,8 +190,87 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 	else
 		session->qat_dir = ICP_QAT_HW_CIPHER_DECRYPT;
 
+	if (qat_alg_aead_session_create_content_desc_cipher(session,
+						cipher_xform->key.data,
+						cipher_xform->key.length))
+		goto error_out;
+
+	return session;
+
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
+}
+
+
+void *
+qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
+		struct rte_crypto_sym_xform *xform, void *session_private)
+{
+	struct qat_pmd_private *internals = dev->data->dev_private;
+
+	struct qat_session *session = session_private;
+
+	int qat_cmd_id;
+
+	PMD_INIT_FUNC_TRACE();
+
+	/* Get requested QAT command id */
+	qat_cmd_id = qat_get_cmd_id(xform);
+	if (qat_cmd_id < 0 || qat_cmd_id >= ICP_QAT_FW_LA_CMD_DELIMITER) {
+		PMD_DRV_LOG(ERR, "Unsupported xform chain requested");
+		goto error_out;
+	}
+	session->qat_cmd = (enum icp_qat_fw_la_cmd_id)qat_cmd_id;
+	switch (session->qat_cmd) {
+	case ICP_QAT_FW_LA_CMD_CIPHER:
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_AUTH:
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_CIPHER_HASH:
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_HASH_CIPHER:
+	session = qat_crypto_sym_configure_session_auth(dev, xform, session);
+	session = qat_crypto_sym_configure_session_cipher(dev, xform, session);
+		break;
+	case ICP_QAT_FW_LA_CMD_TRNG_GET_RANDOM:
+	case ICP_QAT_FW_LA_CMD_TRNG_TEST:
+	case ICP_QAT_FW_LA_CMD_SSL3_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_1_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_TLS_V1_2_KEY_DERIVE:
+	case ICP_QAT_FW_LA_CMD_MGF1:
+	case ICP_QAT_FW_LA_CMD_AUTH_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_CIPHER_PRE_COMP:
+	case ICP_QAT_FW_LA_CMD_DELIMITER:
+	PMD_DRV_LOG(ERR, "Unsupported Service %u",
+		session->qat_cmd);
+		goto error_out;
+	default:
+	PMD_DRV_LOG(ERR, "Unsupported Service %u",
+		session->qat_cmd);
+		goto error_out;
+	}
+	return session;
+
+error_out:
+	rte_mempool_put(internals->sess_mp, session);
+	return NULL;
+}
+
+struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_sym_xform *xform,
+				struct qat_session *session_private)
+{
 
-	/* Get authentication xform from Crypto xform chain */
+	struct qat_pmd_private *internals = dev->data->dev_private;
+	struct qat_session *session = session_private;
+	struct rte_crypto_auth_xform *auth_xform = NULL;
+	struct rte_crypto_cipher_xform *cipher_xform = NULL;
 	auth_xform = qat_get_auth_xform(xform);
 
 	switch (auth_xform->algo) {
@@ -250,17 +314,26 @@ qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 				auth_xform->algo);
 		goto error_out;
 	}
+	cipher_xform = qat_get_cipher_xform(xform);
 
-	if (qat_alg_aead_session_create_content_desc(session,
-		cipher_xform->key.data,
-		cipher_xform->key.length,
-		auth_xform->key.data,
-		auth_xform->key.length,
-		auth_xform->add_auth_data_length,
-		auth_xform->digest_length))
-		goto error_out;
-
-	return (struct rte_crypto_sym_session *)session;
+	if ((session->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_GALOIS_128) ||
+			(session->qat_hash_alg ==
+				ICP_QAT_HW_AUTH_ALGO_GALOIS_64))  {
+		if (qat_alg_aead_session_create_content_desc_auth(session,
+				cipher_xform->key.data,
+				cipher_xform->key.length,
+				auth_xform->add_auth_data_length,
+				auth_xform->digest_length))
+			goto error_out;
+	} else {
+		if (qat_alg_aead_session_create_content_desc_auth(session,
+				auth_xform->key.data,
+				auth_xform->key.length,
+				auth_xform->add_auth_data_length,
+				auth_xform->digest_length))
+			goto error_out;
+	}
+	return session;
 
 error_out:
 	rte_mempool_put(internals->sess_mp, session);
diff --git a/drivers/crypto/qat/qat_crypto.h b/drivers/crypto/qat/qat_crypto.h
index 9323383..0afe74e 100644
--- a/drivers/crypto/qat/qat_crypto.h
+++ b/drivers/crypto/qat/qat_crypto.h
@@ -111,6 +111,16 @@ extern void *
 qat_crypto_sym_configure_session(struct rte_cryptodev *dev,
 		struct rte_crypto_sym_xform *xform, void *session_private);
 
+struct qat_session *
+qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
+				struct rte_crypto_sym_xform *xform,
+				struct qat_session *session_private);
+
+void *
+qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
+		struct rte_crypto_sym_xform *xform, void *session_private);
+
+
 extern void
 qat_crypto_sym_clear_session(struct rte_cryptodev *dev, void *session);
 
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread

* [dpdk-dev] [PATCH v4 2/3] qat: add support for Snow3G
  2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
  2016-03-10 16:25     ` De Lara Guarch, Pablo
  2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
@ 2016-03-10 17:12     ` Deepak Kumar JAIN
  2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
  2016-03-16  3:27     ` [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices Cao, Min
  4 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-10 17:12 UTC (permalink / raw)
  To: dev

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 doc/guides/cryptodevs/qat.rst                    |  8 ++-
 doc/guides/rel_notes/release_16_04.rst           |  6 ++
 drivers/crypto/qat/qat_adf/qat_algs.h            |  1 +
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c | 86 ++++++++++++++++++++++--
 drivers/crypto/qat/qat_crypto.c                  | 12 +++-
 5 files changed, 104 insertions(+), 9 deletions(-)

diff --git a/doc/guides/cryptodevs/qat.rst b/doc/guides/cryptodevs/qat.rst
index 23402b4..af52047 100644
--- a/doc/guides/cryptodevs/qat.rst
+++ b/doc/guides/cryptodevs/qat.rst
@@ -1,5 +1,5 @@
 ..  BSD LICENSE
-    Copyright(c) 2015 Intel Corporation. All rights reserved.
+    Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
 
     Redistribution and use in source and binary forms, with or without
     modification, are permitted provided that the following conditions
@@ -47,6 +47,7 @@ Cipher algorithms:
 * ``RTE_CRYPTO_SYM_CIPHER_AES128_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES192_CBC``
 * ``RTE_CRYPTO_SYM_CIPHER_AES256_CBC``
+* ``RTE_CRYPTO_SYM_CIPHER_SNOW3G_UEA2``
 
 Hash algorithms:
 
@@ -54,14 +55,15 @@ Hash algorithms:
 * ``RTE_CRYPTO_AUTH_SHA256_HMAC``
 * ``RTE_CRYPTO_AUTH_SHA512_HMAC``
 * ``RTE_CRYPTO_AUTH_AES_XCBC_MAC``
+* ``RTE_CRYPTO_AUTH_SNOW3G_UIA2``
 
 
 Limitations
 -----------
 
 * Chained mbufs are not supported.
-* Hash only is not supported.
-* Cipher only is not supported.
+* Hash only is not supported except Snow3G UIA2.
+* Cipher only is not supported except Snow3G UEA2.
 * Only in-place is currently supported (destination address is the same as source address).
 * Only supports the session-oriented API implementation (session-less APIs are not supported).
 * Not performance tuned.
diff --git a/doc/guides/rel_notes/release_16_04.rst b/doc/guides/rel_notes/release_16_04.rst
index aa9eabc..4f41e63 100644
--- a/doc/guides/rel_notes/release_16_04.rst
+++ b/doc/guides/rel_notes/release_16_04.rst
@@ -35,6 +35,12 @@ This section should contain new features added in this release. Sample format:
 
   Refer to the previous release notes for examples.
 
+* **Added support of Snow3G (UEA2 and UIA2) for Intel Quick Assist Devices.**
+
+  Enabled support for Snow3g Wireless algorithm for Intel Quick Assist devices.
+  Support for cipher only, Hash only is also provided
+  along with alg-chaining operations.
+
 * **Added function to check primary process state.**
 
   A new function ``rte_eal_primary_proc_alive()`` has been added
diff --git a/drivers/crypto/qat/qat_adf/qat_algs.h b/drivers/crypto/qat/qat_adf/qat_algs.h
index b73a5d0..b47dbc2 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs.h
+++ b/drivers/crypto/qat/qat_adf/qat_algs.h
@@ -125,5 +125,6 @@ void qat_alg_ablkcipher_init_dec(struct qat_alg_ablkcipher_cd *cd,
 					unsigned int keylen);
 
 int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg);
 
 #endif
diff --git a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
index 534eda0..bcccdf4 100644
--- a/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
+++ b/drivers/crypto/qat/qat_adf/qat_algs_build_desc.c
@@ -82,6 +82,9 @@ static int qat_hash_get_state1_size(enum icp_qat_hw_auth_algo qat_hash_alg)
 	case ICP_QAT_HW_AUTH_ALGO_GALOIS_64:
 		return QAT_HW_ROUND_UP(ICP_QAT_HW_GALOIS_128_STATE1_SZ,
 						QAT_HW_DEFAULT_ALIGNMENT);
+	case ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2:
+		return QAT_HW_ROUND_UP(ICP_QAT_HW_SNOW_3G_UIA2_STATE1_SZ,
+						QAT_HW_DEFAULT_ALIGNMENT);
 	case ICP_QAT_HW_AUTH_ALGO_DELIMITER:
 		/* return maximum state1 size in this case */
 		return QAT_HW_ROUND_UP(ICP_QAT_HW_SHA512_STATE1_SZ,
@@ -376,7 +379,8 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 
 	PMD_INIT_FUNC_TRACE();
 
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER &&
+		cdesc->qat_hash_alg != ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
 		cipher =
 		    (struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
 				sizeof(struct icp_qat_hw_auth_algo_blk));
@@ -409,13 +413,20 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 	else
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
 	/* For Snow3G, set key convert and other bits */
 	if (cdesc->qat_cipher_alg == ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2) {
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
 					ICP_QAT_FW_LA_NO_RET_AUTH_RES);
-		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
-					ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+		if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER)  {
+			ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_RET_AUTH_RES);
+			ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+		}
 	}
 
 	cipher->aes.cipher_config.val =
@@ -431,7 +442,6 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 	/* Request template setup */
 	qat_alg_init_common_hdr(header);
 	header->service_cmd_id = cdesc->qat_cmd;
-
 	ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
 					ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
 	/* Configure the common header protocol flags */
@@ -447,6 +457,10 @@ int qat_alg_aead_session_create_content_desc_cipher(struct qat_session *cdesc,
 		cipher_cd_ctrl->cipher_state_sz =
 			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ >> 3;
 		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+		if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER)  {
+		ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+				ICP_QAT_FW_LA_DIGEST_IN_BUFFER);
+		}
 	} else {
 		cipher_cd_ctrl->cipher_key_sz = cipherkeylen >> 3;
 		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
@@ -492,6 +506,7 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 {
 	struct icp_qat_hw_cipher_algo_blk *cipher;
 	struct icp_qat_hw_auth_algo_blk *hash;
+	struct icp_qat_hw_cipher_algo_blk *cipherconfig;
 	struct icp_qat_fw_la_bulk_req *req_tmpl = &cdesc->fw_req;
 	struct icp_qat_fw_comn_req_hdr_cd_pars *cd_pars = &req_tmpl->cd_pars;
 	struct icp_qat_fw_comn_req_hdr *header = &req_tmpl->comn_hdr;
@@ -510,7 +525,8 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 
 	PMD_INIT_FUNC_TRACE();
 
-	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER) {
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_HASH_CIPHER &&
+		cdesc->qat_hash_alg != ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
 		hash = (struct icp_qat_hw_auth_algo_blk *)&cdesc->cd;
 		cipher =
 		(struct icp_qat_hw_cipher_algo_blk *)((char *)&cdesc->cd +
@@ -549,6 +565,9 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 	else
 		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
 
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2)
+		key_convert = ICP_QAT_HW_CIPHER_KEY_CONVERT;
+
 	cipher->aes.cipher_config.val =
 	    ICP_QAT_HW_CIPHER_CONFIG_BUILD(cdesc->qat_mode,
 					cdesc->qat_cipher_alg, key_convert,
@@ -560,6 +579,22 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 				cdesc->qat_hash_alg, digestsize);
 	hash->sha.inner_setup.auth_counter.counter =
 		rte_bswap32(qat_hash_get_block_size(cdesc->qat_hash_alg));
+	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
+		hash->sha.inner_setup.auth_counter.counter = 0;
+		hash->sha.outer_setup.auth_config.reserved = 0;
+		cipherconfig = (struct icp_qat_hw_cipher_algo_blk *)
+				((char *)&cdesc->cd +
+				sizeof(struct icp_qat_hw_auth_algo_blk)
+				+ 16);
+		cipherconfig->aes.cipher_config.val =
+		ICP_QAT_HW_CIPHER_CONFIG_BUILD(ICP_QAT_HW_CIPHER_ECB_MODE,
+			ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2,
+			ICP_QAT_HW_CIPHER_KEY_CONVERT,
+			ICP_QAT_HW_CIPHER_ENCRYPT);
+		memcpy(cipherconfig->aes.key, authkey, authkeylen);
+		memset(cipherconfig->aes.key + authkeylen, 0,
+			ICP_QAT_HW_SNOW_3G_UEA2_IV_SZ);
+	}
 
 	/* Do precomputes */
 	if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_AES_XCBC_MAC) {
@@ -586,6 +621,9 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 					ICP_QAT_HW_GALOIS_H_SZ]) =
 			rte_bswap32(add_auth_data_length);
 		proto = ICP_QAT_FW_LA_GCM_PROTO;
+	} else if (cdesc->qat_hash_alg == ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2) {
+		proto = ICP_QAT_FW_LA_SNOW_3G_PROTO;
+		state1_size = qat_hash_get_state1_size(cdesc->qat_hash_alg);
 	} else {
 		if (qat_alg_do_precomputes(cdesc->qat_hash_alg,
 			authkey, authkeylen, (uint8_t *)(hash->sha.state1),
@@ -605,10 +643,29 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 	cd_pars->u.s.content_desc_addr = cdesc->cd_paddr;
 	cd_pars->u.s.content_desc_params_sz = sizeof(cdesc->cd) >> 3;
 
+	if (cdesc->qat_cmd == ICP_QAT_FW_LA_CMD_AUTH) {
+		ICP_QAT_FW_LA_DIGEST_IN_BUFFER_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_NO_DIGEST_IN_BUFFER);
+		ICP_QAT_FW_LA_CIPH_IV_FLD_FLAG_SET(header->serv_specif_flags,
+			ICP_QAT_FW_CIPH_IV_64BIT_PTR);
+		ICP_QAT_FW_LA_RET_AUTH_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_RET_AUTH_RES);
+		ICP_QAT_FW_LA_CMP_AUTH_SET(header->serv_specif_flags,
+			ICP_QAT_FW_LA_NO_CMP_AUTH_RES);
+	}
+
 	/* Cipher CD config setup */
 	cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
 	cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
 
+	if (cdesc->qat_cmd != ICP_QAT_FW_LA_CMD_AUTH) {
+		cipher_cd_ctrl->cipher_state_sz = ICP_QAT_HW_AES_BLK_SZ >> 3;
+		cipher_cd_ctrl->cipher_cfg_offset = cipher_offset >> 3;
+	} else {
+		cipher_cd_ctrl->cipher_state_sz = 0;
+		cipher_cd_ctrl->cipher_cfg_offset = 0;
+	}
+
 	/* Auth CD config setup */
 	hash_cd_ctrl->hash_cfg_offset = hash_offset >> 3;
 	hash_cd_ctrl->hash_flags = ICP_QAT_FW_AUTH_HDR_FLAG_NO_NESTED;
@@ -642,6 +699,13 @@ int qat_alg_aead_session_create_content_desc_auth(struct qat_session *cdesc,
 		hash_cd_ctrl->inner_state1_sz = ICP_QAT_HW_GALOIS_128_STATE1_SZ;
 		memset(hash->sha.state1, 0, ICP_QAT_HW_GALOIS_128_STATE1_SZ);
 		break;
+	case ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2:
+		hash_cd_ctrl->inner_state2_sz =
+				ICP_QAT_HW_SNOW_3G_UIA2_STATE2_SZ;
+		hash_cd_ctrl->inner_state1_sz =
+				ICP_QAT_HW_SNOW_3G_UIA2_STATE1_SZ;
+		memset(hash->sha.state1, 0, ICP_QAT_HW_SNOW_3G_UIA2_STATE1_SZ);
+		break;
 	default:
 		PMD_DRV_LOG(ERR, "invalid HASH alg %u", cdesc->qat_hash_alg);
 		return -EFAULT;
@@ -751,3 +815,15 @@ int qat_alg_validate_aes_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
 	}
 	return 0;
 }
+
+int qat_alg_validate_snow3g_key(int key_len, enum icp_qat_hw_cipher_algo *alg)
+{
+	switch (key_len) {
+	case ICP_QAT_HW_SNOW_3G_UEA2_KEY_SZ:
+		*alg = ICP_QAT_HW_CIPHER_ALGO_SNOW_3G_UEA2;
+		break;
+	default:
+		return -EINVAL;
+	}
+	return 0;
+}
diff --git a/drivers/crypto/qat/qat_crypto.c b/drivers/crypto/qat/qat_crypto.c
index 2edae32..e0e506d 100644
--- a/drivers/crypto/qat/qat_crypto.c
+++ b/drivers/crypto/qat/qat_crypto.c
@@ -169,6 +169,14 @@ qat_crypto_sym_configure_session_cipher(struct rte_cryptodev *dev,
 		}
 		session->qat_mode = ICP_QAT_HW_CIPHER_CTR_MODE;
 		break;
+	case RTE_CRYPTO_CIPHER_SNOW3G_UEA2:
+		if (qat_alg_validate_snow3g_key(cipher_xform->key.length,
+					&session->qat_cipher_alg) != 0) {
+			PMD_DRV_LOG(ERR, "Invalid SNOW3G cipher key size");
+			goto error_out;
+		}
+		session->qat_mode = ICP_QAT_HW_CIPHER_ECB_MODE;
+		break;
 	case RTE_CRYPTO_CIPHER_NULL:
 	case RTE_CRYPTO_CIPHER_3DES_ECB:
 	case RTE_CRYPTO_CIPHER_3DES_CBC:
@@ -290,6 +298,9 @@ qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
 	case RTE_CRYPTO_AUTH_AES_GMAC:
 		session->qat_hash_alg = ICP_QAT_HW_AUTH_ALGO_GALOIS_128;
 		break;
+	case RTE_CRYPTO_AUTH_SNOW3G_UIA2:
+		session->qat_hash_alg = ICP_QAT_HW_AUTH_ALGO_SNOW_3G_UIA2;
+		break;
 	case RTE_CRYPTO_AUTH_NULL:
 	case RTE_CRYPTO_AUTH_SHA1:
 	case RTE_CRYPTO_AUTH_SHA256:
@@ -302,7 +313,6 @@ qat_crypto_sym_configure_session_auth(struct rte_cryptodev *dev,
 	case RTE_CRYPTO_AUTH_MD5_HMAC:
 	case RTE_CRYPTO_AUTH_AES_CCM:
 	case RTE_CRYPTO_AUTH_KASUMI_F9:
-	case RTE_CRYPTO_AUTH_SNOW3G_UIA2:
 	case RTE_CRYPTO_AUTH_AES_CMAC:
 	case RTE_CRYPTO_AUTH_AES_CBC_MAC:
 	case RTE_CRYPTO_AUTH_ZUC_EIA3:
-- 
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread
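
For context on how the cipher-only path above is exercised from an application:
a cipher-only SNOW3G (UEA2) session is created by attaching a single cipher
transform with no auth transform chained after it. The following is a minimal
sketch only, assuming a QAT cryptodev has already been configured and started
under dev_id and using the symmetric session API of this release; the helper
name and the snow3g_key buffer are illustrative, not part of the patch:

	#include <string.h>
	#include <rte_crypto.h>
	#include <rte_cryptodev.h>

	/*
	 * Sketch: build a SNOW3G UEA2 cipher-only session.
	 * snow3g_key must point at a 16-byte key.
	 */
	static struct rte_cryptodev_sym_session *
	create_uea2_cipher_only_session(uint8_t dev_id, uint8_t *snow3g_key)
	{
		struct rte_crypto_sym_xform cipher_xform;

		memset(&cipher_xform, 0, sizeof(cipher_xform));
		cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
		cipher_xform.next = NULL;	/* no auth xform: cipher-only */
		cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
		cipher_xform.cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
		cipher_xform.cipher.key.data = snow3g_key;
		cipher_xform.cipher.key.length = 16;

		/* Same session creation call used by the unit tests below */
		return rte_cryptodev_sym_session_create(dev_id, &cipher_xform);
	}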

* [dpdk-dev] [PATCH v4 3/3] app/test: add Snow3G tests
  2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
                       ` (2 preceding siblings ...)
  2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 2/3] qat: add support for Snow3G Deepak Kumar JAIN
@ 2016-03-10 17:12     ` Deepak Kumar JAIN
  2016-03-16  3:27     ` [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices Cao, Min
  4 siblings, 0 replies; 22+ messages in thread
From: Deepak Kumar JAIN @ 2016-03-10 17:12 UTC (permalink / raw)
  To: dev

Signed-off-by: Deepak Kumar JAIN <deepak.k.jain@intel.com>
---
 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 4 files changed, 1831 insertions(+), 3 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c
index 3240ecd..0fe47b9 100644
--- a/app/test/test_cryptodev.c
+++ b/app/test/test_cryptodev.c
@@ -42,7 +42,8 @@
 
 #include "test.h"
 #include "test_cryptodev.h"
-
+#include "test_cryptodev_snow3g_test_vectors.h"
+#include "test_cryptodev_snow3g_hash_test_vectors.h"
 static enum rte_cryptodev_type gbl_cryptodev_type;
 
 struct crypto_testsuite_params {
@@ -68,6 +69,9 @@ struct crypto_unittest_params {
 	uint8_t *digest;
 };
 
+#define ALIGN_POW2_ROUNDUP(num, align) \
+	(((num) + (align) - 1) & ~((align) - 1))
+
 /*
  * Forward declarations.
  */
@@ -1747,6 +1751,997 @@ test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify(void)
 	return TEST_SUCCESS;
 }
 
+/* ***** Snow3G Tests ***** */
+static int
+create_snow3g_hash_session(uint8_t dev_id,
+	const uint8_t *key, const uint8_t key_len,
+	const uint8_t aad_len, const uint8_t auth_len,
+	enum rte_crypto_auth_operation op)
+{
+	uint8_t hash_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(hash_key, key, key_len);
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.next = NULL;
+
+	ut_params->auth_xform.auth.op = op;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	ut_params->auth_xform.auth.key.data = hash_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->auth_xform);
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+static int
+create_snow3g_cipher_session(uint8_t dev_id,
+			enum rte_crypto_cipher_operation op,
+			const uint8_t *key, const uint8_t key_len)
+{
+	uint8_t cipher_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(cipher_key, key, key_len);
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = NULL;
+
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = op;
+	ut_params->cipher_xform.cipher.key.data = cipher_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session */
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+						&ut_params->cipher_xform);
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+
+static int
+create_snow3g_cipher_operation(const uint8_t *iv, const unsigned iv_len,
+			const unsigned data_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+	unsigned iv_pad_len = 0;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+				RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+				"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(ut_params->ibuf
+			, iv_pad_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = iv_pad_len;
+	return 0;
+}
+
+
+static int
+create_snow3g_cipher_auth_session(uint8_t dev_id,
+		enum rte_crypto_cipher_operation cipher_op,
+		enum rte_crypto_auth_operation auth_op,
+		const uint8_t *key, const uint8_t key_len,
+		const uint8_t aad_len, const uint8_t auth_len)
+{
+	uint8_t cipher_auth_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(cipher_auth_key, key, key_len);
+
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.next = NULL;
+
+	ut_params->auth_xform.auth.op = auth_op;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	/* Hash key = cipher key */
+	ut_params->auth_xform.auth.key.data = cipher_auth_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = &ut_params->auth_xform;
+
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = cipher_op;
+	ut_params->cipher_xform.cipher.key.data = cipher_auth_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session*/
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->cipher_xform);
+
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+	return 0;
+}
+
+static int
+create_snow3g_auth_cipher_session(uint8_t dev_id,
+		enum rte_crypto_cipher_operation cipher_op,
+		enum rte_crypto_auth_operation auth_op,
+		const uint8_t *key, const uint8_t key_len,
+		const uint8_t aad_len, const uint8_t auth_len)
+{
+	uint8_t auth_cipher_key[key_len];
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	memcpy(auth_cipher_key, key, key_len);
+
+	/* Setup Authentication Parameters */
+	ut_params->auth_xform.type = RTE_CRYPTO_SYM_XFORM_AUTH;
+	ut_params->auth_xform.auth.op = auth_op;
+	ut_params->auth_xform.next = &ut_params->cipher_xform;
+	ut_params->auth_xform.auth.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2;
+	ut_params->auth_xform.auth.key.length = key_len;
+	ut_params->auth_xform.auth.key.data = auth_cipher_key;
+	ut_params->auth_xform.auth.digest_length = auth_len;
+	ut_params->auth_xform.auth.add_auth_data_length = aad_len;
+
+	/* Setup Cipher Parameters */
+	ut_params->cipher_xform.type = RTE_CRYPTO_SYM_XFORM_CIPHER;
+	ut_params->cipher_xform.next = NULL;
+	ut_params->cipher_xform.cipher.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2;
+	ut_params->cipher_xform.cipher.op = cipher_op;
+	ut_params->cipher_xform.cipher.key.data = auth_cipher_key;
+	ut_params->cipher_xform.cipher.key.length = key_len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "key:", key, key_len);
+#endif
+	/* Create Crypto session*/
+	ut_params->sess = rte_cryptodev_sym_session_create(dev_id,
+				&ut_params->auth_xform);
+
+	TEST_ASSERT_NOT_NULL(ut_params->sess, "Session creation failed");
+
+	return 0;
+}
+
+static int
+create_snow3g_hash_operation(const uint8_t *auth_tag,
+		const unsigned auth_tag_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len,
+		enum rte_crypto_auth_operation op)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned aad_buffer_len;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+		"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* aad */
+	/*
+	 * Always allocate the aad up to the block size.
+	 * The cryptodev API requires the array to be big enough to hold
+	 * the AAD, plus any space needed to round this up to the nearest
+	 * multiple of the block size (16 bytes).
+	 */
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+	sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend(
+			ut_params->ibuf, aad_buffer_len);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+					"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+			ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+					ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+				"no room to append auth tag");
+	ut_params->digest = sym_op->auth.digest.data;
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len + aad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+	if (op == RTE_CRYPTO_AUTH_OP_GENERATE)
+		memset(sym_op->auth.digest.data, 0, auth_tag_len);
+	else
+		rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "digest:",
+		sym_op->auth.digest.data,
+		sym_op->auth.digest.length);
+#endif
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len;
+
+	return 0;
+}
+
+static int
+create_snow3g_cipher_hash_operation(const uint8_t *auth_tag,
+		const unsigned auth_tag_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len,
+		enum rte_crypto_auth_operation op,
+		const uint8_t *iv, const unsigned iv_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned iv_pad_len = 0;
+	unsigned aad_buffer_len;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+			"Failed to allocate pktmbuf offload");
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(
+		ut_params->ibuf, iv_pad_len);
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = iv_pad_len;
+
+	/* aad */
+	/*
+	 * Always allocate the aad up to the block size.
+	 * The cryptodev API requires the array to be big enough to hold
+	 * the AAD, plus any space needed to round this up to the nearest
+	 * multiple of the block size (16 bytes).
+	 */
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+
+	sym_op->auth.aad.data =
+			(uint8_t *)rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+			"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+			ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+			ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+			"no room to append auth tag");
+	ut_params->digest = sym_op->auth.digest.data;
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len + aad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+	if (op == RTE_CRYPTO_AUTH_OP_GENERATE)
+		memset(sym_op->auth.digest.data, 0, auth_tag_len);
+	else
+		rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len);
+
+	#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "digest:",
+		sym_op->auth.digest.data,
+		sym_op->auth.digest.length);
+	#endif
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len;
+
+	return 0;
+}
+
+static int
+create_snow3g_auth_cipher_operation(const unsigned auth_tag_len,
+		const uint8_t *iv, const unsigned iv_len,
+		const uint8_t *aad, const unsigned aad_len,
+		const unsigned data_len, unsigned data_pad_len)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	unsigned iv_pad_len = 0;
+	unsigned aad_buffer_len = 0;
+
+	/* Generate Crypto op data structure */
+	ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool,
+			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
+	TEST_ASSERT_NOT_NULL(ut_params->op,
+			"Failed to allocate pktmbuf offload");
+
+	/* Set crypto operation data parameters */
+	rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess);
+
+	struct rte_crypto_sym_op *sym_op = ut_params->op->sym;
+
+	/* set crypto operation source mbuf */
+	sym_op->m_src = ut_params->ibuf;
+
+	/* digest */
+	sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append(
+			ut_params->ibuf, auth_tag_len);
+
+	TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data,
+			"no room to append auth tag");
+
+	sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset(
+			ut_params->ibuf, data_pad_len);
+	sym_op->auth.digest.length = auth_tag_len;
+
+	memset(sym_op->auth.digest.data, 0, auth_tag_len);
+
+	#ifdef RTE_APP_TEST_DEBUG
+		rte_hexdump(stdout, "digest:",
+			sym_op->auth.digest.data,
+			sym_op->auth.digest.length);
+	#endif
+	/* iv */
+	iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16);
+
+	sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend(
+		ut_params->ibuf, iv_pad_len);
+	TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv");
+
+	memset(sym_op->cipher.iv.data, 0, iv_pad_len);
+	sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf);
+	sym_op->cipher.iv.length = iv_pad_len;
+
+	rte_memcpy(sym_op->cipher.iv.data, iv, iv_len);
+
+	/* aad */
+	/*
+	 * Always allocate the aad up to the block size.
+	 * The cryptodev API requires the array to be big enough to hold
+	 * the AAD, plus any space needed to round this up to the nearest
+	 * multiple of the block size (16 bytes).
+	 */
+	aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16);
+
+	sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend(
+			ut_params->ibuf, aad_buffer_len);
+	TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data,
+				"no room to prepend aad");
+	sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys(
+				ut_params->ibuf);
+	sym_op->auth.aad.length = aad_len;
+
+	memset(sym_op->auth.aad.data, 0, aad_buffer_len);
+	rte_memcpy(sym_op->auth.aad.data, aad, aad_len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "aad:",
+			sym_op->auth.aad.data, aad_len);
+#endif
+
+	sym_op->cipher.data.length = data_len;
+	sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len;
+
+	sym_op->auth.data.length = data_len;
+	sym_op->auth.data.offset = aad_buffer_len + iv_pad_len;
+
+	return 0;
+}
+
+static int
+test_snow3g_authentication(const struct snow3g_hash_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+	unsigned plaintext_pad_len;
+	uint8_t *plaintext;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_hash_session(ts_params->valid_devs[0],
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len,
+			RTE_CRYPTO_AUTH_OP_GENERATE);
+	if (retval < 0)
+		return retval;
+
+	/* alloc mbuf and set payload */
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a multiple of
+	 * the algorithm's block size
+	 */
+	plaintext_pad_len = tdata->plaintext.len;
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+				plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_hash_operation(NULL, tdata->digest.len,
+			tdata->aad.data, tdata->aad.len, tdata->plaintext.len,
+			plaintext_pad_len, RTE_CRYPTO_AUTH_OP_GENERATE);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+				ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+			+ plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+			ut_params->digest,
+			tdata->digest.data,
+			DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+			"Snow3G Generated auth tag not as expected");
+
+	return 0;
+}
+
+static int
+test_snow3g_authentication_verify(const struct snow3g_hash_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+	unsigned plaintext_pad_len;
+	uint8_t *plaintext;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_hash_session(ts_params->valid_devs[0],
+				tdata->key.data, tdata->key.len,
+				tdata->aad.len, tdata->digest.len,
+				RTE_CRYPTO_AUTH_OP_VERIFY);
+	if (retval < 0)
+		return retval;
+	/* alloc mbuf and set payload */
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a multiple of
+	 * the algorithm's block size
+	 */
+	plaintext_pad_len = tdata->plaintext.len;
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+					plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_hash_operation(tdata->digest.data,
+			tdata->digest.len,
+			tdata->aad.data, tdata->aad.len,
+			tdata->plaintext.len, plaintext_pad_len,
+			RTE_CRYPTO_AUTH_OP_VERIFY);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+				ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	if (ut_params->op->status != RTE_CRYPTO_OP_STATUS_SUCCESS)
+		return -1;
+
+	return 0;
+}
+
+
+static int
+test_snow3g_hash_generate_test_case_1(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_1);
+}
+
+static int
+test_snow3g_hash_generate_test_case_2(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_2);
+}
+
+static int
+test_snow3g_hash_generate_test_case_3(void)
+{
+	return test_snow3g_authentication(&snow3g_hash_test_case_3);
+}
+
+static int
+test_snow3g_hash_verify_test_case_1(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_1);
+
+}
+
+static int
+test_snow3g_hash_verify_test_case_2(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_2);
+}
+
+static int
+test_snow3g_hash_verify_test_case_3(void)
+{
+	return test_snow3g_authentication_verify(&snow3g_hash_test_case_3);
+}
+
+static int
+test_snow3g_encryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	unsigned plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithm's block size
+	 */
+	plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16);
+
+	plaintext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->plaintext.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		ciphertext = plaintext;
+
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ciphertext,
+		tdata->ciphertext.data,
+		tdata->ciphertext.len,
+		"Snow3G Ciphertext data not as expected");
+	return 0;
+}
+
+static int test_snow3g_decryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	unsigned ciphertext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_session(ts_params->valid_devs[0],
+					RTE_CRYPTO_CIPHER_OP_DECRYPT,
+					tdata->key.data, tdata->key.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* Clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+	       rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a
+	 * multiple of the algorithm's block size
+	 */
+	ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16);
+
+	ciphertext = (uint8_t *) rte_pktmbuf_append(ut_params->ibuf,
+						ciphertext_pad_len);
+	memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_operation(tdata->iv.data, tdata->iv.len,
+						tdata->ciphertext.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+						ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		plaintext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		plaintext = ciphertext;
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(plaintext,
+				tdata->plaintext.data,
+				tdata->plaintext.len,
+				"Snow3G Plaintext data not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_authenticated_encryption(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	unsigned plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_cipher_auth_session(ts_params->valid_devs[0],
+			RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+			RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len);
+	if (retval < 0)
+		return retval;
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a multiple of
+	 * the algorithm's block size
+	 */
+	plaintext_pad_len = tdata->plaintext.len;
+
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+			plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_cipher_hash_operation(tdata->digest.data,
+			tdata->digest.len, tdata->aad.data,
+			tdata->aad.len, tdata->plaintext.len,
+			plaintext_pad_len, RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->iv.data, tdata->iv.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+			ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->iv.len;
+	else
+		ciphertext = plaintext;
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+			ciphertext,
+			tdata->ciphertext.data,
+			tdata->ciphertext.len,
+			"Snow3G Ciphertext data not as expected");
+
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+	    + plaintext_pad_len + tdata->aad.len;
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+			ut_params->digest,
+			tdata->digest.data,
+			DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+			"Snow3G Generated auth tag not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_encrypted_authentication(const struct snow3g_test_data *tdata)
+{
+	struct crypto_testsuite_params *ts_params = &testsuite_params;
+	struct crypto_unittest_params *ut_params = &unittest_params;
+
+	int retval;
+
+	uint8_t *plaintext, *ciphertext;
+	unsigned plaintext_pad_len;
+	uint8_t lastByteValidBits = 8;
+	uint8_t lastByteMask = 0xFF;
+
+	/* Create SNOW3G session */
+	retval = create_snow3g_auth_cipher_session(ts_params->valid_devs[0],
+			RTE_CRYPTO_CIPHER_OP_ENCRYPT,
+			RTE_CRYPTO_AUTH_OP_GENERATE,
+			tdata->key.data, tdata->key.len,
+			tdata->aad.len, tdata->digest.len);
+	if (retval < 0)
+		return retval;
+
+	ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool);
+
+	/* clear mbuf payload */
+	memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0,
+			rte_pktmbuf_tailroom(ut_params->ibuf));
+
+	/*
+	 * Append data which is padded to a multiple of
+	 * the algorithm's block size
+	 */
+	plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 8);
+
+	plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf,
+			plaintext_pad_len);
+	memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len);
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "plaintext:", plaintext, tdata->plaintext.len);
+#endif
+
+	/* Create SNOW3G operation */
+	retval = create_snow3g_auth_cipher_operation(
+		tdata->digest.len,
+		tdata->iv.data, tdata->iv.len,
+		tdata->aad.data, tdata->aad.len,
+		tdata->plaintext.len, plaintext_pad_len
+	);
+
+	if (retval < 0)
+		return retval;
+
+	ut_params->op = process_crypto_request(ts_params->valid_devs[0],
+			ut_params->op);
+	TEST_ASSERT_NOT_NULL(ut_params->op, "failed to retrieve obuf");
+	ut_params->obuf = ut_params->op->sym->m_src;
+	if (ut_params->obuf)
+		ciphertext = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+				+ tdata->aad.len + tdata->iv.len;
+	else
+		ciphertext = plaintext;
+
+	lastByteValidBits = (tdata->validDataLenInBits.len % 8);
+	if (lastByteValidBits == 0)
+		lastByteValidBits = 8;
+	lastByteMask = lastByteMask << (8 - lastByteValidBits);
+	(*(ciphertext + tdata->ciphertext.len - 1)) &= lastByteMask;
+	ut_params->digest = rte_pktmbuf_mtod(ut_params->obuf, uint8_t *)
+			+ plaintext_pad_len + tdata->aad.len + tdata->iv.len;
+
+#ifdef RTE_APP_TEST_DEBUG
+	rte_hexdump(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len);
+#endif
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ciphertext,
+		tdata->ciphertext.data,
+		tdata->ciphertext.len,
+		"Snow3G Ciphertext data not as expected");
+
+	/* Validate obuf */
+	TEST_ASSERT_BUFFERS_ARE_EQUAL(
+		ut_params->digest,
+		tdata->digest.data,
+		DIGEST_BYTE_LENGTH_SNOW3G_UIA2,
+		"Snow3G Generated auth tag not as expected");
+	return 0;
+}
+
+static int
+test_snow3g_encryption_test_case_1(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_encryption_test_case_2(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_encryption_test_case_3(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_encryption_test_case_4(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_encryption_test_case_5(void)
+{
+	return test_snow3g_encryption(&snow3g_test_case_5);
+}
+
+static int
+test_snow3g_decryption_test_case_1(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_1);
+}
+
+static int
+test_snow3g_decryption_test_case_2(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_2);
+}
+
+static int
+test_snow3g_decryption_test_case_3(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_decryption_test_case_4(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_4);
+}
+
+static int
+test_snow3g_decryption_test_case_5(void)
+{
+	return test_snow3g_decryption(&snow3g_test_case_5);
+}
+
+static int
+test_snow3g_authenticated_encryption_test_case_1(void)
+{
+	return test_snow3g_authenticated_encryption(&snow3g_test_case_3);
+}
+
+static int
+test_snow3g_encrypted_authentication_test_case_1(void)
+{
+	return test_snow3g_encrypted_authentication(&snow3g_test_case_6);
+}
 
 /* ***** AES-GCM Tests ***** */
 
@@ -1986,9 +2981,47 @@ static struct unit_test_suite cryptodev_qat_testsuite  = {
 				test_AES_CBC_HMAC_AES_XCBC_encrypt_digest),
 		TEST_CASE_ST(ut_setup, ut_teardown,
 				test_AES_CBC_HMAC_AES_XCBC_decrypt_digest_verify),
-
 		TEST_CASE_ST(ut_setup, ut_teardown, test_stats),
+		/** Snow3G encrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encryption_test_case_5),
 
+		/** Snow3G decrypt only (UEA2) */
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_4),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_decryption_test_case_5),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_generate_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_2),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_hash_verify_test_case_3),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_authenticated_encryption_test_case_1),
+		TEST_CASE_ST(ut_setup, ut_teardown,
+			test_snow3g_encrypted_authentication_test_case_1),
 		TEST_CASES_END() /**< NULL terminate unit test array */
 	}
 };
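
A note on the bit-length handling seen in the cipher tests above: SNOW3G UEA2
test vectors are defined in bits, so before comparing against the expected data
the tests mask off the bits of the final byte that fall outside
validDataLenInBits. A small stand-alone sketch of that masking step, mirroring
the lastByteMask logic used above (the helper name is illustrative, not part of
the patch):

	#include <stdint.h>

	/* Clear the trailing bits of the last byte beyond valid_len_in_bits. */
	static void
	mask_trailing_bits(uint8_t *data, unsigned data_len_bytes,
			unsigned valid_len_in_bits)
	{
		unsigned valid_bits = valid_len_in_bits % 8;
		uint8_t mask = 0xFF;

		if (valid_bits == 0)
			valid_bits = 8;

		/* e.g. 5 valid bits -> mask = 0xF8, keeping the top 5 bits */
		mask <<= (8 - valid_bits);
		data[data_len_bytes - 1] &= mask;
	}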
diff --git a/app/test/test_cryptodev.h b/app/test/test_cryptodev.h
index 083266a..6059a01 100644
--- a/app/test/test_cryptodev.h
+++ b/app/test/test_cryptodev.h
@@ -1,7 +1,7 @@
 /*-
  *   BSD LICENSE
  *
- *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *   Copyright(c) 2015-2016 Intel Corporation. All rights reserved.
  *
  *   Redistribution and use in source and binary forms, with or without
  *   modification, are permitted provided that the following conditions
@@ -60,6 +60,7 @@
 #define DIGEST_BYTE_LENGTH_SHA384		(BYTE_LENGTH(384))
 #define DIGEST_BYTE_LENGTH_SHA512		(BYTE_LENGTH(512))
 #define DIGEST_BYTE_LENGTH_AES_XCBC		(BYTE_LENGTH(96))
+#define DIGEST_BYTE_LENGTH_SNOW3G_UIA2		(BYTE_LENGTH(32))
 #define AES_XCBC_MAC_KEY_SZ			(16)
 
 #define TRUNCATED_DIGEST_BYTE_LENGTH_SHA1		(12)
diff --git a/app/test/test_cryptodev_snow3g_hash_test_vectors.h b/app/test/test_cryptodev_snow3g_hash_test_vectors.h
new file mode 100644
index 0000000..f4fa36d
--- /dev/null
+++ b/app/test/test_cryptodev_snow3g_hash_test_vectors.h
@@ -0,0 +1,415 @@
+/*-
+ *   BSD LICENSE
+ *
+ *   Copyright(c) 2016 Intel Corporation. All rights reserved.
+ *
+ *   Redistribution and use in source and binary forms, with or without
+ *   modification, are permitted provided that the following conditions
+ *   are met:
+ *
+ *	 * Redistributions of source code must retain the above copyright
+ *	   notice, this list of conditions and the following disclaimer.
+ *	 * Redistributions in binary form must reproduce the above copyright
+ *	   notice, this list of conditions and the following disclaimer in
+ *	   the documentation and/or other materials provided with the
+ *	   distribution.
+ *	 * Neither the name of Intel Corporation nor the names of its
+ *	   contributors may be used to endorse or promote products derived
+ *	   from this software without specific prior written permission.
+ *
+ *   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ *   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ *   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ *   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ *   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ *   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ *   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ *   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ *   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ *   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ *   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#ifndef TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_
+#define TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_
+
+struct snow3g_hash_test_data {
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} key;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} aad;
+
+	struct {
+		uint8_t data[2056];
+		unsigned len;
+	} plaintext;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} digest;
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_1 = {
+	.key = {
+		.data = {
+			0xC7, 0x36, 0xC6, 0xAA, 0xB2, 0x2B, 0xFF, 0xF9,
+			0x1E, 0x26, 0x98, 0xD2, 0xE2, 0x2A, 0xD5, 0x7E
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xD0, 0xA7, 0xD4, 0x63, 0xDF, 0x9F, 0xB2, 0xB2,
+			0x78, 0x83, 0x3F, 0xA0, 0x2E, 0x23, 0x5A, 0xA1,
+			0x72, 0xBD, 0x97, 0x0C, 0x14, 0x73, 0xE1, 0x29,
+			0x07, 0xFB, 0x64, 0x8B, 0x65, 0x99, 0xAA, 0xA0,
+			0xB2, 0x4A, 0x03, 0x86, 0x65, 0x42, 0x2B, 0x20,
+			0xA4, 0x99, 0x27, 0x6A, 0x50, 0x42, 0x70, 0x09
+		},
+		.len = 48
+	},
+	.digest = {
+		.data = {0x38, 0xB5, 0x54, 0xC0 },
+		.len  = 4
+	}
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_2 = {
+	.key = {
+		.data = {
+			0xF4, 0xEB, 0xEC, 0x69, 0xE7, 0x3E, 0xAF, 0x2E,
+			0xB2, 0xCF, 0x6A, 0xF4, 0xB3, 0x12, 0x0F, 0xFD
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x29, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0x77, 0x37,
+			0xA9, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0xF7, 0x37
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x10, 0xBF, 0xFF, 0x83, 0x9E, 0x0C, 0x71, 0x65,
+			0x8D, 0xBB, 0x2D, 0x17, 0x07, 0xE1, 0x45, 0x72,
+			0x4F, 0x41, 0xC1, 0x6F, 0x48, 0xBF, 0x40, 0x3C,
+			0x3B, 0x18, 0xE3, 0x8F, 0xD5, 0xD1, 0x66, 0x3B,
+			0x6F, 0x6D, 0x90, 0x01, 0x93, 0xE3, 0xCE, 0xA8,
+			0xBB, 0x4F, 0x1B, 0x4F, 0x5B, 0xE8, 0x22, 0x03,
+			0x22, 0x32, 0xA7, 0x8D, 0x7D, 0x75, 0x23, 0x8D,
+			0x5E, 0x6D, 0xAE, 0xCD, 0x3B, 0x43, 0x22, 0xCF,
+			0x59, 0xBC, 0x7E, 0xA8, 0x4A, 0xB1, 0x88, 0x11,
+			0xB5, 0xBF, 0xB7, 0xBC, 0x55, 0x3F, 0x4F, 0xE4,
+			0x44, 0x78, 0xCE, 0x28, 0x7A, 0x14, 0x87, 0x99,
+			0x90, 0xD1, 0x8D, 0x12, 0xCA, 0x79, 0xD2, 0xC8,
+			0x55, 0x14, 0x90, 0x21, 0xCD, 0x5C, 0xE8, 0xCA,
+			0x03, 0x71, 0xCA, 0x04, 0xFC, 0xCE, 0x14, 0x3E,
+			0x3D, 0x7C, 0xFE, 0xE9, 0x45, 0x85, 0xB5, 0x88,
+			0x5C, 0xAC, 0x46, 0x06, 0x8B
+		},
+	.len = 125
+	},
+	.digest = {
+		.data = {0x06, 0x17, 0x45, 0xAE},
+		.len  = 4
+	}
+};
+
+struct snow3g_hash_test_data snow3g_hash_test_case_3 = {
+	.key = {
+		.data = {
+			0xB3, 0x12, 0x0F, 0xFD, 0xB2, 0xCF, 0x6A, 0xF4,
+			0xE7, 0x3E, 0xAF, 0x2E, 0xF4, 0xEB, 0xEC, 0x69
+		},
+	.len = 16
+	},
+	.aad = {
+		.data = {
+			0x29, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0x77, 0x37,
+			0xA9, 0x6F, 0x39, 0x3C, 0x6B, 0x22, 0xF7, 0x37
+		},
+	.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
+			0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01, 0x01,
+			0xE0, 0x95, 0x80, 0x45, 0xF3, 0xA0, 0xBB, 0xA4,
+			0xE3, 0x96, 0x83, 0x46, 0xF0, 0xA3, 0xB8, 0xA7,
+			0xC0, 0x2A, 0x01, 0x8A, 0xE6, 0x40, 0x76, 0x52,
+			0x26, 0xB9, 0x87, 0xC9, 0x13, 0xE6, 0xCB, 0xF0,
+			0x83, 0x57, 0x00, 0x16, 0xCF, 0x83, 0xEF, 0xBC,
+			0x61, 0xC0, 0x82, 0x51, 0x3E, 0x21, 0x56, 0x1A,
+			0x42, 0x7C, 0x00, 0x9D, 0x28, 0xC2, 0x98, 0xEF,
+			0xAC, 0xE7, 0x8E, 0xD6, 0xD5, 0x6C, 0x2D, 0x45,
+			0x05, 0xAD, 0x03, 0x2E, 0x9C, 0x04, 0xDC, 0x60,
+			0xE7, 0x3A, 0x81, 0x69, 0x6D, 0xA6, 0x65, 0xC6,
+			0xC4, 0x86, 0x03, 0xA5, 0x7B, 0x45, 0xAB, 0x33,
+			0x22, 0x15, 0x85, 0xE6, 0x8E, 0xE3, 0x16, 0x91,
+			0x87, 0xFB, 0x02, 0x39, 0x52, 0x86, 0x32, 0xDD,
+			0x65, 0x6C, 0x80, 0x7E, 0xA3, 0x24, 0x8B, 0x7B,
+			0x46, 0xD0, 0x02, 0xB2, 0xB5, 0xC7, 0x45, 0x8E,
+			0xB8, 0x5B, 0x9C, 0xE9, 0x58, 0x79, 0xE0, 0x34,
+			0x08, 0x59, 0x05, 0x5E, 0x3B, 0x0A, 0xBB, 0xC3,
+			0xEA, 0xCE, 0x87, 0x19, 0xCA, 0xA8, 0x02, 0x65,
+			0xC9, 0x72, 0x05, 0xD5, 0xDC, 0x4B, 0xCC, 0x90,
+			0x2F, 0xE1, 0x83, 0x96, 0x29, 0xED, 0x71, 0x32,
+			0x8A, 0x0F, 0x04, 0x49, 0xF5, 0x88, 0x55, 0x7E,
+			0x68, 0x98, 0x86, 0x0E, 0x04, 0x2A, 0xEC, 0xD8,
+			0x4B, 0x24, 0x04, 0xC2, 0x12, 0xC9, 0x22, 0x2D,
+			0xA5, 0xBF, 0x8A, 0x89, 0xEF, 0x67, 0x97, 0x87,
+			0x0C, 0xF5, 0x07, 0x71, 0xA6, 0x0F, 0x66, 0xA2,
+			0xEE, 0x62, 0x85, 0x36, 0x57, 0xAD, 0xDF, 0x04,
+			0xCD, 0xDE, 0x07, 0xFA, 0x41, 0x4E, 0x11, 0xF1,
+			0x2B, 0x4D, 0x81, 0xB9, 0xB4, 0xE8, 0xAC, 0x53,
+			0x8E, 0xA3, 0x06, 0x66, 0x68, 0x8D, 0x88, 0x1F,
+			0x6C, 0x34, 0x84, 0x21, 0x99, 0x2F, 0x31, 0xB9,
+			0x4F, 0x88, 0x06, 0xED, 0x8F, 0xCC, 0xFF, 0x4C,
+			0x91, 0x23, 0xB8, 0x96, 0x42, 0x52, 0x7A, 0xD6,
+			0x13, 0xB1, 0x09, 0xBF, 0x75, 0x16, 0x74, 0x85,
+			0xF1, 0x26, 0x8B, 0xF8, 0x84, 0xB4, 0xCD, 0x23,
+			0xD2, 0x9A, 0x09, 0x34, 0x92, 0x57, 0x03, 0xD6,
+			0x34, 0x09, 0x8F, 0x77, 0x67, 0xF1, 0xBE, 0x74,
+			0x91, 0xE7, 0x08, 0xA8, 0xBB, 0x94, 0x9A, 0x38,
+			0x73, 0x70, 0x8A, 0xEF, 0x4A, 0x36, 0x23, 0x9E,
+			0x50, 0xCC, 0x08, 0x23, 0x5C, 0xD5, 0xED, 0x6B,
+			0xBE, 0x57, 0x86, 0x68, 0xA1, 0x7B, 0x58, 0xC1,
+			0x17, 0x1D, 0x0B, 0x90, 0xE8, 0x13, 0xA9, 0xE4,
+			0xF5, 0x8A, 0x89, 0xD7, 0x19, 0xB1, 0x10, 0x42,
+			0xD6, 0x36, 0x0B, 0x1B, 0x0F, 0x52, 0xDE, 0xB7,
+			0x30, 0xA5, 0x8D, 0x58, 0xFA, 0xF4, 0x63, 0x15,
+			0x95, 0x4B, 0x0A, 0x87, 0x26, 0x91, 0x47, 0x59,
+			0x77, 0xDC, 0x88, 0xC0, 0xD7, 0x33, 0xFE, 0xFF,
+			0x54, 0x60, 0x0A, 0x0C, 0xC1, 0xD0, 0x30, 0x0A,
+			0xAA, 0xEB, 0x94, 0x57, 0x2C, 0x6E, 0x95, 0xB0,
+			0x1A, 0xE9, 0x0D, 0xE0, 0x4F, 0x1D, 0xCE, 0x47,
+			0xF8, 0x7E, 0x8F, 0xA7, 0xBE, 0xBF, 0x77, 0xE1,
+			0xDB, 0xC2, 0x0D, 0x6B, 0xA8, 0x5C, 0xB9, 0x14,
+			0x3D, 0x51, 0x8B, 0x28, 0x5D, 0xFA, 0x04, 0xB6,
+			0x98, 0xBF, 0x0C, 0xF7, 0x81, 0x9F, 0x20, 0xFA,
+			0x7A, 0x28, 0x8E, 0xB0, 0x70, 0x3D, 0x99, 0x5C,
+			0x59, 0x94, 0x0C, 0x7C, 0x66, 0xDE, 0x57, 0xA9,
+			0xB7, 0x0F, 0x82, 0x37, 0x9B, 0x70, 0xE2, 0x03,
+			0x1E, 0x45, 0x0F, 0xCF, 0xD2, 0x18, 0x13, 0x26,
+			0xFC, 0xD2, 0x8D, 0x88, 0x23, 0xBA, 0xAA, 0x80,
+			0xDF, 0x6E, 0x0F, 0x44, 0x35, 0x59, 0x64, 0x75,
+			0x39, 0xFD, 0x89, 0x07, 0xC0, 0xFF, 0xD9, 0xD7,
+			0x9C, 0x13, 0x0E, 0xD8, 0x1C, 0x9A, 0xFD, 0x9B,
+			0x7E, 0x84, 0x8C, 0x9F, 0xED, 0x38, 0x44, 0x3D,
+			0x5D, 0x38, 0x0E, 0x53, 0xFB, 0xDB, 0x8A, 0xC8,
+			0xC3, 0xD3, 0xF0, 0x68, 0x76, 0x05, 0x4F, 0x12,
+			0x24, 0x61, 0x10, 0x7D, 0xE9, 0x2F, 0xEA, 0x09,
+			0xC6, 0xF6, 0x92, 0x3A, 0x18, 0x8D, 0x53, 0xAF,
+			0xE5, 0x4A, 0x10, 0xF6, 0x0E, 0x6E, 0x9D, 0x5A,
+			0x03, 0xD9, 0x96, 0xB5, 0xFB, 0xC8, 0x20, 0xF8,
+			0xA6, 0x37, 0x11, 0x6A, 0x27, 0xAD, 0x04, 0xB4,
+			0x44, 0xA0, 0x93, 0x2D, 0xD6, 0x0F, 0xBD, 0x12,
+			0x67, 0x1C, 0x11, 0xE1, 0xC0, 0xEC, 0x73, 0xE7,
+			0x89, 0x87, 0x9F, 0xAA, 0x3D, 0x42, 0xC6, 0x4D,
+			0x20, 0xCD, 0x12, 0x52, 0x74, 0x2A, 0x37, 0x68,
+			0xC2, 0x5A, 0x90, 0x15, 0x85, 0x88, 0x8E, 0xCE,
+			0xE1, 0xE6, 0x12, 0xD9, 0x93, 0x6B, 0x40, 0x3B,
+			0x07, 0x75, 0x94, 0x9A, 0x66, 0xCD, 0xFD, 0x99,
+			0xA2, 0x9B, 0x13, 0x45, 0xBA, 0xA8, 0xD9, 0xD5,
+			0x40, 0x0C, 0x91, 0x02, 0x4B, 0x0A, 0x60, 0x73,
+			0x63, 0xB0, 0x13, 0xCE, 0x5D, 0xE9, 0xAE, 0x86,
+			0x9D, 0x3B, 0x8D, 0x95, 0xB0, 0x57, 0x0B, 0x3C,
+			0x2D, 0x39, 0x14, 0x22, 0xD3, 0x24, 0x50, 0xCB,
+			0xCF, 0xAE, 0x96, 0x65, 0x22, 0x86, 0xE9, 0x6D,
+			0xEC, 0x12, 0x14, 0xA9, 0x34, 0x65, 0x27, 0x98,
+			0x0A, 0x81, 0x92, 0xEA, 0xC1, 0xC3, 0x9A, 0x3A,
+			0xAF, 0x6F, 0x15, 0x35, 0x1D, 0xA6, 0xBE, 0x76,
+			0x4D, 0xF8, 0x97, 0x72, 0xEC, 0x04, 0x07, 0xD0,
+			0x6E, 0x44, 0x15, 0xBE, 0xFA, 0xE7, 0xC9, 0x25,
+			0x80, 0xDF, 0x9B, 0xF5, 0x07, 0x49, 0x7C, 0x8F,
+			0x29, 0x95, 0x16, 0x0D, 0x4E, 0x21, 0x8D, 0xAA,
+			0xCB, 0x02, 0x94, 0x4A, 0xBF, 0x83, 0x34, 0x0C,
+			0xE8, 0xBE, 0x16, 0x86, 0xA9, 0x60, 0xFA, 0xF9,
+			0x0E, 0x2D, 0x90, 0xC5, 0x5C, 0xC6, 0x47, 0x5B,
+			0xAB, 0xC3, 0x17, 0x1A, 0x80, 0xA3, 0x63, 0x17,
+			0x49, 0x54, 0x95, 0x5D, 0x71, 0x01, 0xDA, 0xB1,
+			0x6A, 0xE8, 0x17, 0x91, 0x67, 0xE2, 0x14, 0x44,
+			0xB4, 0x43, 0xA9, 0xEA, 0xAA, 0x7C, 0x91, 0xDE,
+			0x36, 0xD1, 0x18, 0xC3, 0x9D, 0x38, 0x9F, 0x8D,
+			0xD4, 0x46, 0x9A, 0x84, 0x6C, 0x9A, 0x26, 0x2B,
+			0xF7, 0xFA, 0x18, 0x48, 0x7A, 0x79, 0xE8, 0xDE,
+			0x11, 0x69, 0x9E, 0x0B, 0x8F, 0xDF, 0x55, 0x7C,
+			0xB4, 0x87, 0x19, 0xD4, 0x53, 0xBA, 0x71, 0x30,
+			0x56, 0x10, 0x9B, 0x93, 0xA2, 0x18, 0xC8, 0x96,
+			0x75, 0xAC, 0x19, 0x5F, 0xB4, 0xFB, 0x06, 0x63,
+			0x9B, 0x37, 0x97, 0x14, 0x49, 0x55, 0xB3, 0xC9,
+			0x32, 0x7D, 0x1A, 0xEC, 0x00, 0x3D, 0x42, 0xEC,
+			0xD0, 0xEA, 0x98, 0xAB, 0xF1, 0x9F, 0xFB, 0x4A,
+			0xF3, 0x56, 0x1A, 0x67, 0xE7, 0x7C, 0x35, 0xBF,
+			0x15, 0xC5, 0x9C, 0x24, 0x12, 0xDA, 0x88, 0x1D,
+			0xB0, 0x2B, 0x1B, 0xFB, 0xCE, 0xBF, 0xAC, 0x51,
+			0x52, 0xBC, 0x99, 0xBC, 0x3F, 0x1D, 0x15, 0xF7,
+			0x71, 0x00, 0x1B, 0x70, 0x29, 0xFE, 0xDB, 0x02,
+			0x8F, 0x8B, 0x85, 0x2B, 0xC4, 0x40, 0x7E, 0xB8,
+			0x3F, 0x89, 0x1C, 0x9C, 0xA7, 0x33, 0x25, 0x4F,
+			0xDD, 0x1E, 0x9E, 0xDB, 0x56, 0x91, 0x9C, 0xE9,
+			0xFE, 0xA2, 0x1C, 0x17, 0x40, 0x72, 0x52, 0x1C,
+			0x18, 0x31, 0x9A, 0x54, 0xB5, 0xD4, 0xEF, 0xBE,
+			0xBD, 0xDF, 0x1D, 0x8B, 0x69, 0xB1, 0xCB, 0xF2,
+			0x5F, 0x48, 0x9F, 0xCC, 0x98, 0x13, 0x72, 0x54,
+			0x7C, 0xF4, 0x1D, 0x00, 0x8E, 0xF0, 0xBC, 0xA1,
+			0x92, 0x6F, 0x93, 0x4B, 0x73, 0x5E, 0x09, 0x0B,
+			0x3B, 0x25, 0x1E, 0xB3, 0x3A, 0x36, 0xF8, 0x2E,
+			0xD9, 0xB2, 0x9C, 0xF4, 0xCB, 0x94, 0x41, 0x88,
+			0xFA, 0x0E, 0x1E, 0x38, 0xDD, 0x77, 0x8F, 0x7D,
+			0x1C, 0x9D, 0x98, 0x7B, 0x28, 0xD1, 0x32, 0xDF,
+			0xB9, 0x73, 0x1F, 0xA4, 0xF4, 0xB4, 0x16, 0x93,
+			0x5B, 0xE4, 0x9D, 0xE3, 0x05, 0x16, 0xAF, 0x35,
+			0x78, 0x58, 0x1F, 0x2F, 0x13, 0xF5, 0x61, 0xC0,
+			0x66, 0x33, 0x61, 0x94, 0x1E, 0xAB, 0x24, 0x9A,
+			0x4B, 0xC1, 0x23, 0xF8, 0xD1, 0x5C, 0xD7, 0x11,
+			0xA9, 0x56, 0xA1, 0xBF, 0x20, 0xFE, 0x6E, 0xB7,
+			0x8A, 0xEA, 0x23, 0x73, 0x36, 0x1D, 0xA0, 0x42,
+			0x6C, 0x79, 0xA5, 0x30, 0xC3, 0xBB, 0x1D, 0xE0,
+			0xC9, 0x97, 0x22, 0xEF, 0x1F, 0xDE, 0x39, 0xAC,
+			0x2B, 0x00, 0xA0, 0xA8, 0xEE, 0x7C, 0x80, 0x0A,
+			0x08, 0xBC, 0x22, 0x64, 0xF8, 0x9F, 0x4E, 0xFF,
+			0xE6, 0x27, 0xAC, 0x2F, 0x05, 0x31, 0xFB, 0x55,
+			0x4F, 0x6D, 0x21, 0xD7, 0x4C, 0x59, 0x0A, 0x70,
+			0xAD, 0xFA, 0xA3, 0x90, 0xBD, 0xFB, 0xB3, 0xD6,
+			0x8E, 0x46, 0x21, 0x5C, 0xAB, 0x18, 0x7D, 0x23,
+			0x68, 0xD5, 0xA7, 0x1F, 0x5E, 0xBE, 0xC0, 0x81,
+			0xCD, 0x3B, 0x20, 0xC0, 0x82, 0xDB, 0xE4, 0xCD,
+			0x2F, 0xAC, 0xA2, 0x87, 0x73, 0x79, 0x5D, 0x6B,
+			0x0C, 0x10, 0x20, 0x4B, 0x65, 0x9A, 0x93, 0x9E,
+			0xF2, 0x9B, 0xBE, 0x10, 0x88, 0x24, 0x36, 0x24,
+			0x42, 0x99, 0x27, 0xA7, 0xEB, 0x57, 0x6D, 0xD3,
+			0xA0, 0x0E, 0xA5, 0xE0, 0x1A, 0xF5, 0xD4, 0x75,
+			0x83, 0xB2, 0x27, 0x2C, 0x0C, 0x16, 0x1A, 0x80,
+			0x65, 0x21, 0xA1, 0x6F, 0xF9, 0xB0, 0xA7, 0x22,
+			0xC0, 0xCF, 0x26, 0xB0, 0x25, 0xD5, 0x83, 0x6E,
+			0x22, 0x58, 0xA4, 0xF7, 0xD4, 0x77, 0x3A, 0xC8,
+			0x01, 0xE4, 0x26, 0x3B, 0xC2, 0x94, 0xF4, 0x3D,
+			0xEF, 0x7F, 0xA8, 0x70, 0x3F, 0x3A, 0x41, 0x97,
+			0x46, 0x35, 0x25, 0x88, 0x76, 0x52, 0xB0, 0xB2,
+			0xA4, 0xA2, 0xA7, 0xCF, 0x87, 0xF0, 0x09, 0x14,
+			0x87, 0x1E, 0x25, 0x03, 0x91, 0x13, 0xC7, 0xE1,
+			0x61, 0x8D, 0xA3, 0x40, 0x64, 0xB5, 0x7A, 0x43,
+			0xC4, 0x63, 0x24, 0x9F, 0xB8, 0xD0, 0x5E, 0x0F,
+			0x26, 0xF4, 0xA6, 0xD8, 0x49, 0x72, 0xE7, 0xA9,
+			0x05, 0x48, 0x24, 0x14, 0x5F, 0x91, 0x29, 0x5C,
+			0xDB, 0xE3, 0x9A, 0x6F, 0x92, 0x0F, 0xAC, 0xC6,
+			0x59, 0x71, 0x2B, 0x46, 0xA5, 0x4B, 0xA2, 0x95,
+			0xBB, 0xE6, 0xA9, 0x01, 0x54, 0xE9, 0x1B, 0x33,
+			0x98, 0x5A, 0x2B, 0xCD, 0x42, 0x0A, 0xD5, 0xC6,
+			0x7E, 0xC9, 0xAD, 0x8E, 0xB7, 0xAC, 0x68, 0x64,
+			0xDB, 0x27, 0x2A, 0x51, 0x6B, 0xC9, 0x4C, 0x28,
+			0x39, 0xB0, 0xA8, 0x16, 0x9A, 0x6B, 0xF5, 0x8E,
+			0x1A, 0x0C, 0x2A, 0xDA, 0x8C, 0x88, 0x3B, 0x7B,
+			0xF4, 0x97, 0xA4, 0x91, 0x71, 0x26, 0x8E, 0xD1,
+			0x5D, 0xDD, 0x29, 0x69, 0x38, 0x4E, 0x7F, 0xF4,
+			0xBF, 0x4A, 0xAB, 0x2E, 0xC9, 0xEC, 0xC6, 0x52,
+			0x9C, 0xF6, 0x29, 0xE2, 0xDF, 0x0F, 0x08, 0xA7,
+			0x7A, 0x65, 0xAF, 0xA1, 0x2A, 0xA9, 0xB5, 0x05,
+			0xDF, 0x8B, 0x28, 0x7E, 0xF6, 0xCC, 0x91, 0x49,
+			0x3D, 0x1C, 0xAA, 0x39, 0x07, 0x6E, 0x28, 0xEF,
+			0x1E, 0xA0, 0x28, 0xF5, 0x11, 0x8D, 0xE6, 0x1A,
+			0xE0, 0x2B, 0xB6, 0xAE, 0xFC, 0x33, 0x43, 0xA0,
+			0x50, 0x29, 0x2F, 0x19, 0x9F, 0x40, 0x18, 0x57,
+			0xB2, 0xBE, 0xAD, 0x5E, 0x6E, 0xE2, 0xA1, 0xF1,
+			0x91, 0x02, 0x2F, 0x92, 0x78, 0x01, 0x6F, 0x04,
+			0x77, 0x91, 0xA9, 0xD1, 0x8D, 0xA7, 0xD2, 0xA6,
+			0xD2, 0x7F, 0x2E, 0x0E, 0x51, 0xC2, 0xF6, 0xEA,
+			0x30, 0xE8, 0xAC, 0x49, 0xA0, 0x60, 0x4F, 0x4C,
+			0x13, 0x54, 0x2E, 0x85, 0xB6, 0x83, 0x81, 0xB9,
+			0xFD, 0xCF, 0xA0, 0xCE, 0x4B, 0x2D, 0x34, 0x13,
+			0x54, 0x85, 0x2D, 0x36, 0x02, 0x45, 0xC5, 0x36,
+			0xB6, 0x12, 0xAF, 0x71, 0xF3, 0xE7, 0x7C, 0x90,
+			0x95, 0xAE, 0x2D, 0xBD, 0xE5, 0x04, 0xB2, 0x65,
+			0x73, 0x3D, 0xAB, 0xFE, 0x10, 0xA2, 0x0F, 0xC7,
+			0xD6, 0xD3, 0x2C, 0x21, 0xCC, 0xC7, 0x2B, 0x8B,
+			0x34, 0x44, 0xAE, 0x66, 0x3D, 0x65, 0x92, 0x2D,
+			0x17, 0xF8, 0x2C, 0xAA, 0x2B, 0x86, 0x5C, 0xD8,
+			0x89, 0x13, 0xD2, 0x91, 0xA6, 0x58, 0x99, 0x02,
+			0x6E, 0xA1, 0x32, 0x84, 0x39, 0x72, 0x3C, 0x19,
+			0x8C, 0x36, 0xB0, 0xC3, 0xC8, 0xD0, 0x85, 0xBF,
+			0xAF, 0x8A, 0x32, 0x0F, 0xDE, 0x33, 0x4B, 0x4A,
+			0x49, 0x19, 0xB4, 0x4C, 0x2B, 0x95, 0xF6, 0xE8,
+			0xEC, 0xF7, 0x33, 0x93, 0xF7, 0xF0, 0xD2, 0xA4,
+			0x0E, 0x60, 0xB1, 0xD4, 0x06, 0x52, 0x6B, 0x02,
+			0x2D, 0xDC, 0x33, 0x18, 0x10, 0xB1, 0xA5, 0xF7,
+			0xC3, 0x47, 0xBD, 0x53, 0xED, 0x1F, 0x10, 0x5D,
+			0x6A, 0x0D, 0x30, 0xAB, 0xA4, 0x77, 0xE1, 0x78,
+			0x88, 0x9A, 0xB2, 0xEC, 0x55, 0xD5, 0x58, 0xDE,
+			0xAB, 0x26, 0x30, 0x20, 0x43, 0x36, 0x96, 0x2B,
+			0x4D, 0xB5, 0xB6, 0x63, 0xB6, 0x90, 0x2B, 0x89,
+			0xE8, 0x5B, 0x31, 0xBC, 0x6A, 0xF5, 0x0F, 0xC5,
+			0x0A, 0xCC, 0xB3, 0xFB, 0x9B, 0x57, 0xB6, 0x63,
+			0x29, 0x70, 0x31, 0x37, 0x8D, 0xB4, 0x78, 0x96,
+			0xD7, 0xFB, 0xAF, 0x6C, 0x60, 0x0A, 0xDD, 0x2C,
+			0x67, 0xF9, 0x36, 0xDB, 0x03, 0x79, 0x86, 0xDB,
+			0x85, 0x6E, 0xB4, 0x9C, 0xF2, 0xDB, 0x3F, 0x7D,
+			0xA6, 0xD2, 0x36, 0x50, 0xE4, 0x38, 0xF1, 0x88,
+			0x40, 0x41, 0xB0, 0x13, 0x11, 0x9E, 0x4C, 0x2A,
+			0xE5, 0xAF, 0x37, 0xCC, 0xCD, 0xFB, 0x68, 0x66,
+			0x07, 0x38, 0xB5, 0x8B, 0x3C, 0x59, 0xD1, 0xC0,
+			0x24, 0x84, 0x37, 0x47, 0x2A, 0xBA, 0x1F, 0x35,
+			0xCA, 0x1F, 0xB9, 0x0C, 0xD7, 0x14, 0xAA, 0x9F,
+			0x63, 0x55, 0x34, 0xF4, 0x9E, 0x7C, 0x5B, 0xBA,
+			0x81, 0xC2, 0xB6, 0xB3, 0x6F, 0xDE, 0xE2, 0x1C,
+			0xA2, 0x7E, 0x34, 0x7F, 0x79, 0x3D, 0x2C, 0xE9,
+			0x44, 0xED, 0xB2, 0x3C, 0x8C, 0x9B, 0x91, 0x4B,
+			0xE1, 0x03, 0x35, 0xE3, 0x50, 0xFE, 0xB5, 0x07,
+			0x03, 0x94, 0xB7, 0xA4, 0xA1, 0x5C, 0x0C, 0xA1,
+			0x20, 0x28, 0x35, 0x68, 0xB7, 0xBF, 0xC2, 0x54,
+			0xFE, 0x83, 0x8B, 0x13, 0x7A, 0x21, 0x47, 0xCE,
+			0x7C, 0x11, 0x3A, 0x3A, 0x4D, 0x65, 0x49, 0x9D,
+			0x9E, 0x86, 0xB8, 0x7D, 0xBC, 0xC7, 0xF0, 0x3B,
+			0xBD, 0x3A, 0x3A, 0xB1, 0xAA, 0x24, 0x3E, 0xCE,
+			0x5B, 0xA9, 0xBC, 0xF2, 0x5F, 0x82, 0x83, 0x6C,
+			0xFE, 0x47, 0x3B, 0x2D, 0x83, 0xE7, 0xA7, 0x20,
+			0x1C, 0xD0, 0xB9, 0x6A, 0x72, 0x45, 0x1E, 0x86,
+			0x3F, 0x6C, 0x3B, 0xA6, 0x64, 0xA6, 0xD0, 0x73,
+			0xD1, 0xF7, 0xB5, 0xED, 0x99, 0x08, 0x65, 0xD9,
+			0x78, 0xBD, 0x38, 0x15, 0xD0, 0x60, 0x94, 0xFC,
+			0x9A, 0x2A, 0xBA, 0x52, 0x21, 0xC2, 0x2D, 0x5A,
+			0xB9, 0x96, 0x38, 0x9E, 0x37, 0x21, 0xE3, 0xAF,
+			0x5F, 0x05, 0xBE, 0xDD, 0xC2, 0x87, 0x5E, 0x0D,
+			0xFA, 0xEB, 0x39, 0x02, 0x1E, 0xE2, 0x7A, 0x41,
+			0x18, 0x7C, 0xBB, 0x45, 0xEF, 0x40, 0xC3, 0xE7,
+			0x3B, 0xC0, 0x39, 0x89, 0xF9, 0xA3, 0x0D, 0x12,
+			0xC5, 0x4B, 0xA7, 0xD2, 0x14, 0x1D, 0xA8, 0xA8,
+			0x75, 0x49, 0x3E, 0x65, 0x77, 0x6E, 0xF3, 0x5F,
+			0x97, 0xDE, 0xBC, 0x22, 0x86, 0xCC, 0x4A, 0xF9,
+			0xB4, 0x62, 0x3E, 0xEE, 0x90, 0x2F, 0x84, 0x0C,
+			0x52, 0xF1, 0xB8, 0xAD, 0x65, 0x89, 0x39, 0xAE,
+			0xF7, 0x1F, 0x3F, 0x72, 0xB9, 0xEC, 0x1D, 0xE2,
+			0x15, 0x88, 0xBD, 0x35, 0x48, 0x4E, 0xA4, 0x44,
+			0x36, 0x34, 0x3F, 0xF9, 0x5E, 0xAD, 0x6A, 0xB1,
+			0xD8, 0xAF, 0xB1, 0xB2, 0xA3, 0x03, 0xDF, 0x1B,
+			0x71, 0xE5, 0x3C, 0x4A, 0xEA, 0x6B, 0x2E, 0x3E,
+			0x93, 0x72, 0xBE, 0x0D, 0x1B, 0xC9, 0x97, 0x98,
+			0xB0, 0xCE, 0x3C, 0xC1, 0x0D, 0x2A, 0x59, 0x6D,
+			0x56, 0x5D, 0xBA, 0x82, 0xF8, 0x8C, 0xE4, 0xCF,
+			0xF3, 0xB3, 0x3D, 0x5D, 0x24, 0xE9, 0xC0, 0x83,
+			0x11, 0x24, 0xBF, 0x1A, 0xD5, 0x4B, 0x79, 0x25,
+			0x32, 0x98, 0x3D, 0xD6, 0xC3, 0xA8, 0xB7, 0xD0
+		},
+	.len = 2056
+	},
+	.digest = {
+		.data = {0x17, 0x9F, 0x2F, 0xA6},
+		.len  = 4
+	}
+};
+
+#endif /* TEST_CRYPTODEV_SNOW3G_HASH_TEST_VECTORS_H_ */
diff --git a/app/test/test_cryptodev_snow3g_test_vectors.h b/app/test/test_cryptodev_snow3g_test_vectors.h
new file mode 100644
index 0000000..403406d
--- /dev/null
+++ b/app/test/test_cryptodev_snow3g_test_vectors.h
@@ -0,0 +1,379 @@
+/*-
+ *   BSD LICENSE
+ *
+ *   Copyright(c) 2015 Intel Corporation. All rights reserved.
+ *
+ *   Redistribution and use in source and binary forms, with or without
+ *   modification, are permitted provided that the following conditions
+ *   are met:
+ *
+ *   * Redistributions of source code must retain the above copyright
+ *     notice, this list of conditions and the following disclaimer.
+ *   * Redistributions in binary form must reproduce the above copyright
+ *     notice, this list of conditions and the following disclaimer in
+ *     the documentation and/or other materials provided with the
+ *     distribution.
+ *   * Neither the name of Intel Corporation nor the names of its
+ *     contributors may be used to endorse or promote products derived
+ *     from this software without specific prior written permission.
+ *
+ *   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ *   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ *   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ *   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ *   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ *   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ *   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ *   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ *   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ *   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ *   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#ifndef TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+#define TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_
+
+struct snow3g_test_data {
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} key;
+
+	struct {
+		uint8_t data[64] __rte_aligned(16);
+		unsigned len;
+	} iv;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} plaintext;
+
+	struct {
+		uint8_t data[1024];
+		unsigned len;
+	} ciphertext;
+
+	struct {
+		unsigned len;
+	} validDataLenInBits;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} aad;
+
+	struct {
+		uint8_t data[64];
+		unsigned len;
+	} digest;
+};
+struct snow3g_test_data snow3g_test_case_1 = {
+	.key = {
+		.data = {
+			0x2B, 0xD6, 0x45, 0x9F, 0x82, 0xC5, 0xB3, 0x00,
+			0x95, 0x2C, 0x49, 0x10, 0x48, 0x81, 0xFF, 0x48
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x7E, 0xC6, 0x12, 0x72, 0x74, 0x3B, 0xF1, 0x61,
+			0x47, 0x26, 0x44, 0x6A, 0x6C, 0x38, 0xCE, 0xD1,
+			0x66, 0xF6, 0xCA, 0x76, 0xEB, 0x54, 0x30, 0x04,
+			0x42, 0x86, 0x34, 0x6C, 0xEF, 0x13, 0x0F, 0x92,
+			0x92, 0x2B, 0x03, 0x45, 0x0D, 0x3A, 0x99, 0x75,
+			0xE5, 0xBD, 0x2E, 0xA0, 0xEB, 0x55, 0xAD, 0x8E,
+			0x1B, 0x19, 0x9E, 0x3E, 0xC4, 0x31, 0x60, 0x20,
+			0xE9, 0xA1, 0xB2, 0x85, 0xE7, 0x62, 0x79, 0x53,
+			0x59, 0xB7, 0xBD, 0xFD, 0x39, 0xBE, 0xF4, 0xB2,
+			0x48, 0x45, 0x83, 0xD5, 0xAF, 0xE0, 0x82, 0xAE,
+			0xE6, 0x38, 0xBF, 0x5F, 0xD5, 0xA6, 0x06, 0x19,
+			0x39, 0x01, 0xA0, 0x8F, 0x4A, 0xB4, 0x1A, 0xAB,
+			0x9B, 0x13, 0x48, 0x80
+		},
+		.len = 100
+	},
+	.ciphertext = {
+		.data = {
+			0x8C, 0xEB, 0xA6, 0x29, 0x43, 0xDC, 0xED, 0x3A,
+			0x09, 0x90, 0xB0, 0x6E, 0xA1, 0xB0, 0xA2, 0xC4,
+			0xFB, 0x3C, 0xED, 0xC7, 0x1B, 0x36, 0x9F, 0x42,
+			0xBA, 0x64, 0xC1, 0xEB, 0x66, 0x65, 0xE7, 0x2A,
+			0xA1, 0xC9, 0xBB, 0x0D, 0xEA, 0xA2, 0x0F, 0xE8,
+			0x60, 0x58, 0xB8, 0xBA, 0xEE, 0x2C, 0x2E, 0x7F,
+			0x0B, 0xEC, 0xCE, 0x48, 0xB5, 0x29, 0x32, 0xA5,
+			0x3C, 0x9D, 0x5F, 0x93, 0x1A, 0x3A, 0x7C, 0x53,
+			0x22, 0x59, 0xAF, 0x43, 0x25, 0xE2, 0xA6, 0x5E,
+			0x30, 0x84, 0xAD, 0x5F, 0x6A, 0x51, 0x3B, 0x7B,
+			0xDD, 0xC1, 0xB6, 0x5F, 0x0A, 0xA0, 0xD9, 0x7A,
+			0x05, 0x3D, 0xB5, 0x5A, 0x88, 0xC4, 0xC4, 0xF9,
+			0x60, 0x5E, 0x41, 0x40
+		},
+		.len = 100
+	},
+	.validDataLenInBits = {
+		.len = 798
+	},
+	.aad = {
+		.data = {
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00,
+			 0x72, 0xA4, 0xF2, 0x0F, 0x64, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_2 = {
+	.key = {
+		.data = {
+			0xEF, 0xA8, 0xB2, 0x22, 0x9E, 0x72, 0x0C, 0x2A,
+			0x7C, 0x36, 0xEA, 0x55, 0xE9, 0x60, 0x56, 0x95
+		},
+		.len = 16
+	},
+	.iv = {
+	       .data = {
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+	       .len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x10, 0x11, 0x12, 0x31, 0xE0, 0x60, 0x25, 0x3A,
+			0x43, 0xFD, 0x3F, 0x57, 0xE3, 0x76, 0x07, 0xAB,
+			0x28, 0x27, 0xB5, 0x99, 0xB6, 0xB1, 0xBB, 0xDA,
+			0x37, 0xA8, 0xAB, 0xCC, 0x5A, 0x8C, 0x55, 0x0D,
+			0x1B, 0xFB, 0x2F, 0x49, 0x46, 0x24, 0xFB, 0x50,
+			0x36, 0x7F, 0xA3, 0x6C, 0xE3, 0xBC, 0x68, 0xF1,
+			0x1C, 0xF9, 0x3B, 0x15, 0x10, 0x37, 0x6B, 0x02,
+			0x13, 0x0F, 0x81, 0x2A, 0x9F, 0xA1, 0x69, 0xD8
+		},
+		.len = 64
+	},
+	.ciphertext = {
+		.data = {
+				0xE0, 0xDA, 0x15, 0xCA, 0x8E, 0x25, 0x54, 0xF5,
+				0xE5, 0x6C, 0x94, 0x68, 0xDC, 0x6C, 0x7C, 0x12,
+				0x9C, 0x56, 0x8A, 0xA5, 0x03, 0x23, 0x17, 0xE0,
+				0x4E, 0x07, 0x29, 0x64, 0x6C, 0xAB, 0xEF, 0xA6,
+				0x89, 0x86, 0x4C, 0x41, 0x0F, 0x24, 0xF9, 0x19,
+				0xE6, 0x1E, 0x3D, 0xFD, 0xFA, 0xD7, 0x7E, 0x56,
+				0x0D, 0xB0, 0xA9, 0xCD, 0x36, 0xC3, 0x4A, 0xE4,
+				0x18, 0x14, 0x90, 0xB2, 0x9F, 0x5F, 0xA2, 0xFC
+		},
+		.len = 64
+	},
+	.validDataLenInBits = {
+		.len = 510
+	},
+	.aad = {
+		.data = {
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00,
+			 0xE2, 0x8B, 0xCF, 0x7B, 0xC0, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_3 = {
+	.key = {
+		.data = {
+			 0x5A, 0xCB, 0x1D, 0x64, 0x4C, 0x0D, 0x51, 0x20,
+			 0x4E, 0xA5, 0xF1, 0x45, 0x10, 0x10, 0xD8, 0x52
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xAD, 0x9C, 0x44, 0x1F, 0x89, 0x0B, 0x38, 0xC4,
+			0x57, 0xA4, 0x9D, 0x42, 0x14, 0x07, 0xE8
+		},
+		.len = 15
+	},
+	.ciphertext = {
+		.data = {
+			0xBA, 0x0F, 0x31, 0x30, 0x03, 0x34, 0xC5, 0x6B,
+			0x52, 0xA7, 0x49, 0x7C, 0xBA, 0xC0, 0x46
+		},
+		.len = 15
+	},
+	.validDataLenInBits = {
+		.len = 120
+	},
+	.aad = {
+		.data = {
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00,
+			0xFA, 0x55, 0x6B, 0x26, 0x1C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.digest = {
+		.data = {0xE8, 0x60, 0x5A, 0x3E},
+		.len  = 4
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_4 = {
+	.key = {
+		.data = {
+			0xD3, 0xC5, 0xD5, 0x92, 0x32, 0x7F, 0xB1, 0x1C,
+			0x40, 0x35, 0xC6, 0x68, 0x0A, 0xF8, 0xC6, 0xD1
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00,
+			0x39, 0x8A, 0x59, 0xB4, 0x2C, 0x00, 0x00, 0x00
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB, 0x1A,
+			0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D, 0x80,
+			0x8C, 0xE3, 0x3E, 0x2C, 0xC3, 0xC0, 0xB5, 0xFC,
+			0x1F, 0x3D, 0xE8, 0xA6, 0xDC, 0x66, 0xB1, 0xF0
+		},
+		.len = 32
+	},
+	.ciphertext = {
+		.data = {
+			0x98, 0x9B, 0x71, 0x9C, 0xDC, 0x33, 0xCE, 0xB7,
+			0xCF, 0x27, 0x6A, 0x52, 0x82, 0x7C, 0xEF, 0x94,
+			0xA5, 0x6C, 0x40, 0xC0, 0xAB, 0x9D, 0x81, 0xF7,
+			0xA2, 0xA9, 0xBA, 0xC6, 0x0E, 0x11, 0xC4, 0xB0
+		},
+		.len = 32
+	},
+	.validDataLenInBits = {
+		.len = 253
+	}
+};
+
+struct snow3g_test_data snow3g_test_case_5 = {
+	.key = {
+		.data = {
+			0x60, 0x90, 0xEA, 0xE0, 0x4C, 0x83, 0x70, 0x6E,
+			0xEC, 0xBF, 0x65, 0x2B, 0xE8, 0xE3, 0x65, 0x66
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00,
+			0x72, 0xA4, 0xF2, 0x0F, 0x48, 0x00, 0x00, 0x00
+		},
+		.len = 16},
+	.plaintext = {
+		.data = {
+			0x40, 0x98, 0x1B, 0xA6, 0x82, 0x4C, 0x1B, 0xFB,
+			0x42, 0x86, 0xB2, 0x99, 0x78, 0x3D, 0xAF, 0x44,
+			0x2C, 0x09, 0x9F, 0x7A, 0xB0, 0xF5, 0x8D, 0x5C,
+			0x8E, 0x46, 0xB1, 0x04, 0xF0, 0x8F, 0x01, 0xB4,
+			0x1A, 0xB4, 0x85, 0x47, 0x20, 0x29, 0xB7, 0x1D,
+			0x36, 0xBD, 0x1A, 0x3D, 0x90, 0xDC, 0x3A, 0x41,
+			0xB4, 0x6D, 0x51, 0x67, 0x2A, 0xC4, 0xC9, 0x66,
+			0x3A, 0x2B, 0xE0, 0x63, 0xDA, 0x4B, 0xC8, 0xD2,
+			0x80, 0x8C, 0xE3, 0x3E, 0x2C, 0xCC, 0xBF, 0xC6,
+			0x34, 0xE1, 0xB2, 0x59, 0x06, 0x08, 0x76, 0xA0,
+			0xFB, 0xB5, 0xA4, 0x37, 0xEB, 0xCC, 0x8D, 0x31,
+			0xC1, 0x9E, 0x44, 0x54, 0x31, 0x87, 0x45, 0xE3,
+			0x98, 0x76, 0x45, 0x98, 0x7A, 0x98, 0x6F, 0x2C,
+			0xB0
+		},
+		.len = 105
+	},
+	.ciphertext = {
+		.data = {
+			0x58, 0x92, 0xBB, 0xA8, 0x8B, 0xBB, 0xCA, 0xAE,
+			0xAE, 0x76, 0x9A, 0xA0, 0x6B, 0x68, 0x3D, 0x3A,
+			0x17, 0xCC, 0x04, 0xA3, 0x69, 0x88, 0x16, 0x97,
+			0x43, 0x5E, 0x44, 0xFE, 0xD5, 0xFF, 0x9A, 0xF5,
+			0x7B, 0x9E, 0x89, 0x0D, 0x4D, 0x5C, 0x64, 0x70,
+			0x98, 0x85, 0xD4, 0x8A, 0xE4, 0x06, 0x90, 0xEC,
+			0x04, 0x3B, 0xAA, 0xE9, 0x70, 0x57, 0x96, 0xE4,
+			0xA9, 0xFF, 0x5A, 0x4B, 0x8D, 0x8B, 0x36, 0xD7,
+			0xF3, 0xFE, 0x57, 0xCC, 0x6C, 0xFD, 0x6C, 0xD0,
+			0x05, 0xCD, 0x38, 0x52, 0xA8, 0x5E, 0x94, 0xCE,
+			0x6B, 0xCD, 0x90, 0xD0, 0xD0, 0x78, 0x39, 0xCE,
+			0x09, 0x73, 0x35, 0x44, 0xCA, 0x8E, 0x35, 0x08,
+			0x43, 0x24, 0x85, 0x50, 0x92, 0x2A, 0xC1, 0x28,
+			0x18
+		},
+		.len = 105
+	},
+	.validDataLenInBits = {
+		.len = 837
+	}
+};
+struct snow3g_test_data snow3g_test_case_6 = {
+	.key = {
+		.data = {
+			0xC7, 0x36, 0xC6, 0xAA, 0xB2, 0x2B, 0xFF, 0xF9,
+			0x1E, 0x26, 0x98, 0xD2, 0xE2, 0x2A, 0xD5, 0x7E
+		},
+		.len = 16
+	},
+	.iv = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.aad = {
+		.data = {
+			0x14, 0x79, 0x3E, 0x41, 0x03, 0x97, 0xE8, 0xFD,
+			0x94, 0x79, 0x3E, 0x41, 0x03, 0x97, 0x68, 0xFD
+		},
+		.len = 16
+	},
+	.plaintext = {
+		.data = {
+			0xD0, 0xA7, 0xD4, 0x63, 0xDF, 0x9F, 0xB2, 0xB2,
+			0x78, 0x83, 0x3F, 0xA0, 0x2E, 0x23, 0x5A, 0xA1,
+			0x72, 0xBD, 0x97, 0x0C, 0x14, 0x73, 0xE1, 0x29,
+			0x07, 0xFB, 0x64, 0x8B, 0x65, 0x99, 0xAA, 0xA0,
+			0xB2, 0x4A, 0x03, 0x86, 0x65, 0x42, 0x2B, 0x20,
+			0xA4, 0x99, 0x27, 0x6A, 0x50, 0x42, 0x70, 0x09
+		},
+		.len = 48
+	},
+	.ciphertext = {
+	   .data = {
+			0x95, 0x2E, 0x5A, 0xE1, 0x50, 0xB8, 0x59, 0x2A,
+			0x9B, 0xA0, 0x38, 0xA9, 0x8E, 0x2F, 0xED, 0xAB,
+			0xFD, 0xC8, 0x3B, 0x47, 0x46, 0x0B, 0x50, 0x16,
+			0xEC, 0x88, 0x45, 0xB6, 0x05, 0xC7, 0x54, 0xF8,
+			0xBD, 0x91, 0xAA, 0xB6, 0xA4, 0xDC, 0x64, 0xB4,
+			0xCB, 0xEB, 0x97, 0x06, 0x4C, 0xF7, 0x02, 0x3D
+		},
+		.len = 48
+	},
+	.digest = {
+		.data = {0x38, 0xB5, 0x54, 0xC0 },
+		.len  = 4
+	},
+	.validDataLenInBits = {
+		.len = 384
+	}
+};
+
+#endif /* TEST_CRYPTODEV_SNOW3G_TEST_VECTORS_H_ */
-- 
2.1.0
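
As an aside to the vectors above (this sketch is not part of the patch):
each snow3g_test_data entry stores its plaintext and ciphertext buffers in
whole bytes, while validDataLenInBits records the bit-exact length that
SNOW 3G operates on. A minimal sketch of how a test might reconcile the
two is shown below; the helper names are hypothetical and do not appear in
the submitted code.

/* Illustrative only (not from the patch): relate the bit length in a
 * snow3g_test_data vector to the byte buffers it carries. */
#include <assert.h>

static unsigned
snow3g_bits_to_bytes(unsigned len_in_bits)
{
	return (len_in_bits + 7) / 8;	/* round up to whole bytes */
}

static void
sanity_check_vector(const struct snow3g_test_data *tdata)
{
	unsigned nbytes = snow3g_bits_to_bytes(tdata->validDataLenInBits.len);

	/* For the vectors defined above, the byte buffers exactly cover
	 * the valid bits, and plaintext and ciphertext lengths match. */
	assert(nbytes == tdata->plaintext.len);
	assert(tdata->plaintext.len == tdata->ciphertext.len);
}

For example, snow3g_test_case_1 declares 798 valid bits, which rounds up
to the 100 bytes held in both its plaintext and ciphertext arrays.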

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices
  2016-03-10 16:25     ` De Lara Guarch, Pablo
@ 2016-03-10 20:37       ` Thomas Monjalon
  0 siblings, 0 replies; 22+ messages in thread
From: Thomas Monjalon @ 2016-03-10 20:37 UTC (permalink / raw)
  To: Jain, Deepak K; +Cc: dev

> > Deepak Kumar JAIN (3):
> >   crypto: add cipher/auth only support
> >   qat: add support for Snow3G
> >   app/test: add Snow3G tests
> 
> Series-acked-by: Pablo de Lara <pablo.de.lara.guarch@intel.com>

Applied, thanks

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices
  2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
                       ` (3 preceding siblings ...)
  2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
@ 2016-03-16  3:27     ` Cao, Min
  4 siblings, 0 replies; 22+ messages in thread
From: Cao, Min @ 2016-03-16  3:27 UTC (permalink / raw)
  To: Jain, Deepak K, dev

Tested-by: Min Cao <min.cao@intel.com>

- Tested Commit: 1b9cb73ecef109593081ab9efbd9d1429607bb99
- OS: Fedora20 3.11.10-301.fc20.x86_64
- GCC: gcc (GCC) 4.8.3
- CPU: Intel(R) Xeon(R) CPU E5-2658 v3 @ 2.20GHz
- NIC: Niantic
- Default x86_64-native-linuxapp-gcc configuration
- Prerequisites:
- Total 42 cases, 42 passed, 0 failed

- test case 1: QAT Unit test 
    Total 31 cases, 31 passed, 0 failed

- test case 2: AES_NI Unit test 
    Total 10 cases, 10 passed, 0 failed

- test case 3: l2fwd-crypto 
    Total 1 cases, 1 passed, 0 failed

-----Original Message-----
From: dev [mailto:dev-bounces@dpdk.org] On Behalf Of Deepak Kumar JAIN
Sent: Friday, March 11, 2016 1:13 AM
To: dev@dpdk.org
Subject: [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices

 This patchset contains fixes and refactoring for the Snow3G (UEA2 and
 UIA2) wireless algorithms for Intel Quick Assist devices.

 The QAT PMD previously supported only cipher/hash alg-chaining for AES/SHA.
 The code has been refactored to also support cipher-only and hash-only
 (for Snow3G only) functionality along with alg-chaining (a brief
 cipher-only transform sketch follows this message).

 Changes from V3:
 1) Rebased on the patchset mentioned below.
 2) Fixed a test failure that occurs when only patch 1 is applied.
 
 Changes from v2:

 1) Rebased on the patchset mentioned below.

This patchset depends on the cryptodev API changes:
http://dpdk.org/ml/archives/dev/2016-March/035451.html

Deepak Kumar JAIN (3):
  crypto: add cipher/auth only support
  qat: add support for Snow3G
  app/test: add Snow3G tests

 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 doc/guides/cryptodevs/qat.rst                      |    8 +-
 doc/guides/rel_notes/release_16_04.rst             |    6 +
 drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  284 +++++-
 drivers/crypto/qat/qat_crypto.c                    |  149 ++-
 drivers/crypto/qat/qat_crypto.h                    |   10 +
 10 files changed, 2236 insertions(+), 74 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

--
2.1.0
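
To make the "cipher-only" wording in the cover letter concrete, here is a
minimal sketch of a SNOW 3G UEA2 cipher-only transform. It is not taken
from the patches; the structure, enum, and session-creation names are
assumed from the cryptodev API of this timeframe and may differ in later
DPDK releases.

/* Hedged sketch: a single cipher xform with .next == NULL is what
 * "cipher-only" means here -- no auth transform is chained. */
#include <rte_crypto.h>
#include <rte_cryptodev.h>

static struct rte_cryptodev_sym_session *
snow3g_cipher_only_session(uint8_t dev_id, uint8_t *key, unsigned key_len)
{
	struct rte_crypto_sym_xform cipher_xform = {
		.type = RTE_CRYPTO_SYM_XFORM_CIPHER,
		.next = NULL,			/* cipher-only: nothing chained */
		.cipher = {
			.algo = RTE_CRYPTO_CIPHER_SNOW3G_UEA2,
			.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT,
			.key = { .data = key, .length = key_len },
		},
	};

	return rte_cryptodev_sym_session_create(dev_id, &cipher_xform);
}

A hash-only session is the mirror image: a single auth xform, again with
.next set to NULL.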

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist Devices
  2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
                     ` (4 preceding siblings ...)
  2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
@ 2016-03-16  5:05   ` Cao, Min
  5 siblings, 0 replies; 22+ messages in thread
From: Cao, Min @ 2016-03-16  5:05 UTC (permalink / raw)
  To: Jain, Deepak K, dev

Tested-by: Min Cao <min.cao@intel.com>

- Tested Commit: 1b9cb73ecef109593081ab9efbd9d1429607bb99
- OS: Fedora20 3.11.10-301.fc20.x86_64
- GCC: gcc (GCC) 4.8.3
- CPU: Intel(R) Xeon(R) CPU E5-2658 v3 @ 2.20GHz
- NIC: Niantic
- Default x86_64-native-linuxapp-gcc configuration
- Prerequisites:
- Total 42 cases, 42 passed, 0 failed

- test case 1: QAT Unit test 
    Total 31 cases, 31 passed, 0 failed

- test case 2: AES_NI Unit test 
    Total 10 cases, 10 passed, 0 failed

- test case 3: l2fwd-crypto 
    Total 1 cases, 1 passed, 0 failed

-----Original Message-----
From: dev [mailto:dev-bounces@dpdk.org] On Behalf Of Deepak Kumar JAIN
Sent: Thursday, March 03, 2016 9:01 PM
To: dev@dpdk.org
Subject: [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist Devices

 This patchset contains fixes and refactoring for the Snow3G (UEA2 and
 UIA2) wireless algorithms for Intel Quick Assist devices.

 The QAT PMD previously supported only cipher/hash alg-chaining for AES/SHA.
 The code has been refactored to also support cipher-only and hash-only
 (for Snow3G only) functionality along with alg-chaining.

 Changes from v2:

 1) Rebased on the patchset mentioned below.

 This patchset depends on the cryptodev API changes:
 http://dpdk.org/ml/archives/dev/2016-February/034212.html

Deepak Kumar JAIN (3):
  crypto: add cipher/auth only support
  qat: add support for Snow3G
  app/test: add Snow3G tests

 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 doc/guides/cryptodevs/qat.rst                      |    8 +-
 doc/guides/rel_notes/release_16_04.rst             |    6 +
 drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  280 +++++-
 drivers/crypto/qat/qat_crypto.c                    |  149 ++-
 drivers/crypto/qat/qat_crypto.h                    |   10 +
 10 files changed, 2231 insertions(+), 75 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

--
2.1.0

^ permalink raw reply	[flat|nested] 22+ messages in thread

* Re: [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices
  2016-02-23 14:02 ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Deepak Kumar JAIN
                     ` (2 preceding siblings ...)
  2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
@ 2016-03-16  5:15   ` Cao, Min
  3 siblings, 0 replies; 22+ messages in thread
From: Cao, Min @ 2016-03-16  5:15 UTC (permalink / raw)
  To: Jain, Deepak K, dev

Tested-by: Min Cao <min.cao@intel.com>

- Tested Commit: 1b9cb73ecef109593081ab9efbd9d1429607bb99
- OS: Fedora20 3.11.10-301.fc20.x86_64
- GCC: gcc (GCC) 4.8.3
- CPU: Intel(R) Xeon(R) CPU E5-2658 v3 @ 2.20GHz
- NIC: Niantic
- Default x86_64-native-linuxapp-gcc configuration
- Prerequisites:
- Total 42 cases, 42 passed, 0 failed

- test case 1: QAT Unit test 
    Total 31 cases, 31 passed, 0 failed

- test case 2: AES_NI Unit test 
    Total 10 cases, 10 passed, 0 failed

- test case 3: l2fwd-crypto 
    Total 1 cases, 1 passed, 0 failed

-----Original Message-----
From: dev [mailto:dev-bounces@dpdk.org] On Behalf Of Deepak Kumar JAIN
Sent: Tuesday, February 23, 2016 10:03 PM
To: dev@dpdk.org
Subject: [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices

 This patchset contains fixes and refactoring for the Snow3G (UEA2 and
 UIA2) wireless algorithms for Intel Quick Assist devices.

 The QAT PMD previously supported only cipher/hash alg-chaining for AES/SHA.
 The code has been refactored to also support cipher-only and hash-only
 (for Snow3G only) functionality along with alg-chaining.

 Changes from v1: 

 1) Hash-only fix and alg-chaining fix

 2) Added hash test vectors for Snow3G UIA2 functionality (an auth-only
    transform sketch follows this message).

 This patchset depends on the cryptodev API changes:
 http://dpdk.org/ml/archives/dev/2016-February/033551.html

Deepak Kumar JAIN (3):
  crypto: add cipher/auth only support
  qat: add support for Snow3G
  app/test: add Snow3G tests

 app/test/test_cryptodev.c                          | 1037 +++++++++++++++++++-
 app/test/test_cryptodev.h                          |    3 +-
 app/test/test_cryptodev_snow3g_hash_test_vectors.h |  415 ++++++++
 app/test/test_cryptodev_snow3g_test_vectors.h      |  379 +++++++
 doc/guides/cryptodevs/qat.rst                      |    8 +-
 doc/guides/rel_notes/release_16_04.rst             |    4 +
 drivers/crypto/qat/qat_adf/qat_algs.h              |   19 +-
 drivers/crypto/qat/qat_adf/qat_algs_build_desc.c   |  280 +++++-
 drivers/crypto/qat/qat_crypto.c                    |  147 ++-
 drivers/crypto/qat/qat_crypto.h                    |   10 +
 10 files changed, 2228 insertions(+), 74 deletions(-)
 create mode 100644 app/test/test_cryptodev_snow3g_hash_test_vectors.h
 create mode 100644 app/test/test_cryptodev_snow3g_test_vectors.h

--
2.1.0
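
To complement item 2 above (hash test vectors for UIA2), below is a
minimal sketch of the corresponding hash-only transform. As with the
cipher-only sketch earlier in the thread, it is an illustration rather
than code from the patches; field names such as digest_length and
add_auth_data_length are assumed from the cryptodev API of this timeframe.

/* Hedged sketch: hash-only SNOW 3G (UIA2) transform, no cipher chained. */
#include <rte_crypto.h>

static struct rte_crypto_sym_xform
snow3g_auth_only_xform(uint8_t *key, unsigned key_len)
{
	struct rte_crypto_sym_xform auth_xform = {
		.type = RTE_CRYPTO_SYM_XFORM_AUTH,
		.next = NULL,			/* hash-only: no cipher xform */
		.auth = {
			.algo = RTE_CRYPTO_AUTH_SNOW3G_UIA2,
			.op = RTE_CRYPTO_AUTH_OP_GENERATE,
			.key = { .data = key, .length = key_len },
			.digest_length = 4,	/* 32-bit MAC-I, matching the vectors */
			.add_auth_data_length = 16,	/* COUNT/FRESH/DIRECTION block (the .aad data) */
		},
	};

	return auth_xform;
}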

^ permalink raw reply	[flat|nested] 22+ messages in thread

end of thread, other threads:[~2016-03-16  5:15 UTC | newest]

Thread overview: 22+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2016-01-28 17:46 [dpdk-dev] [PATCH 0/3] Snow3G UEA2 support for Intel Quick Assist Devices Deepak Kumar JAIN
2016-01-28 17:46 ` [dpdk-dev] [PATCH 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
2016-01-28 17:46 ` [dpdk-dev] [PATCH 2/3] qat: add Snow3G UEA2 support Deepak Kumar JAIN
2016-01-28 17:46 ` [dpdk-dev] [PATCH 3/3] app/test: add Snow3G UEA2 tests Deepak Kumar JAIN
2016-02-23 14:02 ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Deepak Kumar JAIN
2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 2/3] qat: add support for Snow3G Deepak Kumar JAIN
2016-02-23 14:02   ` [dpdk-dev] [PATCH v2 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
2016-03-16  5:15   ` [dpdk-dev] [PATCH v2 0/3] Snow3G support for Intel Quick Assist Devices Cao, Min
2016-03-03 13:01 ` [dpdk-dev] [PATCH v3 " Deepak Kumar JAIN
2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 2/3] qat: add support for Snow3G Deepak Kumar JAIN
2016-03-03 13:01   ` [dpdk-dev] [PATCH v3 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
2016-03-07 13:55   ` [dpdk-dev] [PATCH v3 0/3] Snow3G support for Intel Quick Assist Devices De Lara Guarch, Pablo
2016-03-10 17:12   ` [dpdk-dev] [PATCH v4 " Deepak Kumar JAIN
2016-03-10 16:25     ` De Lara Guarch, Pablo
2016-03-10 20:37       ` Thomas Monjalon
2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 1/3] crypto: add cipher/auth only support Deepak Kumar JAIN
2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 2/3] qat: add support for Snow3G Deepak Kumar JAIN
2016-03-10 17:12     ` [dpdk-dev] [PATCH v4 3/3] app/test: add Snow3G tests Deepak Kumar JAIN
2016-03-16  3:27     ` [dpdk-dev] [PATCH v4 0/3] Snow3G support for Intel Quick Assist Devices Cao, Min
2016-03-16  5:05   ` [dpdk-dev] [PATCH v3 " Cao, Min
