* [dpdk-dev] [PATCH v2] crypto/aesni_gcm: migration from MB library to ISA-L
From: Michal Jastrzebski @ 2016-12-27 18:53 UTC
To: dev; +Cc: pablo.de.lara.guarch, Piotr Azarewicz

From: Piotr Azarewicz <piotrx.t.azarewicz@intel.com>

The current cryptodev AESNI-GCM PMD is implemented using the AESNI-MB library. This patch reimplements the cryptodev AESNI-GCM PMD on top of the ISA-L Crypto library: https://github.com/01org/isa-l_crypto. The new version adds 256-bit key support and variable-length AAD. Current unit tests were verified and new unit tests were added to cover the new functionality.

v2 changes:
- implement native scatter-gather support for chained mbufs (out-of-place only; the destination mbuf must be contiguous)
- write unit test for session-less mode
- write unit test for out-of-place processing
- add support for the GMAC authentication algorithm

Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com>
---
 app/test/test_cryptodev.c | 739 +++++++++++++++++++---
 app/test/test_cryptodev_gcm_test_vectors.h | 487 +++++++++++++-
 doc/guides/cryptodevs/aesni_gcm.rst | 23 +-
 doc/guides/rel_notes/release_17_02.rst | 13 +
 drivers/crypto/aesni_gcm/Makefile | 8 +-
 drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +--
 drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 307 ++++-----
 drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +-
 drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +-
 mk/rte.app.mk | 3 +-
 10 files changed, 1356 insertions(+), 383 deletions(-)

diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c index 00dced5..0bea584 100644 --- a/app/test/test_cryptodev.c +++ b/app/test/test_cryptodev.c @@ -3931,16 +3931,48 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +create_gcm_xforms(struct rte_crypto_op *op, + enum rte_crypto_cipher_operation cipher_op, + uint8_t *key, const uint8_t key_len, + const uint8_t aad_len, const uint8_t auth_len, + enum rte_crypto_auth_operation auth_op) +{ + TEST_ASSERT_NOT_NULL(rte_crypto_op_sym_xforms_alloc(op, 2), + "failed to allocate space for crypto transforms"); + + struct rte_crypto_sym_op *sym_op = op->sym; + + /* Setup Cipher Parameters */ + sym_op->xform->type = RTE_CRYPTO_SYM_XFORM_CIPHER; + sym_op->xform->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM; + sym_op->xform->cipher.op = cipher_op; + sym_op->xform->cipher.key.data = key; + sym_op->xform->cipher.key.length = key_len; + + TEST_HEXDUMP(stdout, "key:", key, key_len); + + /* Setup Authentication Parameters */ + sym_op->xform->next->type = RTE_CRYPTO_SYM_XFORM_AUTH; + sym_op->xform->next->auth.algo = RTE_CRYPTO_AUTH_AES_GCM; + sym_op->xform->next->auth.op = auth_op; + sym_op->xform->next->auth.digest_length = auth_len; + sym_op->xform->next->auth.add_auth_data_length = aad_len; + sym_op->xform->next->auth.key.length = 0; + sym_op->xform->next->auth.key.data = NULL; + sym_op->xform->next->next = NULL; + + return 0; +} + +static int create_gcm_operation(enum rte_crypto_cipher_operation op, - const uint8_t *auth_tag, const unsigned auth_tag_len, - const uint8_t *iv, const unsigned iv_len, - const uint8_t *aad, const unsigned aad_len, - const unsigned data_len, unsigned data_pad_len) + const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; - unsigned iv_pad_len = 0, aad_buffer_len; + uint8_t *plaintext, *ciphertext; + 
unsigned int iv_pad_len, aad_pad_len, plaintext_pad_len; /* Generate Crypto op data structure */ ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool, @@ -3950,63 +3982,118 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct rte_crypto_sym_op *sym_op = ut_params->op->sym; - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->ibuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to append digest"); - sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( - ut_params->ibuf, data_pad_len); - sym_op->auth.digest.length = auth_tag_len; - - if (op == RTE_CRYPTO_CIPHER_OP_DECRYPT) { - rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len); - TEST_HEXDUMP(stdout, "digest:", - sym_op->auth.digest.data, - sym_op->auth.digest.length); - } + /* Append aad data */ + aad_pad_len = RTE_ALIGN_CEIL(tdata->aad.len, 16); + sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + aad_pad_len); + TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, + "no room to append aad"); - /* iv */ - iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16); + sym_op->auth.aad.length = tdata->aad.len; + sym_op->auth.aad.phys_addr = + rte_pktmbuf_mtophys(ut_params->ibuf); + memcpy(sym_op->auth.aad.data, tdata->aad.data, tdata->aad.len); + TEST_HEXDUMP(stdout, "aad:", sym_op->auth.aad.data, + sym_op->auth.aad.length); + /* Prepend iv */ + iv_pad_len = RTE_ALIGN_CEIL(tdata->iv.len, 16); sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend( ut_params->ibuf, iv_pad_len); TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv"); memset(sym_op->cipher.iv.data, 0, iv_pad_len); sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf); - sym_op->cipher.iv.length = iv_len; + sym_op->cipher.iv.length = tdata->iv.len; - rte_memcpy(sym_op->cipher.iv.data, iv, iv_len); + rte_memcpy(sym_op->cipher.iv.data, tdata->iv.data, tdata->iv.len); + TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, + sym_op->cipher.iv.length); - /* - * Always allocate the aad up to the block size. - * The cryptodev API calls out - - * - the array must be big enough to hold the AAD, plus any - * space to round this up to the nearest multiple of the - * block size (16 bytes). 
- */ - aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16); + /* Append plaintext/ciphertext */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, "no room to append plaintext"); - sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend( - ut_params->ibuf, aad_buffer_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, - "no room to prepend aad"); - sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys( - ut_params->ibuf); - sym_op->auth.aad.length = aad_len; + memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); + TEST_HEXDUMP(stdout, "plaintext:", plaintext, + tdata->plaintext.len); - memset(sym_op->auth.aad.data, 0, aad_buffer_len); - rte_memcpy(sym_op->auth.aad.data, aad, aad_len); + if (ut_params->obuf) { + ciphertext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, iv_pad_len); - TEST_HEXDUMP(stdout, "aad:", - sym_op->auth.aad.data, aad_len); + memset(ciphertext + aad_pad_len + iv_pad_len, 0, + tdata->ciphertext.len); + } + } else { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); + ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); + + memcpy(ciphertext, tdata->ciphertext.data, + tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, + tdata->ciphertext.len); + + if (ut_params->obuf) { + plaintext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, + "no room to append plaintext"); + + memset(plaintext + aad_pad_len + iv_pad_len, 0, + tdata->plaintext.len); + } + } + + /* Append digest data */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf ? ut_params->obuf : + ut_params->ibuf, + tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + memset(sym_op->auth.digest.data, 0, tdata->auth_tag.len); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->obuf ? 
ut_params->obuf : + ut_params->ibuf, + plaintext_pad_len + + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; + } else { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->ibuf, tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->ibuf, + plaintext_pad_len + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; + + rte_memcpy(sym_op->auth.digest.data, tdata->auth_tag.data, + tdata->auth_tag.len); + TEST_HEXDUMP(stdout, "digest:", + sym_op->auth.digest.data, + sym_op->auth.digest.length); + } - sym_op->cipher.data.length = data_len; - sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len; + sym_op->cipher.data.length = tdata->plaintext.len; + sym_op->cipher.data.offset = aad_pad_len + iv_pad_len; - sym_op->auth.data.offset = aad_buffer_len + iv_pad_len; - sym_op->auth.data.length = data_len; + sym_op->auth.data.length = tdata->plaintext.len; + sym_op->auth.data.offset = aad_pad_len + iv_pad_len; return 0; } @@ -4018,9 +4105,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext, *auth_tag; + uint8_t *ciphertext, *auth_tag; uint16_t plaintext_pad_len; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4031,31 +4118,20 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GMC_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); /* clear mbuf payload */ memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - /* - * Append data which is padded to a multiple - * of the algorithms block size - */ - plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); - - plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - plaintext_pad_len); - memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); - - TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->plaintext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->plaintext.len, plaintext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); if (retval < 0) return retval; @@ -4070,14 +4146,18 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, "crypto op processing failed"); + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + if (ut_params->op->sym->m_dst) { ciphertext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); auth_tag = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_dst, uint8_t *, plaintext_pad_len); } else { - ciphertext = plaintext; - auth_tag = plaintext + plaintext_pad_len; + ciphertext = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); + 
auth_tag = ciphertext + plaintext_pad_len; } TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); @@ -4143,15 +4223,68 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_encryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_2); +} + +static int test_mb_AES_GCM_authenticated_decryption(const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext; - uint16_t ciphertext_pad_len; + uint8_t *plaintext; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4162,31 +4295,23 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - /* alloc mbuf and set payload */ - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GMC_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); - - ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - ciphertext_pad_len); - memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len); - - TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->ciphertext.len, ciphertext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); if (retval < 0) return retval; - rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); ut_params->op->sym->m_src = ut_params->ibuf; @@ -4202,7 +4327,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) plaintext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t 
*); else - plaintext = ciphertext; + plaintext = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); @@ -4262,6 +4389,358 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_decryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_2); +} + +static int +test_AES_GCM_authenticated_encryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + 
TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_oop(void) +{ + return test_AES_GCM_authenticated_encryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_DECRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_oop(void) +{ + return test_AES_GCM_authenticated_decryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_encryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + uint8_t key[tdata->key.len + 1]; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + 
TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_sessionless(void) +{ + return test_AES_GCM_authenticated_encryption_sessionless( + &gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + uint8_t key[tdata->key.len + 1]; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_DECRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_sessionless(void) +{ + return test_AES_GCM_authenticated_decryption_sessionless( + &gcm_test_case_5); +} + +static int test_stats(void) { struct crypto_testsuite_params *ts_params = &testsuite_params; @@ -6332,6 +6811,82 @@ struct test_crypto_vector { TEST_CASE_ST(ut_setup, 
ut_teardown, test_mb_AES_GCM_authenticated_decryption_test_case_7), + /** AES GCM Authenticated Encryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_7), + + /** AES GCM Authenticated Decryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_7), + + /** AES GCM Authenticated Encryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_2), + + /** AES GCM Authenticated Decryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_2), + + /** AES GMAC Authentication */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_4), + + /** Negative tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_data_corrupt), + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_tag_corrupt), + + /** Out of place tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_oop), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_oop), + + /** Session-less tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_sessionless), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_sessionless), + TEST_CASES_END() /**< NULL terminate unit test array */ } }; diff --git a/app/test/test_cryptodev_gcm_test_vectors.h b/app/test/test_cryptodev_gcm_test_vectors.h index b404242..9a31a38 100644 --- a/app/test/test_cryptodev_gcm_test_vectors.h +++ b/app/test/test_cryptodev_gcm_test_vectors.h @@ -33,7 +33,17 @@ #ifndef TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ #define TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ -#define GMAC_LARGE_PLAINTEXT_LENGTH 65376 +#define 
GMAC_LARGE_PLAINTEXT_LENGTH 65344 +#define GMC_MAX_AAD_LENGTH 65536 +#define GMC_LARGE_AAD_LENGTH 65296 + +static uint8_t gcm_aad_zero_text[GMC_MAX_AAD_LENGTH] = { 0 }; + +static uint8_t gcm_aad_text[GMC_MAX_AAD_LENGTH] = { + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x97, + 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e, 0x1f }; struct gcm_test_data { struct { @@ -47,7 +57,7 @@ struct gcm_test_data { } iv; struct { - uint8_t data[64]; + uint8_t *data; unsigned len; } aad; @@ -111,7 +121,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -148,7 +158,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -186,7 +196,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -238,8 +248,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .data = gcm_aad_zero_text, .len = 8 }, .plaintext = { @@ -294,8 +303,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef }, + .data = gcm_aad_text, .len = 8 }, .plaintext = { @@ -346,15 +354,11 @@ struct gmac_test_data { .iv = { .data = { 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, - 0xde, 0xca, 0xf8, 0x88 - }, + 0xde, 0xca, 0xf8, 0x88 }, .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, - 0x00, 0x00, 0x00, 0x00 - }, + .data = gcm_aad_zero_text, .len = 12 }, .plaintext = { @@ -409,10 +413,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -450,6 +451,450 @@ struct gmac_test_data { } }; +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_1 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { 0x00 }, + .len = 0 + }, + .ciphertext = { + .data = { 0x00 }, + .len = 0 + }, + .auth_tag = { + .data = { + 0x53, 0x0F, 0x8A, 0xFB, 0xC7, 0x45, 0x36, 0xB9, + 0xA9, 0x63, 0xB4, 0xF1, 0xC4, 0xCB, 0x73, 0x8B }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_2 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 16 + }, + .ciphertext = { + .data = { + 0xCE, 0xA7, 0x40, 0x3D, 0x4D, 0x60, 0x6B, 0x6E, + 0x07, 0x4E, 0xC5, 0xD3, 0xBA, 0xF3, 0x9D, 0x18 }, + .len = 16 + }, + .auth_tag = { + .data = { + 
0xD0, 0xD1, 0xC8, 0xA7, 0x99, 0x99, 0x6B, 0xF0, + 0x26, 0x5B, 0x98, 0xB5, 0xD4, 0x8A, 0xB9, 0x19 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_3 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E }, + .len = 64 + }, + .auth_tag = { + .data = { + 0x64, 0xAF, 0x1D, 0xFB, 0xE8, 0x0D, 0x37, 0xD8, + 0x92, 0xC3, 0xB9, 0x1D, 0xD3, 0x08, 0xAB, 0xFC }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_4 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x63, 0x16, 0x91, 0xAE, 0x17, 0x05, 0x5E, 0xA6, + 0x6D, 0x0A, 0x51, 0xE2, 0x50, 0x21, 0x85, 0x4A }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_5 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 
0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0xA7, 0x99, 0xAC, 0xB8, 0x27, 0xDA, 0xB1, 0x82, + 0x79, 0xFD, 0x83, 0x73, 0x52, 0x4D, 0xDB, 0xF1 }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_6 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x5D, 0xA5, 0x0E, 0x53, 0x64, 0x7F, 0x3F, 0xAE, + 0x1A, 0x1F, 0xC0, 0xB0, 0xD8, 0xBE, 0xF2, 0x64 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_7 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + 
}, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x4E, 0xD0, 0x91, 0x95, 0x83, 0xA9, 0x38, 0x72, + 0x09, 0xA9, 0xCE, 0x5F, 0x89, 0x06, 0x4E, 0xC8 }, + .len = 16 + } +}; + +/** variable AAD AES-128 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_1 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08 }, + .len = 16 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GMC_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x42, 0x83, 0x1E, 0xC2, 0x21, 0x77, 0x74, 0x24, + 0x4B, 0x72, 0x21, 0xB7, 0x84, 0xD0, 0xD4, 0x9C, + 0xE3, 0xAA, 0x21, 0x2F, 0x2C, 0x02, 0xA4, 0xE0, + 0x35, 0xC1, 0x7E, 0x23, 0x29, 0xAC, 0xA1, 0x2E, + 0x21, 0xD5, 0x14, 0xB2, 0x54, 0x66, 0x93, 0x1C, + 0x7D, 0x8F, 0x6A, 0x5A, 0xAC, 0x84, 0xAA, 0x05, + 0x1B, 0xA3, 0x0B, 0x39, 0x6A, 0x0A, 0xAC, 0x97, + 0x3D, 0x58, 0xE0, 0x91, 0x47, 0x3F, 0x59, 0x85 + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xCA, 0x70, 0xAF, 0x96, 0xA8, 0x5D, 0x40, 0x47, + 0x0C, 0x3C, 0x48, 0xF5, 0xF0, 0xF5, 0xA5, 0x7D + }, + .len = 16 + } +}; + +/** variable AAD AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_2 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GMC_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xBA, 0x06, 0xDA, 0xA1, 0x91, 
0xE1, 0xFE, 0x22, + 0x59, 0xDA, 0x67, 0xAF, 0x9D, 0xA5, 0x43, 0x94 + }, + .len = 16 + } +}; + /** GMAC Test Vectors */ static uint8_t gmac_plaintext[GMAC_LARGE_PLAINTEXT_LENGTH] = { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, @@ -1228,8 +1673,8 @@ struct cryptodev_perf_test_data { }, .gmac_tag = { .data = { - 0x88, 0x82, 0xb4, 0x93, 0x8f, 0x04, 0xcd, 0x06, - 0xfd, 0xac, 0x6d, 0x8b, 0x9c, 0x9e, 0x8f, 0xec + 0x3f, 0x07, 0xcb, 0xb9, 0x86, 0x3a, 0xea, 0xc2, + 0x2f, 0x3a, 0x2a, 0x93, 0xd8, 0x09, 0x6b, 0xda }, .len = 16 } diff --git a/doc/guides/cryptodevs/aesni_gcm.rst b/doc/guides/cryptodevs/aesni_gcm.rst index 04bf43c..184b71c 100644 --- a/doc/guides/cryptodevs/aesni_gcm.rst +++ b/doc/guides/cryptodevs/aesni_gcm.rst @@ -32,10 +32,8 @@ AES-NI GCM Crypto Poll Mode Driver The AES-NI GCM PMD (**librte_pmd_aesni_gcm**) provides poll mode crypto driver -support for utilizing Intel multi buffer library (see AES-NI Multi-buffer PMD documentation -to learn more about it, including installation). - -The AES-NI GCM PMD has current only been tested on Fedora 21 64-bit with gcc. +support for utilizing Intel ISA-L crypto library, which provides operation acceleration +through the AES-NI instruction sets for AES-GCM authenticated cipher algorithm. Features -------- @@ -49,16 +47,21 @@ Cipher algorithms: Authentication algorithms: * RTE_CRYPTO_AUTH_AES_GCM +* RTE_CRYPTO_AUTH_AES_GMAC + +Installation +------------ + +To build DPDK with the AESNI_GCM_PMD the user is required to install +the ``libisal_crypto`` library in the build environment. +For download and more details please visit `<https://github.com/01org/isa-l_crypto>`_. Initialization -------------- In order to enable this virtual crypto PMD, user must: -* Export the environmental variable AESNI_MULTI_BUFFER_LIB_PATH with the path where - the library was extracted. - -* Build the multi buffer library (go to Installation section in AES-NI MB PMD documentation). +* Install the ISA-L crypto library (explained in Installation section). * Set CONFIG_RTE_LIBRTE_PMD_AESNI_GCM=y in config/common_base. @@ -86,9 +89,7 @@ Example: Limitations ----------- -* Chained mbufs are not supported. +* Chained mbufs are supported but only out-of-place (destination mbuf must be contiguous). * Hash only is not supported. * Cipher only is not supported. -* Only in-place is currently supported (destination address is the same as source address). -* Only supports session-oriented API implementation (session-less APIs are not supported). * Not performance tuned. diff --git a/doc/guides/rel_notes/release_17_02.rst b/doc/guides/rel_notes/release_17_02.rst index 3b65038..a253f0b 100644 --- a/doc/guides/rel_notes/release_17_02.rst +++ b/doc/guides/rel_notes/release_17_02.rst @@ -39,6 +39,19 @@ New Features ========================================================= +* **Updated the AES-NI GCM PMD.** + + The AES-NI GCM PMD was migrated from MB library to ISA-L library. + The migration entailed the following additional support for: + + * GMAC algorithm. + * 256-bit cipher key. + * Session-less mode. 
+ * Out-of place processing + * Scatter-gatter support for chained mbufs (only out-of place and destination + mbuf must be contiguous) + + Resolved Issues --------------- diff --git a/drivers/crypto/aesni_gcm/Makefile b/drivers/crypto/aesni_gcm/Makefile index 5898cae..fb17fbf 100644 --- a/drivers/crypto/aesni_gcm/Makefile +++ b/drivers/crypto/aesni_gcm/Makefile @@ -31,9 +31,6 @@ include $(RTE_SDK)/mk/rte.vars.mk ifneq ($(MAKECMDGOALS),clean) -ifeq ($(AESNI_MULTI_BUFFER_LIB_PATH),) -$(error "Please define AESNI_MULTI_BUFFER_LIB_PATH environment variable") -endif endif # library name @@ -50,10 +47,7 @@ LIBABIVER := 1 EXPORT_MAP := rte_pmd_aesni_gcm_version.map # external library dependencies -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH) -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH)/include -LDLIBS += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -LDLIBS += -lcrypto +LDLIBS += -lisal_crypto # library source files SRCS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += aesni_gcm_pmd.c diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h index c399068..e9de654 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h @@ -37,91 +37,26 @@ #define LINUX #endif -#include <gcm_defines.h> -#include <aux_funcs.h> +#include <isa-l_crypto/aes_gcm.h> -/** Supported vector modes */ -enum aesni_gcm_vector_mode { - RTE_AESNI_GCM_NOT_SUPPORTED = 0, - RTE_AESNI_GCM_SSE, - RTE_AESNI_GCM_AVX, - RTE_AESNI_GCM_AVX2 -}; - -typedef void (*aes_keyexp_128_enc_t)(void *key, void *enc_exp_keys); +typedef void (*aesni_gcm_init_t)(struct gcm_data *my_ctx_data, + uint8_t *iv, + uint8_t const *aad, + uint64_t aad_len); -typedef void (*aesni_gcm_t)(gcm_data *my_ctx_data, u8 *out, const u8 *in, - u64 plaintext_len, u8 *iv, const u8 *aad, u64 aad_len, - u8 *auth_tag, u64 auth_tag_len); +typedef void (*aesni_gcm_update_t)(struct gcm_data *my_ctx_data, + uint8_t *out, + const uint8_t *in, + uint64_t plaintext_len); -typedef void (*aesni_gcm_precomp_t)(gcm_data *my_ctx_data, u8 *hash_subkey); +typedef void (*aesni_gcm_finalize_t)(struct gcm_data *my_ctx_data, + uint8_t *auth_tag, + uint64_t auth_tag_len); -/** GCM library function pointer table */ struct aesni_gcm_ops { - struct { - struct { - aes_keyexp_128_enc_t aes128_enc; - /**< AES128 enc key expansion */ - } keyexp; - /**< Key expansion functions */ - } aux; /**< Auxiliary functions */ - - struct { - aesni_gcm_t enc; /**< GCM encode function pointer */ - aesni_gcm_t dec; /**< GCM decode function pointer */ - aesni_gcm_precomp_t precomp; /**< GCM pre-compute */ - } gcm; /**< GCM functions */ + aesni_gcm_init_t init; + aesni_gcm_update_t update; + aesni_gcm_finalize_t finalize; }; - -static const struct aesni_gcm_ops gcm_ops[] = { - [RTE_AESNI_GCM_NOT_SUPPORTED] = { - .aux = { - .keyexp = { - NULL - } - }, - .gcm = { - NULL - } - }, - [RTE_AESNI_GCM_SSE] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_sse - } - }, - .gcm = { - aesni_gcm_enc_sse, - aesni_gcm_dec_sse, - aesni_gcm_precomp_sse - } - }, - [RTE_AESNI_GCM_AVX] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen2, - aesni_gcm_dec_avx_gen2, - aesni_gcm_precomp_avx_gen2 - } - }, - [RTE_AESNI_GCM_AVX2] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx2, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen4, - aesni_gcm_dec_avx_gen4, - aesni_gcm_precomp_avx_gen4 - } - } -}; - - #endif /* _AESNI_GCM_OPS_H_ */ diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c index 
dba5e15..31d7e35 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c @@ -30,8 +30,6 @@ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ -#include <openssl/aes.h> - #include <rte_common.h> #include <rte_config.h> #include <rte_hexdump.h> @@ -43,6 +41,34 @@ #include "aesni_gcm_pmd_private.h" +/** GCM encode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_enc[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_enc_update, + aesni_gcm128_enc_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_enc_update, + aesni_gcm256_enc_finalize + } +}; + +/** GCM decode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_dec[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_dec_update, + aesni_gcm128_dec_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_dec_update, + aesni_gcm256_dec_finalize + } +}; + /** * Global static parameter used to create a unique name for each AES-NI multi * buffer crypto device. @@ -64,112 +90,68 @@ return 0; } -static int -aesni_gcm_calculate_hash_sub_key(uint8_t *hsubkey, unsigned hsubkey_length, - uint8_t *aeskey, unsigned aeskey_length) -{ - uint8_t key[aeskey_length] __rte_aligned(16); - AES_KEY enc_key; - - if (hsubkey_length % 16 != 0 && aeskey_length % 16 != 0) - return -EFAULT; - - memcpy(key, aeskey, aeskey_length); - - if (AES_set_encrypt_key(key, aeskey_length << 3, &enc_key) != 0) - return -EFAULT; - - AES_encrypt(hsubkey, hsubkey, &enc_key); - - return 0; -} - -/** Get xform chain order */ -static int -aesni_gcm_get_mode(const struct rte_crypto_sym_xform *xform) -{ - /* - * GCM only supports authenticated encryption or authenticated - * decryption, all other options are invalid, so we must have exactly - * 2 xform structs chained together - */ - if (xform->next == NULL || xform->next->next != NULL) - return -1; - - if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && - xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { - return AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; - } - - if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && - xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { - return AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - } - - return -1; -} - /** Parse crypto xform chain and set private session parameters */ int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *gcm_ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform) { - const struct rte_crypto_sym_xform *auth_xform = NULL; - const struct rte_crypto_sym_xform *cipher_xform = NULL; - - uint8_t hsubkey[16] __rte_aligned(16) = { 0 }; + const struct rte_crypto_sym_xform *auth_xform; + const struct rte_crypto_sym_xform *cipher_xform; - /* Select Crypto operation - hash then cipher / cipher then hash */ - switch (aesni_gcm_get_mode(xform)) { - case AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + if (xform->next == NULL || xform->next->next != NULL) { + GCM_LOG_ERR("Two and only two chained xform required"); + return -EINVAL; + } - cipher_xform = xform; + if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && + xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { auth_xform = xform->next; - break; - case AESNI_GCM_OP_AUTHENTICATED_DECRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - + cipher_xform = xform; + } else if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && + 
xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { auth_xform = xform; cipher_xform = xform->next; - break; - default: - GCM_LOG_ERR("Unsupported operation chain order parameter"); + } else { + GCM_LOG_ERR("Cipher and auth xform required"); return -EINVAL; } - /* We only support AES GCM */ - if (cipher_xform->cipher.algo != RTE_CRYPTO_CIPHER_AES_GCM && - auth_xform->auth.algo != RTE_CRYPTO_AUTH_AES_GCM) + if (!(cipher_xform->cipher.algo == RTE_CRYPTO_CIPHER_AES_GCM && + (auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GCM || + auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GMAC))) { + GCM_LOG_ERR("We only support AES GCM and AES GMAC"); return -EINVAL; + } - /* Select cipher direction */ - if (sess->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_ENCRYPT) { - GCM_LOG_ERR("xform chain (CIPHER/AUTH) and cipher operation " - "(DECRYPT) specified are an invalid selection"); - return -EINVAL; - } else if (sess->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_DECRYPT) { - GCM_LOG_ERR("xform chain (AUTH/CIPHER) and cipher operation " - "(ENCRYPT) specified are an invalid selection"); + /* Select Crypto operation */ + if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_ENCRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_GENERATE) + sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + else if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_DECRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_VERIFY) + sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; + else { + GCM_LOG_ERR("Cipher/Auth operations: Encrypt/Generate or" + " Decrypt/Verify are valid only"); return -EINVAL; } - /* Expand GCM AES128 key */ - (*gcm_ops->aux.keyexp.aes128_enc)(cipher_xform->cipher.key.data, - sess->gdata.expanded_keys); + /* Check key length and calculate GCM pre-compute. */ + switch (cipher_xform->cipher.key.length) { + case 16: + aesni_gcm128_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_128; - /* Calculate hash sub key here */ - aesni_gcm_calculate_hash_sub_key(hsubkey, sizeof(hsubkey), - cipher_xform->cipher.key.data, - cipher_xform->cipher.key.length); + break; + case 32: + aesni_gcm256_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_256; - /* Calculate GCM pre-compute */ - (*gcm_ops->gcm.precomp)(&sess->gdata, hsubkey); + break; + default: + GCM_LOG_ERR("Unsupported cipher key length"); + return -EINVAL; + } return 0; } @@ -193,10 +175,10 @@ return sess; sess = (struct aesni_gcm_session *) - ((struct rte_cryptodev_session *)_sess)->_private; + ((struct rte_cryptodev_sym_session *)_sess)->_private; - if (unlikely(aesni_gcm_set_session_parameters(qp->ops, - sess, op->xform) != 0)) { + if (unlikely(aesni_gcm_set_session_parameters(sess, + op->xform) != 0)) { rte_mempool_put(qp->sess_mp, _sess); sess = NULL; } @@ -216,19 +198,45 @@ * */ static int -process_gcm_crypto_op(struct aesni_gcm_qp *qp, struct rte_crypto_sym_op *op, +process_gcm_crypto_op(struct rte_crypto_sym_op *op, struct aesni_gcm_session *session) { uint8_t *src, *dst; - struct rte_mbuf *m = op->m_src; + struct rte_mbuf *m_src = op->m_src; + uint32_t offset = op->cipher.data.offset; + uint32_t part_len, total_len, data_len; + + RTE_ASSERT(m_src != NULL); + + while (offset >= m_src->data_len) { + offset -= m_src->data_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + } + + data_len = m_src->data_len - offset; + part_len = (data_len < op->cipher.data.length) ? 
data_len : + op->cipher.data.length; + + /* Destination buffer is required when segmented source buffer */ + RTE_ASSERT((part_len == op->cipher.data.length) || + ((part_len != op->cipher.data.length) && + (op->m_dst != NULL))); + /* Segmented destination buffer is not supported */ + RTE_ASSERT((op->m_dst == NULL) || + ((op->m_dst != NULL) && + rte_pktmbuf_is_contiguous(op->m_dst))); + - src = rte_pktmbuf_mtod(m, uint8_t *) + op->cipher.data.offset; dst = op->m_dst ? rte_pktmbuf_mtod_offset(op->m_dst, uint8_t *, op->cipher.data.offset) : - rte_pktmbuf_mtod_offset(m, uint8_t *, + rte_pktmbuf_mtod_offset(op->m_src, uint8_t *, op->cipher.data.offset); + src = rte_pktmbuf_mtod_offset(m_src, uint8_t *, offset); + /* sanity checks */ if (op->cipher.iv.length != 16 && op->cipher.iv.length != 12 && op->cipher.iv.length != 0) { @@ -244,48 +252,81 @@ op->cipher.iv.data[15] = 1; } - if (op->auth.aad.length != 12 && op->auth.aad.length != 8 && - op->auth.aad.length != 0) { - GCM_LOG_ERR("iv"); - return -1; - } - if (op->auth.digest.length != 16 && op->auth.digest.length != 12 && - op->auth.digest.length != 8 && - op->auth.digest.length != 0) { - GCM_LOG_ERR("iv"); + op->auth.digest.length != 8) { + GCM_LOG_ERR("digest"); return -1; } if (session->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION) { - (*qp->ops->gcm.enc)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_enc[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_enc[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? + m_src->data_len : total_len; + + aesni_gcm_enc[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_enc[session->key].finalize(&session->gdata, op->auth.digest.data, (uint64_t)op->auth.digest.length); - } else if (session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION) { - uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(m, + } else { /* session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION */ + uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(op->m_dst ? + op->m_dst : op->m_src, op->auth.digest.length); if (!auth_tag) { - GCM_LOG_ERR("iv"); + GCM_LOG_ERR("auth_tag"); return -1; } - (*qp->ops->gcm.dec)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_dec[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_dec[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? 
+ m_src->data_len : total_len; + + aesni_gcm_dec[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_dec[session->key].finalize(&session->gdata, auth_tag, (uint64_t)op->auth.digest.length); - } else { - GCM_LOG_ERR("iv"); - return -1; } return 0; @@ -375,7 +416,7 @@ break; } - retval = process_gcm_crypto_op(qp, ops[i]->sym, sess); + retval = process_gcm_crypto_op(ops[i]->sym, sess); if (retval < 0) { ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; qp->qp_stats.enqueue_err_count++; @@ -413,7 +454,6 @@ struct rte_cryptodev *dev; char crypto_dev_name[RTE_CRYPTODEV_NAME_MAX_LEN]; struct aesni_gcm_private *internals; - enum aesni_gcm_vector_mode vector_mode; /* Check CPU for support for AES instruction set */ if (!rte_cpu_get_flag_enabled(RTE_CPUFLAG_AES)) { @@ -421,18 +461,6 @@ return -EFAULT; } - /* Check CPU for supported vector instruction set */ - if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX2)) - vector_mode = RTE_AESNI_GCM_AVX2; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX)) - vector_mode = RTE_AESNI_GCM_AVX; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_SSE4_1)) - vector_mode = RTE_AESNI_GCM_SSE; - else { - GCM_LOG_ERR("Vector instructions are not supported by CPU"); - return -EFAULT; - } - /* create a unique device name */ if (create_unique_device_name(crypto_dev_name, RTE_CRYPTODEV_NAME_MAX_LEN) != 0) { @@ -459,25 +487,8 @@ RTE_CRYPTODEV_FF_SYM_OPERATION_CHAINING | RTE_CRYPTODEV_FF_CPU_AESNI; - switch (vector_mode) { - case RTE_AESNI_GCM_SSE: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_SSE; - break; - case RTE_AESNI_GCM_AVX: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX; - break; - case RTE_AESNI_GCM_AVX2: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX2; - break; - default: - break; - } - - /* Set vector instructions mode supported */ internals = dev->data->dev_private; - internals->vector_mode = vector_mode; - internals->max_nb_queue_pairs = init_params->max_nb_queue_pairs; internals->max_nb_sessions = init_params->max_nb_sessions; diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c index e824d4b..899ef28 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c @@ -39,17 +39,17 @@ #include "aesni_gcm_pmd_private.h" static const struct rte_cryptodev_capabilities aesni_gcm_pmd_capabilities[] = { - { /* AES GCM (AUTH) */ + { /* AES GMAC (AUTH) */ .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, {.sym = { .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, {.auth = { - .algo = RTE_CRYPTO_AUTH_AES_GCM, + .algo = RTE_CRYPTO_AUTH_AES_GMAC, .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .digest_size = { .min = 8, @@ -57,9 +57,34 @@ .increment = 4 }, .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 + } + }, } + }, } + }, + { /* AES GCM (AUTH) */ + .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, + {.sym = { + .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, + {.auth = { + .algo = RTE_CRYPTO_AUTH_AES_GCM, + .block_size = 16, + .key_size = { + .min = 16, + .max = 32, + .increment = 16 + }, + .digest_size = { .min = 8, - .max = 12, + .max = 16, .increment = 4 + }, + .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 } }, } }, } @@ -73,8 +98,8 @@ .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .iv_size = { .min = 16, @@ -221,7 +246,6 @@ int socket_id) { struct aesni_gcm_qp *qp = NULL; - struct aesni_gcm_private *internals 
= dev->data->dev_private; /* Free memory prior to re-allocation if needed. */ if (dev->data->queue_pairs[qp_id] != NULL) @@ -239,8 +263,6 @@ if (aesni_gcm_pmd_qp_set_unique_name(dev, qp)) goto qp_setup_cleanup; - qp->ops = &gcm_ops[internals->vector_mode]; - qp->processed_pkts = aesni_gcm_pmd_qp_create_processed_pkts_ring(qp, qp_conf->nb_descriptors, socket_id); if (qp->processed_pkts == NULL) @@ -291,18 +313,15 @@ /** Configure a aesni gcm session from a crypto xform chain */ static void * -aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev, +aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev __rte_unused, struct rte_crypto_sym_xform *xform, void *sess) { - struct aesni_gcm_private *internals = dev->data->dev_private; - if (unlikely(sess == NULL)) { GCM_LOG_ERR("invalid session struct"); return NULL; } - if (aesni_gcm_set_session_parameters(&gcm_ops[internals->vector_mode], - sess, xform) != 0) { + if (aesni_gcm_set_session_parameters(sess, xform) != 0) { GCM_LOG_ERR("failed configure session parameters"); return NULL; } diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h index 9878d6e..0496b44 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h @@ -58,8 +58,6 @@ /** private data structure for each virtual AESNI GCM device */ struct aesni_gcm_private { - enum aesni_gcm_vector_mode vector_mode; - /**< Vector mode */ unsigned max_nb_queue_pairs; /**< Max number of queue pairs supported by device */ unsigned max_nb_sessions; @@ -71,8 +69,6 @@ struct aesni_gcm_qp { /**< Queue Pair Identifier */ char name[RTE_CRYPTODEV_NAME_LEN]; /**< Unique Queue Pair Name */ - const struct aesni_gcm_ops *ops; - /**< Architecture dependent function pointer table of the gcm APIs */ struct rte_ring *processed_pkts; /**< Ring for placing process packets */ struct rte_mempool *sess_mp; @@ -87,10 +83,17 @@ enum aesni_gcm_operation { AESNI_GCM_OP_AUTHENTICATED_DECRYPTION }; +enum aesni_gcm_key { + AESNI_GCM_KEY_128, + AESNI_GCM_KEY_256 +}; + /** AESNI GCM private session structure */ struct aesni_gcm_session { enum aesni_gcm_operation op; /**< GCM operation type */ + enum aesni_gcm_key key; + /**< GCM key type */ struct gcm_data gdata __rte_cache_aligned; /**< GCM parameters */ }; @@ -98,7 +101,6 @@ struct aesni_gcm_session { /** * Setup GCM session parameters - * @param ops gcm ops function pointer table * @param sess aesni gcm session structure * @param xform crypto transform chain * @@ -107,8 +109,7 @@ struct aesni_gcm_session { * - On failure returns error code < 0 */ extern int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform); diff --git a/mk/rte.app.mk b/mk/rte.app.mk index f75f0e2..ed3eab5 100644 --- a/mk/rte.app.mk +++ b/mk/rte.app.mk @@ -134,8 +134,7 @@ _LDLIBS-$(CONFIG_RTE_LIBRTE_VMXNET3_PMD) += -lrte_pmd_vmxnet3_uio ifeq ($(CONFIG_RTE_LIBRTE_CRYPTODEV),y) _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -lrte_pmd_aesni_mb _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lcrypto -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB +_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lisal_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_OPENSSL) += -lrte_pmd_openssl 
-lcrypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_NULL_CRYPTO) += -lrte_pmd_null_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_QAT) += -lrte_pmd_qat -lcrypto -- 1.7.9.5
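The heart of the rework above is process_gcm_crypto_op(): the IV and AAD are handed to the ISA-L init() entry point once, each segment of a chained source mbuf is then pushed through the update() call, and the tag is produced by finalize(). Below is a condensed sketch of that per-segment pattern for the 128-bit encrypt case. The function name and parameter list are invented for this illustration; only the aesni_gcm128_* entry points come from the patch. The destination buffer is assumed contiguous (the out-of-place restriction stated above) and the driver's assertions and error handling are omitted.

#include <rte_common.h>
#include <rte_mbuf.h>
#include <isa-l_crypto/aes_gcm.h>

/* Sketch of segment-wise AES-128-GCM encryption over a chained mbuf. */
static void
gcm128_encrypt_chained_sketch(struct gcm_data *gdata,
		struct rte_mbuf *m_src, uint32_t offset, uint32_t cipher_len,
		uint8_t *dst, uint8_t *iv, const uint8_t *aad, uint64_t aad_len,
		uint8_t *tag, uint64_t tag_len)
{
	uint32_t part_len, total_len;
	uint8_t *src;

	/* Skip whole segments until the one containing "offset". */
	while (offset >= m_src->data_len) {
		offset -= m_src->data_len;
		m_src = m_src->next;
	}

	src = rte_pktmbuf_mtod_offset(m_src, uint8_t *, offset);
	part_len = RTE_MIN(m_src->data_len - offset, cipher_len);

	/* IV and AAD are consumed once, up front. */
	aesni_gcm128_init(gdata, iv, aad, aad_len);
	aesni_gcm128_enc_update(gdata, dst, src, (uint64_t)part_len);

	/* Remaining segments are processed from their first byte. */
	total_len = cipher_len - part_len;
	while (total_len) {
		dst += part_len;
		m_src = m_src->next;
		src = rte_pktmbuf_mtod(m_src, uint8_t *);
		part_len = RTE_MIN(m_src->data_len, total_len);
		aesni_gcm128_enc_update(gdata, dst, src, (uint64_t)part_len);
		total_len -= part_len;
	}

	aesni_gcm128_enc_finalize(gdata, tag, tag_len);
}

The decrypt path follows the same shape with the aesni_gcm128_dec_* entry points, writing the computed tag into scratch space appended to the destination mbuf, as the patch does.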
* [dpdk-dev] [PATCH v3] crypto/aesni_gcm: migration from MB library to ISA-L 2016-12-27 18:53 [dpdk-dev] [PATCH v2] crypto/aesni_gcm: migration from MB library to ISA-L Michal Jastrzebski @ 2017-01-03 13:02 ` Piotr Azarewicz 2017-01-03 14:14 ` Thomas Monjalon 2017-01-05 13:51 ` [dpdk-dev] [PATCH v4] " Piotr Azarewicz 0 siblings, 2 replies; 15+ messages in thread From: Piotr Azarewicz @ 2017-01-03 13:02 UTC (permalink / raw) To: pablo.de.lara.guarch, dev Current Cryptodev AES-NI GCM PMD is implemented using Multi Buffer Crypto library.This patch reimplement the device using ISA-L Crypto library: https://github.com/01org/isa-l_crypto. The migration entailed the following additional support for: * GMAC algorithm. * 256-bit cipher key. * Session-less mode. * Out-of place processing * Scatter-gatter support for chained mbufs (only out-of place and destination mbuf must be contiguous) Verified current unit tests and added new unit tests to verify new functionalities. Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> --- To be applied on top of: [dpdk-dev] [PATCH v2 0/3] Fix iv sizes in crypto drivers capabilities v3 changes: - rebase on top of dpdk-next-crypto v2 changes: - implement native scatter-gatter support for chained mbufs (only out-of place and destination mbuf must be contiguous) - write unit test for session-less mode - write unit test for out-of place processing - add support for GMAC authentication algorithm app/test/test_cryptodev.c | 739 +++++++++++++++++++--- app/test/test_cryptodev_gcm_test_vectors.h | 487 +++++++++++++- doc/guides/cryptodevs/aesni_gcm.rst | 23 +- doc/guides/rel_notes/release_17_02.rst | 13 + drivers/crypto/aesni_gcm/Makefile | 8 +- drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +-- drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 307 ++++----- drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +- drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +- mk/rte.app.mk | 3 +- 10 files changed, 1356 insertions(+), 383 deletions(-) diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c index ba6bbb5..ecbf765 100644 --- a/app/test/test_cryptodev.c +++ b/app/test/test_cryptodev.c @@ -3979,16 +3979,48 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +create_gcm_xforms(struct rte_crypto_op *op, + enum rte_crypto_cipher_operation cipher_op, + uint8_t *key, const uint8_t key_len, + const uint8_t aad_len, const uint8_t auth_len, + enum rte_crypto_auth_operation auth_op) +{ + TEST_ASSERT_NOT_NULL(rte_crypto_op_sym_xforms_alloc(op, 2), + "failed to allocate space for crypto transforms"); + + struct rte_crypto_sym_op *sym_op = op->sym; + + /* Setup Cipher Parameters */ + sym_op->xform->type = RTE_CRYPTO_SYM_XFORM_CIPHER; + sym_op->xform->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM; + sym_op->xform->cipher.op = cipher_op; + sym_op->xform->cipher.key.data = key; + sym_op->xform->cipher.key.length = key_len; + + TEST_HEXDUMP(stdout, "key:", key, key_len); + + /* Setup Authentication Parameters */ + sym_op->xform->next->type = RTE_CRYPTO_SYM_XFORM_AUTH; + sym_op->xform->next->auth.algo = RTE_CRYPTO_AUTH_AES_GCM; + sym_op->xform->next->auth.op = auth_op; + sym_op->xform->next->auth.digest_length = auth_len; + sym_op->xform->next->auth.add_auth_data_length = aad_len; + sym_op->xform->next->auth.key.length = 0; + sym_op->xform->next->auth.key.data = NULL; + sym_op->xform->next->next = NULL; + + return 0; +} + +static int create_gcm_operation(enum rte_crypto_cipher_operation op, - const uint8_t *auth_tag, const unsigned 
auth_tag_len, - const uint8_t *iv, const unsigned iv_len, - const uint8_t *aad, const unsigned aad_len, - const unsigned data_len, unsigned data_pad_len) + const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; - unsigned iv_pad_len = 0, aad_buffer_len; + uint8_t *plaintext, *ciphertext; + unsigned int iv_pad_len, aad_pad_len, plaintext_pad_len; /* Generate Crypto op data structure */ ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool, @@ -3998,63 +4030,118 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct rte_crypto_sym_op *sym_op = ut_params->op->sym; - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->ibuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to append digest"); - sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( - ut_params->ibuf, data_pad_len); - sym_op->auth.digest.length = auth_tag_len; - - if (op == RTE_CRYPTO_CIPHER_OP_DECRYPT) { - rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len); - TEST_HEXDUMP(stdout, "digest:", - sym_op->auth.digest.data, - sym_op->auth.digest.length); - } + /* Append aad data */ + aad_pad_len = RTE_ALIGN_CEIL(tdata->aad.len, 16); + sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + aad_pad_len); + TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, + "no room to append aad"); - /* iv */ - iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16); + sym_op->auth.aad.length = tdata->aad.len; + sym_op->auth.aad.phys_addr = + rte_pktmbuf_mtophys(ut_params->ibuf); + memcpy(sym_op->auth.aad.data, tdata->aad.data, tdata->aad.len); + TEST_HEXDUMP(stdout, "aad:", sym_op->auth.aad.data, + sym_op->auth.aad.length); + /* Prepend iv */ + iv_pad_len = RTE_ALIGN_CEIL(tdata->iv.len, 16); sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend( ut_params->ibuf, iv_pad_len); TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv"); memset(sym_op->cipher.iv.data, 0, iv_pad_len); sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf); - sym_op->cipher.iv.length = iv_len; + sym_op->cipher.iv.length = tdata->iv.len; - rte_memcpy(sym_op->cipher.iv.data, iv, iv_len); + rte_memcpy(sym_op->cipher.iv.data, tdata->iv.data, tdata->iv.len); + TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, + sym_op->cipher.iv.length); - /* - * Always allocate the aad up to the block size. - * The cryptodev API calls out - - * - the array must be big enough to hold the AAD, plus any - * space to round this up to the nearest multiple of the - * block size (16 bytes). 
- */ - aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16); + /* Append plaintext/ciphertext */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, "no room to append plaintext"); - sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend( - ut_params->ibuf, aad_buffer_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, - "no room to prepend aad"); - sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys( - ut_params->ibuf); - sym_op->auth.aad.length = aad_len; + memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); + TEST_HEXDUMP(stdout, "plaintext:", plaintext, + tdata->plaintext.len); - memset(sym_op->auth.aad.data, 0, aad_buffer_len); - rte_memcpy(sym_op->auth.aad.data, aad, aad_len); + if (ut_params->obuf) { + ciphertext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, iv_pad_len); - TEST_HEXDUMP(stdout, "aad:", - sym_op->auth.aad.data, aad_len); + memset(ciphertext + aad_pad_len + iv_pad_len, 0, + tdata->ciphertext.len); + } + } else { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); + ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - sym_op->cipher.data.length = data_len; - sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len; + memcpy(ciphertext, tdata->ciphertext.data, + tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, + tdata->ciphertext.len); - sym_op->auth.data.offset = aad_buffer_len + iv_pad_len; - sym_op->auth.data.length = data_len; + if (ut_params->obuf) { + plaintext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, + "no room to append plaintext"); + + memset(plaintext + aad_pad_len + iv_pad_len, 0, + tdata->plaintext.len); + } + } + + /* Append digest data */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf ? ut_params->obuf : + ut_params->ibuf, + tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + memset(sym_op->auth.digest.data, 0, tdata->auth_tag.len); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->obuf ? 
ut_params->obuf : + ut_params->ibuf, + plaintext_pad_len + + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; + } else { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->ibuf, tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->ibuf, + plaintext_pad_len + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; + + rte_memcpy(sym_op->auth.digest.data, tdata->auth_tag.data, + tdata->auth_tag.len); + TEST_HEXDUMP(stdout, "digest:", + sym_op->auth.digest.data, + sym_op->auth.digest.length); + } + + sym_op->cipher.data.length = tdata->plaintext.len; + sym_op->cipher.data.offset = aad_pad_len + iv_pad_len; + + sym_op->auth.data.length = tdata->plaintext.len; + sym_op->auth.data.offset = aad_pad_len + iv_pad_len; return 0; } @@ -4066,9 +4153,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext, *auth_tag; + uint8_t *ciphertext, *auth_tag; uint16_t plaintext_pad_len; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4079,31 +4166,20 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GMC_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); /* clear mbuf payload */ memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - /* - * Append data which is padded to a multiple - * of the algorithms block size - */ - plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); - - plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - plaintext_pad_len); - memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); - - TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->plaintext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->plaintext.len, plaintext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); if (retval < 0) return retval; @@ -4118,14 +4194,18 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, "crypto op processing failed"); + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + if (ut_params->op->sym->m_dst) { ciphertext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); auth_tag = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_dst, uint8_t *, plaintext_pad_len); } else { - ciphertext = plaintext; - auth_tag = plaintext + plaintext_pad_len; + ciphertext = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; } TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); @@ -4191,15 +4271,68 @@ static int test_snow3g_decryption_oop(const struct 
snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_encryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_2); +} + +static int test_mb_AES_GCM_authenticated_decryption(const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext; - uint16_t ciphertext_pad_len; + uint8_t *plaintext; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4210,31 +4343,23 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - /* alloc mbuf and set payload */ - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GMC_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); - - ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - ciphertext_pad_len); - memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len); - - TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->ciphertext.len, ciphertext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); if (retval < 0) return retval; - rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); ut_params->op->sym->m_src = ut_params->ibuf; @@ -4250,7 +4375,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) plaintext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); else - plaintext = ciphertext; + plaintext = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); TEST_HEXDUMP(stdout, "plaintext:", 
plaintext, tdata->ciphertext.len); @@ -4310,6 +4437,358 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_decryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_2); +} + +static int +test_AES_GCM_authenticated_encryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, 
+ tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_oop(void) +{ + return test_AES_GCM_authenticated_encryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_DECRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_oop(void) +{ + return test_AES_GCM_authenticated_decryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_encryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + uint8_t key[tdata->key.len + 1]; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), 
"failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_sessionless(void) +{ + return test_AES_GCM_authenticated_encryption_sessionless( + &gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + uint8_t key[tdata->key.len + 1]; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_DECRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_sessionless(void) +{ + return test_AES_GCM_authenticated_decryption_sessionless( + &gcm_test_case_5); +} + +static int test_stats(void) { struct crypto_testsuite_params *ts_params = &testsuite_params; @@ -6740,6 +7219,82 @@ struct test_crypto_vector { TEST_CASE_ST(ut_setup, ut_teardown, test_mb_AES_GCM_authenticated_decryption_test_case_7), + /** AES GCM Authenticated Encryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + 
test_mb_AES_GCM_auth_encryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_7), + + /** AES GCM Authenticated Decryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_7), + + /** AES GCM Authenticated Encryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_2), + + /** AES GCM Authenticated Decryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_2), + + /** AES GMAC Authentication */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_4), + + /** Negative tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_data_corrupt), + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_tag_corrupt), + + /** Out of place tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_oop), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_oop), + + /** Session-less tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_sessionless), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_sessionless), + TEST_CASES_END() /**< NULL terminate unit test array */ } }; diff --git a/app/test/test_cryptodev_gcm_test_vectors.h b/app/test/test_cryptodev_gcm_test_vectors.h index df984fc..3cb004e 100644 --- a/app/test/test_cryptodev_gcm_test_vectors.h +++ b/app/test/test_cryptodev_gcm_test_vectors.h @@ -33,7 +33,17 @@ #ifndef TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ #define TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ -#define GMAC_LARGE_PLAINTEXT_LENGTH 65376 +#define GMAC_LARGE_PLAINTEXT_LENGTH 65344 +#define GMC_MAX_AAD_LENGTH 65536 +#define GMC_LARGE_AAD_LENGTH 65296 + +static uint8_t gcm_aad_zero_text[GMC_MAX_AAD_LENGTH] = { 0 }; + +static uint8_t gcm_aad_text[GMC_MAX_AAD_LENGTH] = { + 0xfe, 0xed, 0xfa, 0xce, 
0xde, 0xad, 0xbe, 0xef, + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x97, + 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e, 0x1f }; struct gcm_test_data { @@ -48,7 +58,7 @@ struct gcm_test_data { } iv; struct { - uint8_t data[64]; + uint8_t *data; unsigned len; } aad; @@ -111,7 +121,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -148,7 +158,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -186,7 +196,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -238,8 +248,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .data = gcm_aad_zero_text, .len = 8 }, .plaintext = { @@ -294,8 +303,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef }, + .data = gcm_aad_text, .len = 8 }, .plaintext = { @@ -346,15 +354,11 @@ struct gmac_test_data { .iv = { .data = { 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, - 0xde, 0xca, 0xf8, 0x88 - }, + 0xde, 0xca, 0xf8, 0x88 }, .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, - 0x00, 0x00, 0x00, 0x00 - }, + .data = gcm_aad_zero_text, .len = 12 }, .plaintext = { @@ -409,10 +413,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -450,6 +451,450 @@ struct gmac_test_data { } }; +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_1 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { 0x00 }, + .len = 0 + }, + .ciphertext = { + .data = { 0x00 }, + .len = 0 + }, + .auth_tag = { + .data = { + 0x53, 0x0F, 0x8A, 0xFB, 0xC7, 0x45, 0x36, 0xB9, + 0xA9, 0x63, 0xB4, 0xF1, 0xC4, 0xCB, 0x73, 0x8B }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_2 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 16 + }, + .ciphertext = { + .data = { + 0xCE, 0xA7, 0x40, 0x3D, 0x4D, 0x60, 0x6B, 0x6E, + 0x07, 0x4E, 0xC5, 0xD3, 0xBA, 0xF3, 0x9D, 0x18 }, + .len = 16 + }, + .auth_tag = { + .data = { + 0xD0, 0xD1, 0xC8, 0xA7, 0x99, 0x99, 0x6B, 0xF0, + 0x26, 0x5B, 0x98, 0xB5, 0xD4, 0x8A, 0xB9, 0x19 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_3 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 
0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E }, + .len = 64 + }, + .auth_tag = { + .data = { + 0x64, 0xAF, 0x1D, 0xFB, 0xE8, 0x0D, 0x37, 0xD8, + 0x92, 0xC3, 0xB9, 0x1D, 0xD3, 0x08, 0xAB, 0xFC }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_4 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x63, 0x16, 0x91, 0xAE, 0x17, 0x05, 0x5E, 0xA6, + 0x6D, 0x0A, 0x51, 0xE2, 0x50, 0x21, 0x85, 0x4A }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_5 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 
0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0xA7, 0x99, 0xAC, 0xB8, 0x27, 0xDA, 0xB1, 0x82, + 0x79, 0xFD, 0x83, 0x73, 0x52, 0x4D, 0xDB, 0xF1 }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_6 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x5D, 0xA5, 0x0E, 0x53, 0x64, 0x7F, 0x3F, 0xAE, + 0x1A, 0x1F, 0xC0, 0xB0, 0xD8, 0xBE, 0xF2, 0x64 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_7 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 
0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x4E, 0xD0, 0x91, 0x95, 0x83, 0xA9, 0x38, 0x72, + 0x09, 0xA9, 0xCE, 0x5F, 0x89, 0x06, 0x4E, 0xC8 }, + .len = 16 + } +}; + +/** variable AAD AES-128 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_1 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08 }, + .len = 16 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GMC_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x42, 0x83, 0x1E, 0xC2, 0x21, 0x77, 0x74, 0x24, + 0x4B, 0x72, 0x21, 0xB7, 0x84, 0xD0, 0xD4, 0x9C, + 0xE3, 0xAA, 0x21, 0x2F, 0x2C, 0x02, 0xA4, 0xE0, + 0x35, 0xC1, 0x7E, 0x23, 0x29, 0xAC, 0xA1, 0x2E, + 0x21, 0xD5, 0x14, 0xB2, 0x54, 0x66, 0x93, 0x1C, + 0x7D, 0x8F, 0x6A, 0x5A, 0xAC, 0x84, 0xAA, 0x05, + 0x1B, 0xA3, 0x0B, 0x39, 0x6A, 0x0A, 0xAC, 0x97, + 0x3D, 0x58, 0xE0, 0x91, 0x47, 0x3F, 0x59, 0x85 + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xCA, 0x70, 0xAF, 0x96, 0xA8, 0x5D, 0x40, 0x47, + 0x0C, 0x3C, 0x48, 0xF5, 0xF0, 0xF5, 0xA5, 0x7D + }, + .len = 16 + } +}; + +/** variable AAD AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_2 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GMC_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xBA, 0x06, 0xDA, 0xA1, 0x91, 0xE1, 0xFE, 0x22, + 0x59, 0xDA, 0x67, 0xAF, 0x9D, 0xA5, 0x43, 0x94 + }, + .len = 16 + } +}; + /** GMAC Test Vectors */ static uint8_t gmac_plaintext[GMAC_LARGE_PLAINTEXT_LENGTH] = { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, @@ -1228,8 +1673,8 @@ 
struct cryptodev_perf_test_data { }, .gmac_tag = { .data = { - 0x88, 0x82, 0xb4, 0x93, 0x8f, 0x04, 0xcd, 0x06, - 0xfd, 0xac, 0x6d, 0x8b, 0x9c, 0x9e, 0x8f, 0xec + 0x3f, 0x07, 0xcb, 0xb9, 0x86, 0x3a, 0xea, 0xc2, + 0x2f, 0x3a, 0x2a, 0x93, 0xd8, 0x09, 0x6b, 0xda }, .len = 16 } diff --git a/doc/guides/cryptodevs/aesni_gcm.rst b/doc/guides/cryptodevs/aesni_gcm.rst index 04bf43c..184b71c 100644 --- a/doc/guides/cryptodevs/aesni_gcm.rst +++ b/doc/guides/cryptodevs/aesni_gcm.rst @@ -32,10 +32,8 @@ AES-NI GCM Crypto Poll Mode Driver The AES-NI GCM PMD (**librte_pmd_aesni_gcm**) provides poll mode crypto driver -support for utilizing Intel multi buffer library (see AES-NI Multi-buffer PMD documentation -to learn more about it, including installation). - -The AES-NI GCM PMD has current only been tested on Fedora 21 64-bit with gcc. +support for utilizing Intel ISA-L crypto library, which provides operation acceleration +through the AES-NI instruction sets for AES-GCM authenticated cipher algorithm. Features -------- @@ -49,16 +47,21 @@ Cipher algorithms: Authentication algorithms: * RTE_CRYPTO_AUTH_AES_GCM +* RTE_CRYPTO_AUTH_AES_GMAC + +Installation +------------ + +To build DPDK with the AESNI_GCM_PMD the user is required to install +the ``libisal_crypto`` library in the build environment. +For download and more details please visit `<https://github.com/01org/isa-l_crypto>`_. Initialization -------------- In order to enable this virtual crypto PMD, user must: -* Export the environmental variable AESNI_MULTI_BUFFER_LIB_PATH with the path where - the library was extracted. - -* Build the multi buffer library (go to Installation section in AES-NI MB PMD documentation). +* Install the ISA-L crypto library (explained in Installation section). * Set CONFIG_RTE_LIBRTE_PMD_AESNI_GCM=y in config/common_base. @@ -86,9 +89,7 @@ Example: Limitations ----------- -* Chained mbufs are not supported. +* Chained mbufs are supported but only out-of-place (destination mbuf must be contiguous). * Hash only is not supported. * Cipher only is not supported. -* Only in-place is currently supported (destination address is the same as source address). -* Only supports session-oriented API implementation (session-less APIs are not supported). * Not performance tuned. diff --git a/doc/guides/rel_notes/release_17_02.rst b/doc/guides/rel_notes/release_17_02.rst index 5ab7019..fad945b 100644 --- a/doc/guides/rel_notes/release_17_02.rst +++ b/doc/guides/rel_notes/release_17_02.rst @@ -68,6 +68,19 @@ New Features * Support for single operations (cipher only and authentication only). +* **Updated the AES-NI GCM PMD.** + + The AES-NI GCM PMD was migrated from MB library to ISA-L library. + The migration entailed the following additional support for: + + * GMAC algorithm. + * 256-bit cipher key. + * Session-less mode. 
+ * Out-of place processing + * Scatter-gatter support for chained mbufs (only out-of place and destination + mbuf must be contiguous) + + Resolved Issues --------------- diff --git a/drivers/crypto/aesni_gcm/Makefile b/drivers/crypto/aesni_gcm/Makefile index 5898cae..fb17fbf 100644 --- a/drivers/crypto/aesni_gcm/Makefile +++ b/drivers/crypto/aesni_gcm/Makefile @@ -31,9 +31,6 @@ include $(RTE_SDK)/mk/rte.vars.mk ifneq ($(MAKECMDGOALS),clean) -ifeq ($(AESNI_MULTI_BUFFER_LIB_PATH),) -$(error "Please define AESNI_MULTI_BUFFER_LIB_PATH environment variable") -endif endif # library name @@ -50,10 +47,7 @@ LIBABIVER := 1 EXPORT_MAP := rte_pmd_aesni_gcm_version.map # external library dependencies -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH) -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH)/include -LDLIBS += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -LDLIBS += -lcrypto +LDLIBS += -lisal_crypto # library source files SRCS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += aesni_gcm_pmd.c diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h index c399068..e9de654 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h @@ -37,91 +37,26 @@ #define LINUX #endif -#include <gcm_defines.h> -#include <aux_funcs.h> +#include <isa-l_crypto/aes_gcm.h> -/** Supported vector modes */ -enum aesni_gcm_vector_mode { - RTE_AESNI_GCM_NOT_SUPPORTED = 0, - RTE_AESNI_GCM_SSE, - RTE_AESNI_GCM_AVX, - RTE_AESNI_GCM_AVX2 -}; - -typedef void (*aes_keyexp_128_enc_t)(void *key, void *enc_exp_keys); +typedef void (*aesni_gcm_init_t)(struct gcm_data *my_ctx_data, + uint8_t *iv, + uint8_t const *aad, + uint64_t aad_len); -typedef void (*aesni_gcm_t)(gcm_data *my_ctx_data, u8 *out, const u8 *in, - u64 plaintext_len, u8 *iv, const u8 *aad, u64 aad_len, - u8 *auth_tag, u64 auth_tag_len); +typedef void (*aesni_gcm_update_t)(struct gcm_data *my_ctx_data, + uint8_t *out, + const uint8_t *in, + uint64_t plaintext_len); -typedef void (*aesni_gcm_precomp_t)(gcm_data *my_ctx_data, u8 *hash_subkey); +typedef void (*aesni_gcm_finalize_t)(struct gcm_data *my_ctx_data, + uint8_t *auth_tag, + uint64_t auth_tag_len); -/** GCM library function pointer table */ struct aesni_gcm_ops { - struct { - struct { - aes_keyexp_128_enc_t aes128_enc; - /**< AES128 enc key expansion */ - } keyexp; - /**< Key expansion functions */ - } aux; /**< Auxiliary functions */ - - struct { - aesni_gcm_t enc; /**< GCM encode function pointer */ - aesni_gcm_t dec; /**< GCM decode function pointer */ - aesni_gcm_precomp_t precomp; /**< GCM pre-compute */ - } gcm; /**< GCM functions */ + aesni_gcm_init_t init; + aesni_gcm_update_t update; + aesni_gcm_finalize_t finalize; }; - -static const struct aesni_gcm_ops gcm_ops[] = { - [RTE_AESNI_GCM_NOT_SUPPORTED] = { - .aux = { - .keyexp = { - NULL - } - }, - .gcm = { - NULL - } - }, - [RTE_AESNI_GCM_SSE] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_sse - } - }, - .gcm = { - aesni_gcm_enc_sse, - aesni_gcm_dec_sse, - aesni_gcm_precomp_sse - } - }, - [RTE_AESNI_GCM_AVX] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen2, - aesni_gcm_dec_avx_gen2, - aesni_gcm_precomp_avx_gen2 - } - }, - [RTE_AESNI_GCM_AVX2] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx2, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen4, - aesni_gcm_dec_avx_gen4, - aesni_gcm_precomp_avx_gen4 - } - } -}; - - #endif /* _AESNI_GCM_OPS_H_ */ diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c index 
af3d60f..0c501a5 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c @@ -30,8 +30,6 @@ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ -#include <openssl/aes.h> - #include <rte_common.h> #include <rte_config.h> #include <rte_hexdump.h> @@ -44,6 +42,34 @@ #include "aesni_gcm_pmd_private.h" +/** GCM encode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_enc[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_enc_update, + aesni_gcm128_enc_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_enc_update, + aesni_gcm256_enc_finalize + } +}; + +/** GCM decode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_dec[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_dec_update, + aesni_gcm128_dec_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_dec_update, + aesni_gcm256_dec_finalize + } +}; + /** * Global static parameter used to create a unique name for each AES-NI multi * buffer crypto device. @@ -65,112 +91,68 @@ return 0; } -static int -aesni_gcm_calculate_hash_sub_key(uint8_t *hsubkey, unsigned hsubkey_length, - uint8_t *aeskey, unsigned aeskey_length) -{ - uint8_t key[aeskey_length] __rte_aligned(16); - AES_KEY enc_key; - - if (hsubkey_length % 16 != 0 && aeskey_length % 16 != 0) - return -EFAULT; - - memcpy(key, aeskey, aeskey_length); - - if (AES_set_encrypt_key(key, aeskey_length << 3, &enc_key) != 0) - return -EFAULT; - - AES_encrypt(hsubkey, hsubkey, &enc_key); - - return 0; -} - -/** Get xform chain order */ -static int -aesni_gcm_get_mode(const struct rte_crypto_sym_xform *xform) -{ - /* - * GCM only supports authenticated encryption or authenticated - * decryption, all other options are invalid, so we must have exactly - * 2 xform structs chained together - */ - if (xform->next == NULL || xform->next->next != NULL) - return -1; - - if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && - xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { - return AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; - } - - if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && - xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { - return AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - } - - return -1; -} - /** Parse crypto xform chain and set private session parameters */ int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *gcm_ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform) { - const struct rte_crypto_sym_xform *auth_xform = NULL; - const struct rte_crypto_sym_xform *cipher_xform = NULL; - - uint8_t hsubkey[16] __rte_aligned(16) = { 0 }; + const struct rte_crypto_sym_xform *auth_xform; + const struct rte_crypto_sym_xform *cipher_xform; - /* Select Crypto operation - hash then cipher / cipher then hash */ - switch (aesni_gcm_get_mode(xform)) { - case AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + if (xform->next == NULL || xform->next->next != NULL) { + GCM_LOG_ERR("Two and only two chained xform required"); + return -EINVAL; + } - cipher_xform = xform; + if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && + xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { auth_xform = xform->next; - break; - case AESNI_GCM_OP_AUTHENTICATED_DECRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - + cipher_xform = xform; + } else if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && + 
xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { auth_xform = xform; cipher_xform = xform->next; - break; - default: - GCM_LOG_ERR("Unsupported operation chain order parameter"); + } else { + GCM_LOG_ERR("Cipher and auth xform required"); return -EINVAL; } - /* We only support AES GCM */ - if (cipher_xform->cipher.algo != RTE_CRYPTO_CIPHER_AES_GCM && - auth_xform->auth.algo != RTE_CRYPTO_AUTH_AES_GCM) + if (!(cipher_xform->cipher.algo == RTE_CRYPTO_CIPHER_AES_GCM && + (auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GCM || + auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GMAC))) { + GCM_LOG_ERR("We only support AES GCM and AES GMAC"); return -EINVAL; + } - /* Select cipher direction */ - if (sess->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_ENCRYPT) { - GCM_LOG_ERR("xform chain (CIPHER/AUTH) and cipher operation " - "(DECRYPT) specified are an invalid selection"); - return -EINVAL; - } else if (sess->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_DECRYPT) { - GCM_LOG_ERR("xform chain (AUTH/CIPHER) and cipher operation " - "(ENCRYPT) specified are an invalid selection"); + /* Select Crypto operation */ + if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_ENCRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_GENERATE) + sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + else if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_DECRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_VERIFY) + sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; + else { + GCM_LOG_ERR("Cipher/Auth operations: Encrypt/Generate or" + " Decrypt/Verify are valid only"); return -EINVAL; } - /* Expand GCM AES128 key */ - (*gcm_ops->aux.keyexp.aes128_enc)(cipher_xform->cipher.key.data, - sess->gdata.expanded_keys); + /* Check key length and calculate GCM pre-compute. */ + switch (cipher_xform->cipher.key.length) { + case 16: + aesni_gcm128_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_128; - /* Calculate hash sub key here */ - aesni_gcm_calculate_hash_sub_key(hsubkey, sizeof(hsubkey), - cipher_xform->cipher.key.data, - cipher_xform->cipher.key.length); + break; + case 32: + aesni_gcm256_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_256; - /* Calculate GCM pre-compute */ - (*gcm_ops->gcm.precomp)(&sess->gdata, hsubkey); + break; + default: + GCM_LOG_ERR("Unsupported cipher key length"); + return -EINVAL; + } return 0; } @@ -194,10 +176,10 @@ return sess; sess = (struct aesni_gcm_session *) - ((struct rte_cryptodev_session *)_sess)->_private; + ((struct rte_cryptodev_sym_session *)_sess)->_private; - if (unlikely(aesni_gcm_set_session_parameters(qp->ops, - sess, op->xform) != 0)) { + if (unlikely(aesni_gcm_set_session_parameters(sess, + op->xform) != 0)) { rte_mempool_put(qp->sess_mp, _sess); sess = NULL; } @@ -217,19 +199,45 @@ * */ static int -process_gcm_crypto_op(struct aesni_gcm_qp *qp, struct rte_crypto_sym_op *op, +process_gcm_crypto_op(struct rte_crypto_sym_op *op, struct aesni_gcm_session *session) { uint8_t *src, *dst; - struct rte_mbuf *m = op->m_src; + struct rte_mbuf *m_src = op->m_src; + uint32_t offset = op->cipher.data.offset; + uint32_t part_len, total_len, data_len; + + RTE_ASSERT(m_src != NULL); + + while (offset >= m_src->data_len) { + offset -= m_src->data_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + } + + data_len = m_src->data_len - offset; + part_len = (data_len < op->cipher.data.length) ? 
data_len : + op->cipher.data.length; + + /* Destination buffer is required when segmented source buffer */ + RTE_ASSERT((part_len == op->cipher.data.length) || + ((part_len != op->cipher.data.length) && + (op->m_dst != NULL))); + /* Segmented destination buffer is not supported */ + RTE_ASSERT((op->m_dst == NULL) || + ((op->m_dst != NULL) && + rte_pktmbuf_is_contiguous(op->m_dst))); + - src = rte_pktmbuf_mtod(m, uint8_t *) + op->cipher.data.offset; dst = op->m_dst ? rte_pktmbuf_mtod_offset(op->m_dst, uint8_t *, op->cipher.data.offset) : - rte_pktmbuf_mtod_offset(m, uint8_t *, + rte_pktmbuf_mtod_offset(op->m_src, uint8_t *, op->cipher.data.offset); + src = rte_pktmbuf_mtod_offset(m_src, uint8_t *, offset); + /* sanity checks */ if (op->cipher.iv.length != 16 && op->cipher.iv.length != 12 && op->cipher.iv.length != 0) { @@ -246,48 +254,81 @@ *iv_padd = rte_bswap32(1); } - if (op->auth.aad.length != 12 && op->auth.aad.length != 8 && - op->auth.aad.length != 0) { - GCM_LOG_ERR("iv"); - return -1; - } - if (op->auth.digest.length != 16 && op->auth.digest.length != 12 && - op->auth.digest.length != 8 && - op->auth.digest.length != 0) { - GCM_LOG_ERR("iv"); + op->auth.digest.length != 8) { + GCM_LOG_ERR("digest"); return -1; } if (session->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION) { - (*qp->ops->gcm.enc)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_enc[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_enc[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? + m_src->data_len : total_len; + + aesni_gcm_enc[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_enc[session->key].finalize(&session->gdata, op->auth.digest.data, (uint64_t)op->auth.digest.length); - } else if (session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION) { - uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(m, + } else { /* session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION */ + uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(op->m_dst ? + op->m_dst : op->m_src, op->auth.digest.length); if (!auth_tag) { - GCM_LOG_ERR("iv"); + GCM_LOG_ERR("auth_tag"); return -1; } - (*qp->ops->gcm.dec)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_dec[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_dec[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? 
+ m_src->data_len : total_len; + + aesni_gcm_dec[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_dec[session->key].finalize(&session->gdata, auth_tag, (uint64_t)op->auth.digest.length); - } else { - GCM_LOG_ERR("iv"); - return -1; } return 0; @@ -377,7 +418,7 @@ break; } - retval = process_gcm_crypto_op(qp, ops[i]->sym, sess); + retval = process_gcm_crypto_op(ops[i]->sym, sess); if (retval < 0) { ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; qp->qp_stats.enqueue_err_count++; @@ -415,7 +456,6 @@ struct rte_cryptodev *dev; char crypto_dev_name[RTE_CRYPTODEV_NAME_MAX_LEN]; struct aesni_gcm_private *internals; - enum aesni_gcm_vector_mode vector_mode; /* Check CPU for support for AES instruction set */ if (!rte_cpu_get_flag_enabled(RTE_CPUFLAG_AES)) { @@ -423,18 +463,6 @@ return -EFAULT; } - /* Check CPU for supported vector instruction set */ - if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX2)) - vector_mode = RTE_AESNI_GCM_AVX2; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX)) - vector_mode = RTE_AESNI_GCM_AVX; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_SSE4_1)) - vector_mode = RTE_AESNI_GCM_SSE; - else { - GCM_LOG_ERR("Vector instructions are not supported by CPU"); - return -EFAULT; - } - /* create a unique device name */ if (create_unique_device_name(crypto_dev_name, RTE_CRYPTODEV_NAME_MAX_LEN) != 0) { @@ -461,25 +489,8 @@ RTE_CRYPTODEV_FF_SYM_OPERATION_CHAINING | RTE_CRYPTODEV_FF_CPU_AESNI; - switch (vector_mode) { - case RTE_AESNI_GCM_SSE: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_SSE; - break; - case RTE_AESNI_GCM_AVX: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX; - break; - case RTE_AESNI_GCM_AVX2: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX2; - break; - default: - break; - } - - /* Set vector instructions mode supported */ internals = dev->data->dev_private; - internals->vector_mode = vector_mode; - internals->max_nb_queue_pairs = init_params->max_nb_queue_pairs; internals->max_nb_sessions = init_params->max_nb_sessions; diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c index c51f82a..2362006 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c @@ -39,17 +39,17 @@ #include "aesni_gcm_pmd_private.h" static const struct rte_cryptodev_capabilities aesni_gcm_pmd_capabilities[] = { - { /* AES GCM (AUTH) */ + { /* AES GMAC (AUTH) */ .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, {.sym = { .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, {.auth = { - .algo = RTE_CRYPTO_AUTH_AES_GCM, + .algo = RTE_CRYPTO_AUTH_AES_GMAC, .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .digest_size = { .min = 8, @@ -57,9 +57,34 @@ .increment = 4 }, .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 + } + }, } + }, } + }, + { /* AES GCM (AUTH) */ + .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, + {.sym = { + .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, + {.auth = { + .algo = RTE_CRYPTO_AUTH_AES_GCM, + .block_size = 16, + .key_size = { + .min = 16, + .max = 32, + .increment = 16 + }, + .digest_size = { .min = 8, - .max = 12, + .max = 16, .increment = 4 + }, + .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 } }, } }, } @@ -73,8 +98,8 @@ .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .iv_size = { .min = 12, @@ -221,7 +246,6 @@ int socket_id) { struct aesni_gcm_qp *qp = NULL; - struct aesni_gcm_private *internals 
= dev->data->dev_private; /* Free memory prior to re-allocation if needed. */ if (dev->data->queue_pairs[qp_id] != NULL) @@ -239,8 +263,6 @@ if (aesni_gcm_pmd_qp_set_unique_name(dev, qp)) goto qp_setup_cleanup; - qp->ops = &gcm_ops[internals->vector_mode]; - qp->processed_pkts = aesni_gcm_pmd_qp_create_processed_pkts_ring(qp, qp_conf->nb_descriptors, socket_id); if (qp->processed_pkts == NULL) @@ -291,18 +313,15 @@ /** Configure a aesni gcm session from a crypto xform chain */ static void * -aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev, +aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev __rte_unused, struct rte_crypto_sym_xform *xform, void *sess) { - struct aesni_gcm_private *internals = dev->data->dev_private; - if (unlikely(sess == NULL)) { GCM_LOG_ERR("invalid session struct"); return NULL; } - if (aesni_gcm_set_session_parameters(&gcm_ops[internals->vector_mode], - sess, xform) != 0) { + if (aesni_gcm_set_session_parameters(sess, xform) != 0) { GCM_LOG_ERR("failed configure session parameters"); return NULL; } diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h index 9878d6e..0496b44 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h @@ -58,8 +58,6 @@ /** private data structure for each virtual AESNI GCM device */ struct aesni_gcm_private { - enum aesni_gcm_vector_mode vector_mode; - /**< Vector mode */ unsigned max_nb_queue_pairs; /**< Max number of queue pairs supported by device */ unsigned max_nb_sessions; @@ -71,8 +69,6 @@ struct aesni_gcm_qp { /**< Queue Pair Identifier */ char name[RTE_CRYPTODEV_NAME_LEN]; /**< Unique Queue Pair Name */ - const struct aesni_gcm_ops *ops; - /**< Architecture dependent function pointer table of the gcm APIs */ struct rte_ring *processed_pkts; /**< Ring for placing process packets */ struct rte_mempool *sess_mp; @@ -87,10 +83,17 @@ enum aesni_gcm_operation { AESNI_GCM_OP_AUTHENTICATED_DECRYPTION }; +enum aesni_gcm_key { + AESNI_GCM_KEY_128, + AESNI_GCM_KEY_256 +}; + /** AESNI GCM private session structure */ struct aesni_gcm_session { enum aesni_gcm_operation op; /**< GCM operation type */ + enum aesni_gcm_key key; + /**< GCM key type */ struct gcm_data gdata __rte_cache_aligned; /**< GCM parameters */ }; @@ -98,7 +101,6 @@ struct aesni_gcm_session { /** * Setup GCM session parameters - * @param ops gcm ops function pointer table * @param sess aesni gcm session structure * @param xform crypto transform chain * @@ -107,8 +109,7 @@ struct aesni_gcm_session { * - On failure returns error code < 0 */ extern int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform); diff --git a/mk/rte.app.mk b/mk/rte.app.mk index f75f0e2..ed3eab5 100644 --- a/mk/rte.app.mk +++ b/mk/rte.app.mk @@ -134,8 +134,7 @@ _LDLIBS-$(CONFIG_RTE_LIBRTE_VMXNET3_PMD) += -lrte_pmd_vmxnet3_uio ifeq ($(CONFIG_RTE_LIBRTE_CRYPTODEV),y) _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -lrte_pmd_aesni_mb _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lcrypto -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB +_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lisal_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_OPENSSL) += -lrte_pmd_openssl 
-lcrypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_NULL_CRYPTO) += -lrte_pmd_null_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_QAT) += -lrte_pmd_qat -lcrypto -- 1.7.9.5 ^ permalink raw reply [flat|nested] 15+ messages in thread
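For readers unfamiliar with the ISA-L interface this patch switches to, the multi-call flow it builds on (key pre-compute, init, one update per segment, finalize) can be sketched roughly as below. This is a minimal illustration, not part of the patch: the helper name and buffer handling are invented, and the IV is assumed to be the 16-byte pre-formatted block the PMD prepares (12-byte nonce followed by the initial counter value).

    #include <stdint.h>
    #include <isa-l_crypto/aes_gcm.h>

    /*
     * Illustrative sketch only: encrypt a message split across nb_segs
     * segments (e.g. the segments of a chained mbuf) into one contiguous
     * dst buffer, mirroring the init/update/finalize sequence used by
     * process_gcm_crypto_op() in this patch.
     */
    static void
    gcm128_encrypt_sgl(uint8_t *key, uint8_t *iv,
            const uint8_t *aad, uint64_t aad_len,
            const uint8_t **segs, const uint64_t *seg_lens,
            unsigned int nb_segs,
            uint8_t *dst, uint8_t *tag, uint64_t tag_len)
    {
        struct gcm_data gdata;
        uint64_t off = 0;
        unsigned int i;

        /* Expand the AES key and pre-compute the GHASH subkey
         * (the PMD does this once per session). */
        aesni_gcm128_pre(key, &gdata);

        /* Start the operation with the pre-formatted IV and the AAD. */
        aesni_gcm128_init(&gdata, iv, aad, aad_len);

        /* One update call per source segment; output is contiguous. */
        for (i = 0; i < nb_segs; i++) {
            aesni_gcm128_enc_update(&gdata, dst + off, segs[i], seg_lens[i]);
            off += seg_lens[i];
        }

        /* Produce the authentication tag. */
        aesni_gcm128_enc_finalize(&gdata, tag, tag_len);
    }

The per-segment update loop is what lets process_gcm_crypto_op() walk a chained source mbuf while writing into a single contiguous destination, which is exactly the scatter-gather limitation documented above.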
* Re: [dpdk-dev] [PATCH v3] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-03 13:02 ` [dpdk-dev] [PATCH v3] " Piotr Azarewicz @ 2017-01-03 14:14 ` Thomas Monjalon 2017-01-05 13:51 ` [dpdk-dev] [PATCH v4] " Piotr Azarewicz 1 sibling, 0 replies; 15+ messages in thread From: Thomas Monjalon @ 2017-01-03 14:14 UTC (permalink / raw) To: Piotr Azarewicz; +Cc: dev, pablo.de.lara.guarch 2017-01-03 14:02, Piotr Azarewicz: > Current Cryptodev AES-NI GCM PMD is implemented using Multi Buffer > Crypto library.This patch reimplement the device using ISA-L Crypto > library: https://github.com/01org/isa-l_crypto. > > The migration entailed the following additional support for: > * GMAC algorithm. > * 256-bit cipher key. > * Session-less mode. > * Out-of place processing > * Scatter-gatter support for chained mbufs (only out-of place and > destination mbuf must be contiguous) > > Verified current unit tests and added new unit tests to verify new > functionalities. > > Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> [...] > The AES-NI GCM PMD (**librte_pmd_aesni_gcm**) provides poll mode crypto driver > -support for utilizing Intel multi buffer library (see AES-NI Multi-buffer PMD documentation > -to learn more about it, including installation). > - > -The AES-NI GCM PMD has current only been tested on Fedora 21 64-bit with gcc. > +support for utilizing Intel ISA-L crypto library, which provides operation acceleration > +through the AES-NI instruction sets for AES-GCM authenticated cipher algorithm. Please could you compare these libraries regarding the performance? [...] > Features > -------- > @@ -49,16 +47,21 @@ Cipher algorithms: > Authentication algorithms: > > * RTE_CRYPTO_AUTH_AES_GCM > +* RTE_CRYPTO_AUTH_AES_GMAC > + > +Installation > +------------ > + > +To build DPDK with the AESNI_GCM_PMD the user is required to install > +the ``libisal_crypto`` library in the build environment. > +For download and more details please visit `<https://github.com/01org/isa-l_crypto>`_. [...] > Limitations > ----------- > > -* Chained mbufs are not supported. > +* Chained mbufs are supported but only out-of-place (destination mbuf must be contiguous). > * Hash only is not supported. > * Cipher only is not supported. > -* Only in-place is currently supported (destination address is the same as source address). > -* Only supports session-oriented API implementation (session-less APIs are not supported). > * Not performance tuned. [...] > --- a/drivers/crypto/aesni_gcm/Makefile > +++ b/drivers/crypto/aesni_gcm/Makefile > @@ -31,9 +31,6 @@ > include $(RTE_SDK)/mk/rte.vars.mk > > ifneq ($(MAKECMDGOALS),clean) > -ifeq ($(AESNI_MULTI_BUFFER_LIB_PATH),) > -$(error "Please define AESNI_MULTI_BUFFER_LIB_PATH environment variable") > -endif > endif > > # library name > @@ -50,10 +47,7 @@ LIBABIVER := 1 > EXPORT_MAP := rte_pmd_aesni_gcm_version.map > > # external library dependencies > -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH) > -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH)/include > -LDLIBS += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB > -LDLIBS += -lcrypto > +LDLIBS += -lisal_crypto You need to update the script test-build.sh. Thanks ^ permalink raw reply [flat|nested] 15+ messages in thread
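As a companion to the session rework in this patch, the xform chain that aesni_gcm_set_session_parameters() accepts (exactly two chained xforms, cipher plus auth, paired as Encrypt/Generate or Decrypt/Verify) can be built on the application side roughly as follows. This is an illustrative sketch only; the helper name and the key/AAD/tag parameters are placeholders, mirroring the create_gcm_xforms() test helper rather than adding anything new.

    #include <string.h>
    #include <stdint.h>
    #include <rte_crypto_sym.h>

    /*
     * Illustrative sketch only: fill a cipher+auth xform chain for
     * AES-GCM authenticated encryption as expected by the PMD.
     */
    static void
    build_gcm_encrypt_xforms(struct rte_crypto_sym_xform *cipher,
            struct rte_crypto_sym_xform *auth,
            uint8_t *key, unsigned int key_len,
            uint16_t aad_len, uint8_t tag_len)
    {
        memset(cipher, 0, sizeof(*cipher));
        memset(auth, 0, sizeof(*auth));

        /* Cipher xform first, auth xform second: encrypt + generate digest. */
        cipher->type = RTE_CRYPTO_SYM_XFORM_CIPHER;
        cipher->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM;
        cipher->cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
        cipher->cipher.key.data = key;
        cipher->cipher.key.length = key_len; /* 16 or 32 bytes with ISA-L */
        cipher->next = auth;

        auth->type = RTE_CRYPTO_SYM_XFORM_AUTH;
        auth->auth.algo = RTE_CRYPTO_AUTH_AES_GCM;
        auth->auth.op = RTE_CRYPTO_AUTH_OP_GENERATE;
        auth->auth.digest_length = tag_len;
        auth->auth.add_auth_data_length = aad_len;
        auth->auth.key.data = NULL;
        auth->auth.key.length = 0;
        auth->next = NULL;
    }

Either ordering of the two xforms is accepted by the parser; what matters is that the cipher and auth directions match (Encrypt with Generate, Decrypt with Verify), otherwise session setup fails with -EINVAL.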
* [dpdk-dev] [PATCH v4] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-03 13:02 ` [dpdk-dev] [PATCH v3] " Piotr Azarewicz 2017-01-03 14:14 ` Thomas Monjalon @ 2017-01-05 13:51 ` Piotr Azarewicz 2017-01-05 15:12 ` Thomas Monjalon 2017-01-16 12:37 ` [dpdk-dev] [PATCH v5] " Piotr Azarewicz 1 sibling, 2 replies; 15+ messages in thread From: Piotr Azarewicz @ 2017-01-05 13:51 UTC (permalink / raw) To: pablo.de.lara.guarch, dev

The current Cryptodev AES-NI GCM PMD is implemented using the Multi Buffer Crypto library. This patch reimplements the device using the ISA-L Crypto library: https://github.com/01org/isa-l_crypto.

The migration entailed adding support for:
* GMAC algorithm.
* 256-bit cipher key.
* Session-less mode.
* Out-of-place processing.
* Scatter-gather support for chained mbufs (out-of-place only; the destination mbuf must be contiguous).

Verified the current unit tests and added new unit tests to cover the new functionality.

PERFORMANCE COMPARISON
----------------------
The new and old implementations were compared by running app/test and calling cryptodev_aesni_gcm_perftest. As shown below, the new implementation has a small performance drop when the buffer size is above 64B.

Old implementation with MB library: Cipher algo: AES_GCM Cipher hash: AES_GCM cipher key: 128b burst size: 32
Buffer Size(B) OPS(M) Throughput(Gbps) Retries EmptyPolls
64   4.57 2.34 0 0
128  4.28 4.39 0 0
256  2.76 5.66 0 0
512  1.60 6.56 0 0
1024 0.90 7.34 0 0
1536 0.62 7.66 0 0
2048 0.48 7.84 0 0

New implementation with ISA-L library: Cipher algo: AES_GCM Cipher hash: AES_GCM cipher key: 128b burst size: 32
Buffer Size(B) OPS(M) Throughput(Gbps) Retries EmptyPolls
64   4.62 2.37 0 0
128  4.06 4.16 0 0
256  2.65 5.44 0 0
512  1.57 6.45 0 0
1024 0.89 7.26 0 0
1536 0.62 7.58 0 0
2048 0.47 7.77 0 0

Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> --- To be applied on top of: [dpdk-dev] [PATCH v3 0/5] Chained Mbufs support in SW PMDs
v4 changes: - rebase on top of dpdk-next-crypto - update the script test-build.sh
v3 changes: - rebase on top of dpdk-next-crypto
v2 changes: - implement native scatter-gather support for chained mbufs (out-of-place only; the destination mbuf must be contiguous) - write unit test for session-less mode - write unit test for out-of-place processing - add support for GMAC authentication algorithm

app/test/test_cryptodev.c | 753 +++++++++++++++++++--- app/test/test_cryptodev_gcm_test_vectors.h | 491 +++++++++++++- doc/guides/cryptodevs/aesni_gcm.rst | 23 +- doc/guides/rel_notes/release_17_02.rst | 12 + drivers/crypto/aesni_gcm/Makefile | 8 +- drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +-- drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 324 +++++----- drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +- drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +- mk/rte.app.mk | 3 +- scripts/test-build.sh | 4 +- 11 files changed, 1363 insertions(+), 414 deletions(-) diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c index 4c9a54f..1d5c67d 100644 --- a/app/test/test_cryptodev.c +++ b/app/test/test_cryptodev.c @@ -4317,16 +4317,48 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +create_gcm_xforms(struct rte_crypto_op *op, + enum rte_crypto_cipher_operation cipher_op, + uint8_t *key, const uint8_t key_len, + const uint8_t aad_len, const uint8_t auth_len, + enum rte_crypto_auth_operation auth_op) +{ + TEST_ASSERT_NOT_NULL(rte_crypto_op_sym_xforms_alloc(op, 2), + "failed to allocate space for crypto transforms"); + + struct
rte_crypto_sym_op *sym_op = op->sym; + + /* Setup Cipher Parameters */ + sym_op->xform->type = RTE_CRYPTO_SYM_XFORM_CIPHER; + sym_op->xform->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM; + sym_op->xform->cipher.op = cipher_op; + sym_op->xform->cipher.key.data = key; + sym_op->xform->cipher.key.length = key_len; + + TEST_HEXDUMP(stdout, "key:", key, key_len); + + /* Setup Authentication Parameters */ + sym_op->xform->next->type = RTE_CRYPTO_SYM_XFORM_AUTH; + sym_op->xform->next->auth.algo = RTE_CRYPTO_AUTH_AES_GCM; + sym_op->xform->next->auth.op = auth_op; + sym_op->xform->next->auth.digest_length = auth_len; + sym_op->xform->next->auth.add_auth_data_length = aad_len; + sym_op->xform->next->auth.key.length = 0; + sym_op->xform->next->auth.key.data = NULL; + sym_op->xform->next->next = NULL; + + return 0; +} + +static int create_gcm_operation(enum rte_crypto_cipher_operation op, - const uint8_t *auth_tag, const unsigned auth_tag_len, - const uint8_t *iv, const unsigned iv_len, - const uint8_t *aad, const unsigned aad_len, - const unsigned data_len, unsigned data_pad_len) + const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; - unsigned iv_pad_len = 0, aad_buffer_len; + uint8_t *plaintext, *ciphertext; + unsigned int iv_pad_len, aad_pad_len, plaintext_pad_len; /* Generate Crypto op data structure */ ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool, @@ -4336,77 +4368,118 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct rte_crypto_sym_op *sym_op = ut_params->op->sym; - if (ut_params->obuf) { - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->obuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to append digest"); - sym_op->auth.digest.phys_addr = sgl_pktmbuf_mtophys_offset( - ut_params->obuf, data_pad_len); - } else { - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->ibuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to append digest"); - sym_op->auth.digest.phys_addr = sgl_pktmbuf_mtophys_offset( - ut_params->ibuf, data_pad_len); - } - sym_op->auth.digest.length = auth_tag_len; - - if (op == RTE_CRYPTO_CIPHER_OP_DECRYPT) { - rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len); - TEST_HEXDUMP(stdout, "digest:", - sym_op->auth.digest.data, - sym_op->auth.digest.length); - } + /* Append aad data */ + aad_pad_len = RTE_ALIGN_CEIL(tdata->aad.len, 16); + sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + aad_pad_len); + TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, + "no room to append aad"); - /* iv */ - iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16); + sym_op->auth.aad.length = tdata->aad.len; + sym_op->auth.aad.phys_addr = + rte_pktmbuf_mtophys(ut_params->ibuf); + memcpy(sym_op->auth.aad.data, tdata->aad.data, tdata->aad.len); + TEST_HEXDUMP(stdout, "aad:", sym_op->auth.aad.data, + sym_op->auth.aad.length); + /* Prepend iv */ + iv_pad_len = RTE_ALIGN_CEIL(tdata->iv.len, 16); sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend( ut_params->ibuf, iv_pad_len); TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv"); memset(sym_op->cipher.iv.data, 0, iv_pad_len); sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf); - sym_op->cipher.iv.length = iv_len; + sym_op->cipher.iv.length = tdata->iv.len; - rte_memcpy(sym_op->cipher.iv.data, iv, iv_len); + rte_memcpy(sym_op->cipher.iv.data, tdata->iv.data, 
tdata->iv.len); + TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, + sym_op->cipher.iv.length); - /* - * Always allocate the aad up to the block size. - * The cryptodev API calls out - - * - the array must be big enough to hold the AAD, plus any - * space to round this up to the nearest multiple of the - * block size (16 bytes). - */ - aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16); + /* Append plaintext/ciphertext */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, "no room to append plaintext"); - sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend( - ut_params->ibuf, aad_buffer_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, - "no room to prepend aad"); - sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys( - ut_params->ibuf); - sym_op->auth.aad.length = aad_len; + memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); + TEST_HEXDUMP(stdout, "plaintext:", plaintext, + tdata->plaintext.len); - memset(sym_op->auth.aad.data, 0, aad_buffer_len); - rte_memcpy(sym_op->auth.aad.data, aad, aad_len); + if (ut_params->obuf) { + ciphertext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, iv_pad_len); - TEST_HEXDUMP(stdout, "aad:", - sym_op->auth.aad.data, aad_len); + memset(ciphertext + aad_pad_len + iv_pad_len, 0, + tdata->ciphertext.len); + } + } else { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); + ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - if (ut_params->obuf) { - rte_pktmbuf_prepend(ut_params->obuf, iv_pad_len); - rte_pktmbuf_prepend(ut_params->obuf, aad_buffer_len); + memcpy(ciphertext, tdata->ciphertext.data, + tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, + tdata->ciphertext.len); + + if (ut_params->obuf) { + plaintext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, + "no room to append plaintext"); + + memset(plaintext + aad_pad_len + iv_pad_len, 0, + tdata->plaintext.len); + } } - sym_op->cipher.data.length = data_len; - sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len; + /* Append digest data */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf ? ut_params->obuf : + ut_params->ibuf, + tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + memset(sym_op->auth.digest.data, 0, tdata->auth_tag.len); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->obuf ? 
ut_params->obuf : + ut_params->ibuf, + plaintext_pad_len + + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; + } else { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->ibuf, tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->ibuf, + plaintext_pad_len + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; - sym_op->auth.data.offset = aad_buffer_len + iv_pad_len; - sym_op->auth.data.length = data_len; + rte_memcpy(sym_op->auth.digest.data, tdata->auth_tag.data, + tdata->auth_tag.len); + TEST_HEXDUMP(stdout, "digest:", + sym_op->auth.digest.data, + sym_op->auth.digest.length); + } + + sym_op->cipher.data.length = tdata->plaintext.len; + sym_op->cipher.data.offset = aad_pad_len + iv_pad_len; + + sym_op->auth.data.length = tdata->plaintext.len; + sym_op->auth.data.offset = aad_pad_len + iv_pad_len; return 0; } @@ -4418,9 +4491,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext, *auth_tag; + uint8_t *ciphertext, *auth_tag; uint16_t plaintext_pad_len; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4431,31 +4504,20 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GMC_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); /* clear mbuf payload */ memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - /* - * Append data which is padded to a multiple - * of the algorithms block size - */ - plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); - - plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - plaintext_pad_len); - memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); - - TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->plaintext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->plaintext.len, plaintext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); if (retval < 0) return retval; @@ -4470,14 +4532,18 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, "crypto op processing failed"); + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + if (ut_params->op->sym->m_dst) { ciphertext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); auth_tag = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_dst, uint8_t *, plaintext_pad_len); } else { - ciphertext = plaintext; - auth_tag = plaintext + plaintext_pad_len; + ciphertext = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; } TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, 
tdata->ciphertext.len); @@ -4543,15 +4609,68 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_encryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_2); +} + +static int test_mb_AES_GCM_authenticated_decryption(const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext; - uint16_t ciphertext_pad_len; + uint8_t *plaintext; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4562,31 +4681,23 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - /* alloc mbuf and set payload */ - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GMC_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); - - ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - ciphertext_pad_len); - memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len); - - TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->ciphertext.len, ciphertext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); if (retval < 0) return retval; - rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); ut_params->op->sym->m_src = ut_params->ibuf; @@ -4602,7 +4713,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) plaintext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); else - plaintext = ciphertext; + plaintext = 
rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); @@ -4662,6 +4775,358 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_decryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_2); +} + +static int +test_AES_GCM_authenticated_encryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + 
tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_oop(void) +{ + return test_AES_GCM_authenticated_encryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_DECRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_oop(void) +{ + return test_AES_GCM_authenticated_decryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_encryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + uint8_t key[tdata->key.len + 1]; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto 
op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_sessionless(void) +{ + return test_AES_GCM_authenticated_encryption_sessionless( + &gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + uint8_t key[tdata->key.len + 1]; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_DECRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_sessionless(void) +{ + return test_AES_GCM_authenticated_decryption_sessionless( + &gcm_test_case_5); +} + +static int test_stats(void) { struct crypto_testsuite_params *ts_params = &testsuite_params; @@ -7102,6 +7567,86 @@ struct test_crypto_vector { TEST_CASE_ST(ut_setup, ut_teardown, test_mb_AES_GCM_authenticated_decryption_test_case_7), + /** AES GCM 
Authenticated Encryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_7), + + /** AES GCM Authenticated Decryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_7), + + /** AES GCM Authenticated Encryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_2), + + /** AES GCM Authenticated Decryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_2), + + /** AES GMAC Authentication */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_4), + + /** Negative tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_data_corrupt), + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_tag_corrupt), + + /** Out of place tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_oop), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_oop), + + /** Session-less tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_sessionless), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_sessionless), + + /** Scatter-Gather */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GCM_auth_encrypt_SGL_out_of_place_400B_1seg), + TEST_CASES_END() /**< NULL terminate unit test array */ } }; diff --git a/app/test/test_cryptodev_gcm_test_vectors.h b/app/test/test_cryptodev_gcm_test_vectors.h index 45ea3d4..81b4755 100644 --- a/app/test/test_cryptodev_gcm_test_vectors.h +++ b/app/test/test_cryptodev_gcm_test_vectors.h @@ -33,7 +33,17 @@ #ifndef TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ #define TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ -#define 
GMAC_LARGE_PLAINTEXT_LENGTH 65376 +#define GMAC_LARGE_PLAINTEXT_LENGTH 65344 +#define GMC_MAX_AAD_LENGTH 65536 +#define GMC_LARGE_AAD_LENGTH 65296 + +static uint8_t gcm_aad_zero_text[GMC_MAX_AAD_LENGTH] = { 0 }; + +static uint8_t gcm_aad_text[GMC_MAX_AAD_LENGTH] = { + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x97, + 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e, 0x1f }; struct gcm_test_data { @@ -48,7 +58,7 @@ struct gcm_test_data { } iv; struct { - uint8_t data[64]; + uint8_t *data; unsigned len; } aad; @@ -111,7 +121,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -148,7 +158,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -186,7 +196,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -238,8 +248,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .data = gcm_aad_zero_text, .len = 8 }, .plaintext = { @@ -294,8 +303,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef }, + .data = gcm_aad_text, .len = 8 }, .plaintext = { @@ -351,10 +359,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, - 0x00, 0x00, 0x00, 0x00 - }, + .data = gcm_aad_zero_text, .len = 12 }, .plaintext = { @@ -409,10 +414,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -466,10 +468,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -1003,6 +1002,450 @@ struct gmac_test_data { } }; +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_1 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { 0x00 }, + .len = 0 + }, + .ciphertext = { + .data = { 0x00 }, + .len = 0 + }, + .auth_tag = { + .data = { + 0x53, 0x0F, 0x8A, 0xFB, 0xC7, 0x45, 0x36, 0xB9, + 0xA9, 0x63, 0xB4, 0xF1, 0xC4, 0xCB, 0x73, 0x8B }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_2 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 16 + }, + .ciphertext = { + .data = { + 0xCE, 0xA7, 0x40, 0x3D, 
0x4D, 0x60, 0x6B, 0x6E, + 0x07, 0x4E, 0xC5, 0xD3, 0xBA, 0xF3, 0x9D, 0x18 }, + .len = 16 + }, + .auth_tag = { + .data = { + 0xD0, 0xD1, 0xC8, 0xA7, 0x99, 0x99, 0x6B, 0xF0, + 0x26, 0x5B, 0x98, 0xB5, 0xD4, 0x8A, 0xB9, 0x19 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_3 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E }, + .len = 64 + }, + .auth_tag = { + .data = { + 0x64, 0xAF, 0x1D, 0xFB, 0xE8, 0x0D, 0x37, 0xD8, + 0x92, 0xC3, 0xB9, 0x1D, 0xD3, 0x08, 0xAB, 0xFC }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_4 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x63, 0x16, 0x91, 0xAE, 0x17, 0x05, 0x5E, 0xA6, + 0x6D, 0x0A, 0x51, 0xE2, 0x50, 0x21, 0x85, 0x4A }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_5 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 
0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0xA7, 0x99, 0xAC, 0xB8, 0x27, 0xDA, 0xB1, 0x82, + 0x79, 0xFD, 0x83, 0x73, 0x52, 0x4D, 0xDB, 0xF1 }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_6 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x5D, 0xA5, 0x0E, 0x53, 0x64, 0x7F, 0x3F, 0xAE, + 0x1A, 0x1F, 0xC0, 0xB0, 0xD8, 0xBE, 0xF2, 0x64 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_7 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 
0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x4E, 0xD0, 0x91, 0x95, 0x83, 0xA9, 0x38, 0x72, + 0x09, 0xA9, 0xCE, 0x5F, 0x89, 0x06, 0x4E, 0xC8 }, + .len = 16 + } +}; + +/** variable AAD AES-128 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_1 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08 }, + .len = 16 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GMC_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x42, 0x83, 0x1E, 0xC2, 0x21, 0x77, 0x74, 0x24, + 0x4B, 0x72, 0x21, 0xB7, 0x84, 0xD0, 0xD4, 0x9C, + 0xE3, 0xAA, 0x21, 0x2F, 0x2C, 0x02, 0xA4, 0xE0, + 0x35, 0xC1, 0x7E, 0x23, 0x29, 0xAC, 0xA1, 0x2E, + 0x21, 0xD5, 0x14, 0xB2, 0x54, 0x66, 0x93, 0x1C, + 0x7D, 0x8F, 0x6A, 0x5A, 0xAC, 0x84, 0xAA, 0x05, + 0x1B, 0xA3, 0x0B, 0x39, 0x6A, 0x0A, 0xAC, 0x97, + 0x3D, 0x58, 0xE0, 0x91, 0x47, 0x3F, 0x59, 0x85 + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xCA, 0x70, 0xAF, 0x96, 0xA8, 0x5D, 0x40, 0x47, + 0x0C, 0x3C, 0x48, 0xF5, 0xF0, 0xF5, 0xA5, 0x7D + }, + .len = 16 + } +}; + +/** variable AAD AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_2 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GMC_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 
0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xBA, 0x06, 0xDA, 0xA1, 0x91, 0xE1, 0xFE, 0x22, + 0x59, 0xDA, 0x67, 0xAF, 0x9D, 0xA5, 0x43, 0x94 + }, + .len = 16 + } +}; + /** GMAC Test Vectors */ static uint8_t gmac_plaintext[GMAC_LARGE_PLAINTEXT_LENGTH] = { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, @@ -1781,8 +2224,8 @@ struct cryptodev_perf_test_data { }, .gmac_tag = { .data = { - 0x88, 0x82, 0xb4, 0x93, 0x8f, 0x04, 0xcd, 0x06, - 0xfd, 0xac, 0x6d, 0x8b, 0x9c, 0x9e, 0x8f, 0xec + 0x3f, 0x07, 0xcb, 0xb9, 0x86, 0x3a, 0xea, 0xc2, + 0x2f, 0x3a, 0x2a, 0x93, 0xd8, 0x09, 0x6b, 0xda }, .len = 16 } @@ -1802,7 +2245,7 @@ struct cryptodev_perf_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { diff --git a/doc/guides/cryptodevs/aesni_gcm.rst b/doc/guides/cryptodevs/aesni_gcm.rst index 04bf43c..184b71c 100644 --- a/doc/guides/cryptodevs/aesni_gcm.rst +++ b/doc/guides/cryptodevs/aesni_gcm.rst @@ -32,10 +32,8 @@ AES-NI GCM Crypto Poll Mode Driver The AES-NI GCM PMD (**librte_pmd_aesni_gcm**) provides poll mode crypto driver -support for utilizing Intel multi buffer library (see AES-NI Multi-buffer PMD documentation -to learn more about it, including installation). - -The AES-NI GCM PMD has current only been tested on Fedora 21 64-bit with gcc. +support for utilizing Intel ISA-L crypto library, which provides operation acceleration +through the AES-NI instruction sets for AES-GCM authenticated cipher algorithm. Features -------- @@ -49,16 +47,21 @@ Cipher algorithms: Authentication algorithms: * RTE_CRYPTO_AUTH_AES_GCM +* RTE_CRYPTO_AUTH_AES_GMAC + +Installation +------------ + +To build DPDK with the AESNI_GCM_PMD the user is required to install +the ``libisal_crypto`` library in the build environment. +For download and more details please visit `<https://github.com/01org/isa-l_crypto>`_. Initialization -------------- In order to enable this virtual crypto PMD, user must: -* Export the environmental variable AESNI_MULTI_BUFFER_LIB_PATH with the path where - the library was extracted. - -* Build the multi buffer library (go to Installation section in AES-NI MB PMD documentation). +* Install the ISA-L crypto library (explained in Installation section). * Set CONFIG_RTE_LIBRTE_PMD_AESNI_GCM=y in config/common_base. @@ -86,9 +89,7 @@ Example: Limitations ----------- -* Chained mbufs are not supported. +* Chained mbufs are supported but only out-of-place (destination mbuf must be contiguous). * Hash only is not supported. * Cipher only is not supported. -* Only in-place is currently supported (destination address is the same as source address). -* Only supports session-oriented API implementation (session-less APIs are not supported). * Not performance tuned. diff --git a/doc/guides/rel_notes/release_17_02.rst b/doc/guides/rel_notes/release_17_02.rst index 5ab7019..3b982b2 100644 --- a/doc/guides/rel_notes/release_17_02.rst +++ b/doc/guides/rel_notes/release_17_02.rst @@ -67,6 +67,18 @@ New Features * Support for single operations (cipher only and authentication only). +* **Updated the AES-NI GCM PMD.** + + The AES-NI GCM PMD was migrated from MB library to ISA-L library. + The migration entailed the following additional support for: + + * GMAC algorithm. + * 256-bit cipher key. + * Session-less mode. 
+ * Out-of place processing + * Scatter-gatter support for chained mbufs (only out-of place and destination + mbuf must be contiguous) + Resolved Issues --------------- diff --git a/drivers/crypto/aesni_gcm/Makefile b/drivers/crypto/aesni_gcm/Makefile index 5898cae..fb17fbf 100644 --- a/drivers/crypto/aesni_gcm/Makefile +++ b/drivers/crypto/aesni_gcm/Makefile @@ -31,9 +31,6 @@ include $(RTE_SDK)/mk/rte.vars.mk ifneq ($(MAKECMDGOALS),clean) -ifeq ($(AESNI_MULTI_BUFFER_LIB_PATH),) -$(error "Please define AESNI_MULTI_BUFFER_LIB_PATH environment variable") -endif endif # library name @@ -50,10 +47,7 @@ LIBABIVER := 1 EXPORT_MAP := rte_pmd_aesni_gcm_version.map # external library dependencies -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH) -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH)/include -LDLIBS += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -LDLIBS += -lcrypto +LDLIBS += -lisal_crypto # library source files SRCS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += aesni_gcm_pmd.c diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h index c399068..e9de654 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h @@ -37,91 +37,26 @@ #define LINUX #endif -#include <gcm_defines.h> -#include <aux_funcs.h> +#include <isa-l_crypto/aes_gcm.h> -/** Supported vector modes */ -enum aesni_gcm_vector_mode { - RTE_AESNI_GCM_NOT_SUPPORTED = 0, - RTE_AESNI_GCM_SSE, - RTE_AESNI_GCM_AVX, - RTE_AESNI_GCM_AVX2 -}; - -typedef void (*aes_keyexp_128_enc_t)(void *key, void *enc_exp_keys); +typedef void (*aesni_gcm_init_t)(struct gcm_data *my_ctx_data, + uint8_t *iv, + uint8_t const *aad, + uint64_t aad_len); -typedef void (*aesni_gcm_t)(gcm_data *my_ctx_data, u8 *out, const u8 *in, - u64 plaintext_len, u8 *iv, const u8 *aad, u64 aad_len, - u8 *auth_tag, u64 auth_tag_len); +typedef void (*aesni_gcm_update_t)(struct gcm_data *my_ctx_data, + uint8_t *out, + const uint8_t *in, + uint64_t plaintext_len); -typedef void (*aesni_gcm_precomp_t)(gcm_data *my_ctx_data, u8 *hash_subkey); +typedef void (*aesni_gcm_finalize_t)(struct gcm_data *my_ctx_data, + uint8_t *auth_tag, + uint64_t auth_tag_len); -/** GCM library function pointer table */ struct aesni_gcm_ops { - struct { - struct { - aes_keyexp_128_enc_t aes128_enc; - /**< AES128 enc key expansion */ - } keyexp; - /**< Key expansion functions */ - } aux; /**< Auxiliary functions */ - - struct { - aesni_gcm_t enc; /**< GCM encode function pointer */ - aesni_gcm_t dec; /**< GCM decode function pointer */ - aesni_gcm_precomp_t precomp; /**< GCM pre-compute */ - } gcm; /**< GCM functions */ + aesni_gcm_init_t init; + aesni_gcm_update_t update; + aesni_gcm_finalize_t finalize; }; - -static const struct aesni_gcm_ops gcm_ops[] = { - [RTE_AESNI_GCM_NOT_SUPPORTED] = { - .aux = { - .keyexp = { - NULL - } - }, - .gcm = { - NULL - } - }, - [RTE_AESNI_GCM_SSE] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_sse - } - }, - .gcm = { - aesni_gcm_enc_sse, - aesni_gcm_dec_sse, - aesni_gcm_precomp_sse - } - }, - [RTE_AESNI_GCM_AVX] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen2, - aesni_gcm_dec_avx_gen2, - aesni_gcm_precomp_avx_gen2 - } - }, - [RTE_AESNI_GCM_AVX2] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx2, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen4, - aesni_gcm_dec_avx_gen4, - aesni_gcm_precomp_avx_gen4 - } - } -}; - - #endif /* _AESNI_GCM_OPS_H_ */ diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c index 
5af22f7..8b2d792 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c @@ -30,8 +30,6 @@ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ -#include <openssl/aes.h> - #include <rte_common.h> #include <rte_config.h> #include <rte_hexdump.h> @@ -44,6 +42,34 @@ #include "aesni_gcm_pmd_private.h" +/** GCM encode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_enc[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_enc_update, + aesni_gcm128_enc_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_enc_update, + aesni_gcm256_enc_finalize + } +}; + +/** GCM decode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_dec[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_dec_update, + aesni_gcm128_dec_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_dec_update, + aesni_gcm256_dec_finalize + } +}; + /** * Global static parameter used to create a unique name for each AES-NI multi * buffer crypto device. @@ -65,112 +91,68 @@ return 0; } -static int -aesni_gcm_calculate_hash_sub_key(uint8_t *hsubkey, unsigned hsubkey_length, - uint8_t *aeskey, unsigned aeskey_length) -{ - uint8_t key[aeskey_length] __rte_aligned(16); - AES_KEY enc_key; - - if (hsubkey_length % 16 != 0 && aeskey_length % 16 != 0) - return -EFAULT; - - memcpy(key, aeskey, aeskey_length); - - if (AES_set_encrypt_key(key, aeskey_length << 3, &enc_key) != 0) - return -EFAULT; - - AES_encrypt(hsubkey, hsubkey, &enc_key); - - return 0; -} - -/** Get xform chain order */ -static int -aesni_gcm_get_mode(const struct rte_crypto_sym_xform *xform) -{ - /* - * GCM only supports authenticated encryption or authenticated - * decryption, all other options are invalid, so we must have exactly - * 2 xform structs chained together - */ - if (xform->next == NULL || xform->next->next != NULL) - return -1; - - if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && - xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { - return AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; - } - - if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && - xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { - return AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - } - - return -1; -} - /** Parse crypto xform chain and set private session parameters */ int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *gcm_ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform) { - const struct rte_crypto_sym_xform *auth_xform = NULL; - const struct rte_crypto_sym_xform *cipher_xform = NULL; + const struct rte_crypto_sym_xform *auth_xform; + const struct rte_crypto_sym_xform *cipher_xform; - uint8_t hsubkey[16] __rte_aligned(16) = { 0 }; - - /* Select Crypto operation - hash then cipher / cipher then hash */ - switch (aesni_gcm_get_mode(xform)) { - case AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + if (xform->next == NULL || xform->next->next != NULL) { + GCM_LOG_ERR("Two and only two chained xform required"); + return -EINVAL; + } - cipher_xform = xform; + if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && + xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { auth_xform = xform->next; - break; - case AESNI_GCM_OP_AUTHENTICATED_DECRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - + cipher_xform = xform; + } else if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && + 
xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { auth_xform = xform; cipher_xform = xform->next; - break; - default: - GCM_LOG_ERR("Unsupported operation chain order parameter"); + } else { + GCM_LOG_ERR("Cipher and auth xform required"); return -EINVAL; } - /* We only support AES GCM */ - if (cipher_xform->cipher.algo != RTE_CRYPTO_CIPHER_AES_GCM && - auth_xform->auth.algo != RTE_CRYPTO_AUTH_AES_GCM) + if (!(cipher_xform->cipher.algo == RTE_CRYPTO_CIPHER_AES_GCM && + (auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GCM || + auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GMAC))) { + GCM_LOG_ERR("We only support AES GCM and AES GMAC"); return -EINVAL; + } - /* Select cipher direction */ - if (sess->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_ENCRYPT) { - GCM_LOG_ERR("xform chain (CIPHER/AUTH) and cipher operation " - "(DECRYPT) specified are an invalid selection"); - return -EINVAL; - } else if (sess->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_DECRYPT) { - GCM_LOG_ERR("xform chain (AUTH/CIPHER) and cipher operation " - "(ENCRYPT) specified are an invalid selection"); + /* Select Crypto operation */ + if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_ENCRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_GENERATE) + sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + else if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_DECRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_VERIFY) + sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; + else { + GCM_LOG_ERR("Cipher/Auth operations: Encrypt/Generate or" + " Decrypt/Verify are valid only"); return -EINVAL; } - /* Expand GCM AES128 key */ - (*gcm_ops->aux.keyexp.aes128_enc)(cipher_xform->cipher.key.data, - sess->gdata.expanded_keys); + /* Check key length and calculate GCM pre-compute. */ + switch (cipher_xform->cipher.key.length) { + case 16: + aesni_gcm128_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_128; - /* Calculate hash sub key here */ - aesni_gcm_calculate_hash_sub_key(hsubkey, sizeof(hsubkey), - cipher_xform->cipher.key.data, - cipher_xform->cipher.key.length); + break; + case 32: + aesni_gcm256_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_256; - /* Calculate GCM pre-compute */ - (*gcm_ops->gcm.precomp)(&sess->gdata, hsubkey); + break; + default: + GCM_LOG_ERR("Unsupported cipher key length"); + return -EINVAL; + } return 0; } @@ -194,10 +176,10 @@ return sess; sess = (struct aesni_gcm_session *) - ((struct rte_cryptodev_session *)_sess)->_private; + ((struct rte_cryptodev_sym_session *)_sess)->_private; - if (unlikely(aesni_gcm_set_session_parameters(qp->ops, - sess, op->xform) != 0)) { + if (unlikely(aesni_gcm_set_session_parameters(sess, + op->xform) != 0)) { rte_mempool_put(qp->sess_mp, _sess); sess = NULL; } @@ -217,19 +199,45 @@ * */ static int -process_gcm_crypto_op(struct aesni_gcm_qp *qp, struct rte_crypto_sym_op *op, +process_gcm_crypto_op(struct rte_crypto_sym_op *op, struct aesni_gcm_session *session) { uint8_t *src, *dst; - struct rte_mbuf *m = op->m_src; + struct rte_mbuf *m_src = op->m_src; + uint32_t offset = op->cipher.data.offset; + uint32_t part_len, total_len, data_len; + + RTE_ASSERT(m_src != NULL); + + while (offset >= m_src->data_len) { + offset -= m_src->data_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + } + + data_len = m_src->data_len - offset; + part_len = (data_len < op->cipher.data.length) ? 
data_len : + op->cipher.data.length; + + /* Destination buffer is required when segmented source buffer */ + RTE_ASSERT((part_len == op->cipher.data.length) || + ((part_len != op->cipher.data.length) && + (op->m_dst != NULL))); + /* Segmented destination buffer is not supported */ + RTE_ASSERT((op->m_dst == NULL) || + ((op->m_dst != NULL) && + rte_pktmbuf_is_contiguous(op->m_dst))); + - src = rte_pktmbuf_mtod(m, uint8_t *) + op->cipher.data.offset; dst = op->m_dst ? rte_pktmbuf_mtod_offset(op->m_dst, uint8_t *, op->cipher.data.offset) : - rte_pktmbuf_mtod_offset(m, uint8_t *, + rte_pktmbuf_mtod_offset(op->m_src, uint8_t *, op->cipher.data.offset); + src = rte_pktmbuf_mtod_offset(m_src, uint8_t *, offset); + /* sanity checks */ if (op->cipher.iv.length != 16 && op->cipher.iv.length != 12 && op->cipher.iv.length != 0) { @@ -246,48 +254,81 @@ *iv_padd = rte_bswap32(1); } - if (op->auth.aad.length != 12 && op->auth.aad.length != 8 && - op->auth.aad.length != 0) { - GCM_LOG_ERR("iv"); - return -1; - } - if (op->auth.digest.length != 16 && op->auth.digest.length != 12 && - op->auth.digest.length != 8 && - op->auth.digest.length != 0) { - GCM_LOG_ERR("iv"); + op->auth.digest.length != 8) { + GCM_LOG_ERR("digest"); return -1; } if (session->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION) { - (*qp->ops->gcm.enc)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_enc[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_enc[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? + m_src->data_len : total_len; + + aesni_gcm_enc[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_enc[session->key].finalize(&session->gdata, op->auth.digest.data, (uint64_t)op->auth.digest.length); - } else if (session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION) { - uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(m, + } else { /* session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION */ + uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(op->m_dst ? + op->m_dst : op->m_src, op->auth.digest.length); if (!auth_tag) { - GCM_LOG_ERR("iv"); + GCM_LOG_ERR("auth_tag"); return -1; } - (*qp->ops->gcm.dec)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_dec[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_dec[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? 
+ m_src->data_len : total_len; + + aesni_gcm_dec[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_dec[session->key].finalize(&session->gdata, auth_tag, (uint64_t)op->auth.digest.length); - } else { - GCM_LOG_ERR("iv"); - return -1; } return 0; @@ -377,21 +418,7 @@ break; } -#ifdef RTE_LIBRTE_PMD_AESNI_GCM_DEBUG - if (!rte_pktmbuf_is_contiguous(ops[i]->sym->m_src) || - (ops[i]->sym->m_dst != NULL && - !rte_pktmbuf_is_contiguous( - ops[i]->sym->m_dst))) { - ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; - GCM_LOG_ERR("PMD supports only contiguous mbufs, " - "op (%p) provides noncontiguous mbuf as " - "source/destination buffer.\n", ops[i]); - qp->qp_stats.enqueue_err_count++; - break; - } -#endif - - retval = process_gcm_crypto_op(qp, ops[i]->sym, sess); + retval = process_gcm_crypto_op(ops[i]->sym, sess); if (retval < 0) { ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; qp->qp_stats.enqueue_err_count++; @@ -429,7 +456,6 @@ struct rte_cryptodev *dev; char crypto_dev_name[RTE_CRYPTODEV_NAME_MAX_LEN]; struct aesni_gcm_private *internals; - enum aesni_gcm_vector_mode vector_mode; /* Check CPU for support for AES instruction set */ if (!rte_cpu_get_flag_enabled(RTE_CPUFLAG_AES)) { @@ -437,18 +463,6 @@ return -EFAULT; } - /* Check CPU for supported vector instruction set */ - if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX2)) - vector_mode = RTE_AESNI_GCM_AVX2; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX)) - vector_mode = RTE_AESNI_GCM_AVX; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_SSE4_1)) - vector_mode = RTE_AESNI_GCM_SSE; - else { - GCM_LOG_ERR("Vector instructions are not supported by CPU"); - return -EFAULT; - } - /* create a unique device name */ if (create_unique_device_name(crypto_dev_name, RTE_CRYPTODEV_NAME_MAX_LEN) != 0) { @@ -473,27 +487,11 @@ dev->feature_flags = RTE_CRYPTODEV_FF_SYMMETRIC_CRYPTO | RTE_CRYPTODEV_FF_SYM_OPERATION_CHAINING | - RTE_CRYPTODEV_FF_CPU_AESNI; + RTE_CRYPTODEV_FF_CPU_AESNI | + RTE_CRYPTODEV_FF_MBUF_SCATTER_GATHER; - switch (vector_mode) { - case RTE_AESNI_GCM_SSE: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_SSE; - break; - case RTE_AESNI_GCM_AVX: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX; - break; - case RTE_AESNI_GCM_AVX2: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX2; - break; - default: - break; - } - - /* Set vector instructions mode supported */ internals = dev->data->dev_private; - internals->vector_mode = vector_mode; - internals->max_nb_queue_pairs = init_params->max_nb_queue_pairs; internals->max_nb_sessions = init_params->max_nb_sessions; diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c index c51f82a..2362006 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c @@ -39,17 +39,17 @@ #include "aesni_gcm_pmd_private.h" static const struct rte_cryptodev_capabilities aesni_gcm_pmd_capabilities[] = { - { /* AES GCM (AUTH) */ + { /* AES GMAC (AUTH) */ .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, {.sym = { .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, {.auth = { - .algo = RTE_CRYPTO_AUTH_AES_GCM, + .algo = RTE_CRYPTO_AUTH_AES_GMAC, .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .digest_size = { .min = 8, @@ -57,9 +57,34 @@ .increment = 4 }, .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 + } + }, } + }, } + }, + { /* AES GCM (AUTH) */ + .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, + {.sym = { + 
.xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, + {.auth = { + .algo = RTE_CRYPTO_AUTH_AES_GCM, + .block_size = 16, + .key_size = { + .min = 16, + .max = 32, + .increment = 16 + }, + .digest_size = { .min = 8, - .max = 12, + .max = 16, .increment = 4 + }, + .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 } }, } }, } @@ -73,8 +98,8 @@ .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .iv_size = { .min = 12, @@ -221,7 +246,6 @@ int socket_id) { struct aesni_gcm_qp *qp = NULL; - struct aesni_gcm_private *internals = dev->data->dev_private; /* Free memory prior to re-allocation if needed. */ if (dev->data->queue_pairs[qp_id] != NULL) @@ -239,8 +263,6 @@ if (aesni_gcm_pmd_qp_set_unique_name(dev, qp)) goto qp_setup_cleanup; - qp->ops = &gcm_ops[internals->vector_mode]; - qp->processed_pkts = aesni_gcm_pmd_qp_create_processed_pkts_ring(qp, qp_conf->nb_descriptors, socket_id); if (qp->processed_pkts == NULL) @@ -291,18 +313,15 @@ /** Configure a aesni gcm session from a crypto xform chain */ static void * -aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev, +aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev __rte_unused, struct rte_crypto_sym_xform *xform, void *sess) { - struct aesni_gcm_private *internals = dev->data->dev_private; - if (unlikely(sess == NULL)) { GCM_LOG_ERR("invalid session struct"); return NULL; } - if (aesni_gcm_set_session_parameters(&gcm_ops[internals->vector_mode], - sess, xform) != 0) { + if (aesni_gcm_set_session_parameters(sess, xform) != 0) { GCM_LOG_ERR("failed configure session parameters"); return NULL; } diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h index 9878d6e..0496b44 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h @@ -58,8 +58,6 @@ /** private data structure for each virtual AESNI GCM device */ struct aesni_gcm_private { - enum aesni_gcm_vector_mode vector_mode; - /**< Vector mode */ unsigned max_nb_queue_pairs; /**< Max number of queue pairs supported by device */ unsigned max_nb_sessions; @@ -71,8 +69,6 @@ struct aesni_gcm_qp { /**< Queue Pair Identifier */ char name[RTE_CRYPTODEV_NAME_LEN]; /**< Unique Queue Pair Name */ - const struct aesni_gcm_ops *ops; - /**< Architecture dependent function pointer table of the gcm APIs */ struct rte_ring *processed_pkts; /**< Ring for placing process packets */ struct rte_mempool *sess_mp; @@ -87,10 +83,17 @@ enum aesni_gcm_operation { AESNI_GCM_OP_AUTHENTICATED_DECRYPTION }; +enum aesni_gcm_key { + AESNI_GCM_KEY_128, + AESNI_GCM_KEY_256 +}; + /** AESNI GCM private session structure */ struct aesni_gcm_session { enum aesni_gcm_operation op; /**< GCM operation type */ + enum aesni_gcm_key key; + /**< GCM key type */ struct gcm_data gdata __rte_cache_aligned; /**< GCM parameters */ }; @@ -98,7 +101,6 @@ struct aesni_gcm_session { /** * Setup GCM session parameters - * @param ops gcm ops function pointer table * @param sess aesni gcm session structure * @param xform crypto transform chain * @@ -107,8 +109,7 @@ struct aesni_gcm_session { * - On failure returns error code < 0 */ extern int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform); diff --git a/mk/rte.app.mk b/mk/rte.app.mk index f75f0e2..ed3eab5 100644 --- a/mk/rte.app.mk +++ b/mk/rte.app.mk @@ -134,8 
+134,7 @@ _LDLIBS-$(CONFIG_RTE_LIBRTE_VMXNET3_PMD) += -lrte_pmd_vmxnet3_uio ifeq ($(CONFIG_RTE_LIBRTE_CRYPTODEV),y) _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -lrte_pmd_aesni_mb _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lcrypto -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB +_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lisal_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_OPENSSL) += -lrte_pmd_openssl -lcrypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_NULL_CRYPTO) += -lrte_pmd_null_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_QAT) += -lrte_pmd_qat -lcrypto diff --git a/scripts/test-build.sh b/scripts/test-build.sh index a979309..d0c1a34 100755 --- a/scripts/test-build.sh +++ b/scripts/test-build.sh @@ -44,6 +44,7 @@ default_path=$PATH # - DPDK_DEP_SSL (y/[n]) # - DPDK_DEP_SZE (y/[n]) # - DPDK_DEP_ZLIB (y/[n]) +# - DPDK_DEP_ISAL (y/[n]) # - DPDK_MAKE_JOBS (int) # - DPDK_NOTIFY (notify-send) # - LIBSSO_SNOW3G_PATH @@ -126,6 +127,7 @@ reset_env () unset DPDK_DEP_SSL unset DPDK_DEP_SZE unset DPDK_DEP_ZLIB + unset DPDK_DEP_ISAL unset AESNI_MULTI_BUFFER_LIB_PATH unset LIBSSO_SNOW3G_PATH unset LIBSSO_KASUMI_PATH @@ -176,7 +178,7 @@ config () # <directory> <target> <options> sed -ri 's,(PCAP=)n,\1y,' $1/.config test -z "$AESNI_MULTI_BUFFER_LIB_PATH" || \ sed -ri 's,(PMD_AESNI_MB=)n,\1y,' $1/.config - test -z "$AESNI_MULTI_BUFFER_LIB_PATH" || \ + test "$DPDK_DEP_ISAL" != y || \ sed -ri 's,(PMD_AESNI_GCM=)n,\1y,' $1/.config test -z "$LIBSSO_SNOW3G_PATH" || \ sed -ri 's,(PMD_SNOW3G=)n,\1y,' $1/.config -- 1.7.9.5 ^ permalink raw reply [flat|nested] 15+ messages in thread
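
For reference, the ISA-L call pattern that the reworked PMD is built around (one init per operation, one update per contiguous segment, finalize to produce the tag) looks roughly as follows. This is a minimal sketch assuming the isa-l_crypto header and the aesni_gcm128_* entry points named in the patch above; it is not part of the patch itself, and exact prototypes may vary between ISA-L releases.

#include <stdint.h>
#include <isa-l_crypto/aes_gcm.h>

/*
 * Sketch of AES-128-GCM encryption over a segmented source buffer,
 * mirroring the init/update/finalize flow used in process_gcm_crypto_op().
 * The output buffer is written sequentially, which is why the PMD requires
 * a contiguous destination when the source mbuf is chained.
 */
static void
gcm128_encrypt_segments(uint8_t *key,
		uint8_t *iv, const uint8_t *aad, uint64_t aad_len,
		const uint8_t **seg, const uint64_t *seg_len, int nb_segs,
		uint8_t *dst, uint8_t *tag, uint64_t tag_len)
{
	struct gcm_data gdata;
	int i;

	/* Per-session: expand the key and pre-compute the hash subkey */
	aesni_gcm128_pre(key, &gdata);

	/* Per-operation: start the stream with the IV and the AAD */
	aesni_gcm128_init(&gdata, iv, aad, aad_len);

	/* One update call per contiguous source segment */
	for (i = 0; i < nb_segs; i++) {
		aesni_gcm128_enc_update(&gdata, dst, seg[i], seg_len[i]);
		dst += seg_len[i];
	}

	/* Produce the authentication tag */
	aesni_gcm128_enc_finalize(&gdata, tag, tag_len);
}
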
* Re: [dpdk-dev] [PATCH v4] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-05 13:51 ` [dpdk-dev] [PATCH v4] " Piotr Azarewicz @ 2017-01-05 15:12 ` Thomas Monjalon 2017-01-16 12:37 ` [dpdk-dev] [PATCH v5] " Piotr Azarewicz 1 sibling, 0 replies; 15+ messages in thread From: Thomas Monjalon @ 2017-01-05 15:12 UTC (permalink / raw) To: Piotr Azarewicz; +Cc: dev, pablo.de.lara.guarch 2017-01-05 14:51, Piotr Azarewicz: > Current Cryptodev AES-NI GCM PMD is implemented using Multi Buffer > Crypto library.This patch reimplement the device using ISA-L Crypto > library: https://github.com/01org/isa-l_crypto. > > The migration entailed the following additional support for: > * GMAC algorithm. > * 256-bit cipher key. > * Session-less mode. > * Out-of place processing > * Scatter-gatter support for chained mbufs (only out-of place and > destination mbuf must be contiguous) > > Verified current unit tests and added new unit tests to verify new > functionalities. > > PERFORMANCE COMPARISON > ---------------------- > Comparison the new and old implementation is made by running app/test > and calling cryptodev_aesni_gcm_perftest. > As we may see below, the new implementation has small performance drop > when buffer size is above 64B. I am a bit surprised that you change for lower performance. I understand that it brings new features. Is it possible to add such features without performance impact? > Old implementation with MB library: > Cipher algo: AES_GCM Cipher hash: AES_GCM ciphr key: 128b burst size: 32 > Buffer Size(B) OPS(M) Throughput(Gbps) Retries EmptyPolls > 64 4.57 2.34 0 0 > 128 4.28 4.39 0 0 > 256 2.76 5.66 0 0 > 512 1.60 6.56 0 0 > 1024 0.90 7.34 0 0 > 1536 0.62 7.66 0 0 > 2048 0.48 7.84 0 0 > > New implementation with ISA-L library: > Cipher algo: AES_GCM Cipher hash: AES_GCM ciphr key: 128b burst size: 32 > Buffer Size(B) OPS(M) Throughput(Gbps) Retries EmptyPolls > 64 4.62 2.37 0 0 > 128 4.06 4.16 0 0 > 256 2.65 5.44 0 0 > 512 1.57 6.45 0 0 > 1024 0.89 7.26 0 0 > 1536 0.62 7.58 0 0 > 2048 0.47 7.77 0 0 > ^ permalink raw reply [flat|nested] 15+ messages in thread
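
The scatter-gather case referred to in the commit message (chained source mbufs, out-of-place only, contiguous destination) translates into the following kind of cryptodev setup. This is a rough usage sketch against the mbuf/cryptodev APIs already used by the unit tests in this thread, not code taken from the patch; pool names, the session pointer and the data sizes are placeholders, and the IV/AAD/digest setup is omitted (it is done exactly as in create_gcm_operation() above).

#include <rte_mbuf.h>
#include <rte_mempool.h>
#include <rte_crypto.h>
#include <rte_cryptodev.h>

/*
 * Build a GCM operation with a two-segment source and a single contiguous
 * out-of-place destination, as required by the reworked PMD for chained mbufs.
 */
static int
build_sgl_oop_gcm_op(struct rte_mempool *mb_pool,
		struct rte_mempool *op_pool,
		struct rte_cryptodev_sym_session *sess,
		struct rte_crypto_op **out_op)
{
	struct rte_mbuf *src1 = rte_pktmbuf_alloc(mb_pool);
	struct rte_mbuf *src2 = rte_pktmbuf_alloc(mb_pool);
	struct rte_mbuf *dst = rte_pktmbuf_alloc(mb_pool);
	struct rte_crypto_op *op = rte_crypto_op_alloc(op_pool,
			RTE_CRYPTO_OP_TYPE_SYMMETRIC);

	if (src1 == NULL || src2 == NULL || dst == NULL || op == NULL)
		return -1;

	/* 200 B of plaintext in each source segment, then chain them */
	rte_pktmbuf_append(src1, 200);
	rte_pktmbuf_append(src2, 200);
	rte_pktmbuf_chain(src1, src2);

	/* Destination must be one contiguous segment large enough for the
	 * whole 400 B ciphertext plus a 16 B digest appended at the end. */
	rte_pktmbuf_append(dst, 400 + 16);

	rte_crypto_op_attach_sym_session(op, sess);
	op->sym->m_src = src1;	/* segmented source is accepted */
	op->sym->m_dst = dst;	/* out-of-place, contiguous destination */

	/* cipher/auth offsets, IV, AAD and digest are filled in as in the
	 * unit tests above; only the payload lengths are shown here. */
	op->sym->cipher.data.length = 400;
	op->sym->auth.data.length = 400;

	*out_op = op;
	return 0;
}
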
* [dpdk-dev] [PATCH v5] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-05 13:51 ` [dpdk-dev] [PATCH v4] " Piotr Azarewicz 2017-01-05 15:12 ` Thomas Monjalon @ 2017-01-16 12:37 ` Piotr Azarewicz 2017-01-16 14:42 ` Thomas Monjalon ` (3 more replies) 1 sibling, 4 replies; 15+ messages in thread From: Piotr Azarewicz @ 2017-01-16 12:37 UTC (permalink / raw) To: pablo.de.lara.guarch, declan.doherty, dev Current Cryptodev AES-NI GCM PMD is implemented using Multi Buffer Crypto library.This patch reimplement the device using ISA-L Crypto library: https://github.com/01org/isa-l_crypto. The migration entailed the following additional support for: * GMAC algorithm. * 256-bit cipher key. * Session-less mode. * Out-of place processing * Scatter-gatter support for chained mbufs (only out-of place and destination mbuf must be contiguous) Verified current unit tests and added new unit tests to verify new functionalities. Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> --- To be applied on top of: [dpdk-dev] [PATCH v5 0/3] Chained Mbufs support in SW PMDs v5 changes: - rebase on top of dpdk-next-crypto - remove the perftest output from commit message - correction in aesni_gcm.rst - fix typo v4 changes: - rebase on top of dpdk-next-crypto - update the script test-build.sh v3 changes: - rebase on top of dpdk-next-crypto v2 changes: - implement native scatter-gatter support for chained mbufs (only out-of place and destination mbuf must be contiguous) - write unit test for session-less mode - write unit test for out-of place processing - add support for GMAC authentication algorithm app/test/test_cryptodev.c | 753 +++++++++++++++++++--- app/test/test_cryptodev_gcm_test_vectors.h | 491 +++++++++++++- devtools/test-build.sh | 4 +- doc/guides/cryptodevs/aesni_gcm.rst | 24 +- doc/guides/rel_notes/release_17_02.rst | 12 + drivers/crypto/aesni_gcm/Makefile | 8 +- drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +-- drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 324 +++++----- drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +- drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +- mk/rte.app.mk | 3 +- 11 files changed, 1363 insertions(+), 415 deletions(-) diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c index 5786fde..e8d1eae 100644 --- a/app/test/test_cryptodev.c +++ b/app/test/test_cryptodev.c @@ -4317,16 +4317,48 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +create_gcm_xforms(struct rte_crypto_op *op, + enum rte_crypto_cipher_operation cipher_op, + uint8_t *key, const uint8_t key_len, + const uint8_t aad_len, const uint8_t auth_len, + enum rte_crypto_auth_operation auth_op) +{ + TEST_ASSERT_NOT_NULL(rte_crypto_op_sym_xforms_alloc(op, 2), + "failed to allocate space for crypto transforms"); + + struct rte_crypto_sym_op *sym_op = op->sym; + + /* Setup Cipher Parameters */ + sym_op->xform->type = RTE_CRYPTO_SYM_XFORM_CIPHER; + sym_op->xform->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM; + sym_op->xform->cipher.op = cipher_op; + sym_op->xform->cipher.key.data = key; + sym_op->xform->cipher.key.length = key_len; + + TEST_HEXDUMP(stdout, "key:", key, key_len); + + /* Setup Authentication Parameters */ + sym_op->xform->next->type = RTE_CRYPTO_SYM_XFORM_AUTH; + sym_op->xform->next->auth.algo = RTE_CRYPTO_AUTH_AES_GCM; + sym_op->xform->next->auth.op = auth_op; + sym_op->xform->next->auth.digest_length = auth_len; + sym_op->xform->next->auth.add_auth_data_length = aad_len; + sym_op->xform->next->auth.key.length = 0; + 
sym_op->xform->next->auth.key.data = NULL; + sym_op->xform->next->next = NULL; + + return 0; +} + +static int create_gcm_operation(enum rte_crypto_cipher_operation op, - const uint8_t *auth_tag, const unsigned auth_tag_len, - const uint8_t *iv, const unsigned iv_len, - const uint8_t *aad, const unsigned aad_len, - const unsigned data_len, unsigned data_pad_len) + const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; - unsigned iv_pad_len = 0, aad_buffer_len; + uint8_t *plaintext, *ciphertext; + unsigned int iv_pad_len, aad_pad_len, plaintext_pad_len; /* Generate Crypto op data structure */ ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool, @@ -4336,77 +4368,118 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct rte_crypto_sym_op *sym_op = ut_params->op->sym; - if (ut_params->obuf) { - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->obuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to append digest"); - sym_op->auth.digest.phys_addr = pktmbuf_mtophys_offset( - ut_params->obuf, data_pad_len); - } else { - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->ibuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to append digest"); - sym_op->auth.digest.phys_addr = pktmbuf_mtophys_offset( - ut_params->ibuf, data_pad_len); - } - sym_op->auth.digest.length = auth_tag_len; - - if (op == RTE_CRYPTO_CIPHER_OP_DECRYPT) { - rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len); - TEST_HEXDUMP(stdout, "digest:", - sym_op->auth.digest.data, - sym_op->auth.digest.length); - } + /* Append aad data */ + aad_pad_len = RTE_ALIGN_CEIL(tdata->aad.len, 16); + sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + aad_pad_len); + TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, + "no room to append aad"); - /* iv */ - iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16); + sym_op->auth.aad.length = tdata->aad.len; + sym_op->auth.aad.phys_addr = + rte_pktmbuf_mtophys(ut_params->ibuf); + memcpy(sym_op->auth.aad.data, tdata->aad.data, tdata->aad.len); + TEST_HEXDUMP(stdout, "aad:", sym_op->auth.aad.data, + sym_op->auth.aad.length); + /* Prepend iv */ + iv_pad_len = RTE_ALIGN_CEIL(tdata->iv.len, 16); sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend( ut_params->ibuf, iv_pad_len); TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv"); memset(sym_op->cipher.iv.data, 0, iv_pad_len); sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf); - sym_op->cipher.iv.length = iv_len; + sym_op->cipher.iv.length = tdata->iv.len; - rte_memcpy(sym_op->cipher.iv.data, iv, iv_len); + rte_memcpy(sym_op->cipher.iv.data, tdata->iv.data, tdata->iv.len); + TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, + sym_op->cipher.iv.length); - /* - * Always allocate the aad up to the block size. - * The cryptodev API calls out - - * - the array must be big enough to hold the AAD, plus any - * space to round this up to the nearest multiple of the - * block size (16 bytes). 
- */ - aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16); + /* Append plaintext/ciphertext */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, "no room to append plaintext"); - sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend( - ut_params->ibuf, aad_buffer_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, - "no room to prepend aad"); - sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys( - ut_params->ibuf); - sym_op->auth.aad.length = aad_len; + memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); + TEST_HEXDUMP(stdout, "plaintext:", plaintext, + tdata->plaintext.len); - memset(sym_op->auth.aad.data, 0, aad_buffer_len); - rte_memcpy(sym_op->auth.aad.data, aad, aad_len); + if (ut_params->obuf) { + ciphertext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, iv_pad_len); - TEST_HEXDUMP(stdout, "aad:", - sym_op->auth.aad.data, aad_len); + memset(ciphertext + aad_pad_len + iv_pad_len, 0, + tdata->ciphertext.len); + } + } else { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); + ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - if (ut_params->obuf) { - rte_pktmbuf_prepend(ut_params->obuf, iv_pad_len); - rte_pktmbuf_prepend(ut_params->obuf, aad_buffer_len); + memcpy(ciphertext, tdata->ciphertext.data, + tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, + tdata->ciphertext.len); + + if (ut_params->obuf) { + plaintext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, + "no room to append plaintext"); + + memset(plaintext + aad_pad_len + iv_pad_len, 0, + tdata->plaintext.len); + } } - sym_op->cipher.data.length = data_len; - sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len; + /* Append digest data */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf ? ut_params->obuf : + ut_params->ibuf, + tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + memset(sym_op->auth.digest.data, 0, tdata->auth_tag.len); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->obuf ? 
ut_params->obuf : + ut_params->ibuf, + plaintext_pad_len + + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; + } else { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->ibuf, tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->ibuf, + plaintext_pad_len + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; - sym_op->auth.data.offset = aad_buffer_len + iv_pad_len; - sym_op->auth.data.length = data_len; + rte_memcpy(sym_op->auth.digest.data, tdata->auth_tag.data, + tdata->auth_tag.len); + TEST_HEXDUMP(stdout, "digest:", + sym_op->auth.digest.data, + sym_op->auth.digest.length); + } + + sym_op->cipher.data.length = tdata->plaintext.len; + sym_op->cipher.data.offset = aad_pad_len + iv_pad_len; + + sym_op->auth.data.length = tdata->plaintext.len; + sym_op->auth.data.offset = aad_pad_len + iv_pad_len; return 0; } @@ -4418,9 +4491,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext, *auth_tag; + uint8_t *ciphertext, *auth_tag; uint16_t plaintext_pad_len; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4431,31 +4504,20 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GCM_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); /* clear mbuf payload */ memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - /* - * Append data which is padded to a multiple - * of the algorithms block size - */ - plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); - - plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - plaintext_pad_len); - memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); - - TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->plaintext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->plaintext.len, plaintext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); if (retval < 0) return retval; @@ -4470,14 +4532,18 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, "crypto op processing failed"); + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + if (ut_params->op->sym->m_dst) { ciphertext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); auth_tag = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_dst, uint8_t *, plaintext_pad_len); } else { - ciphertext = plaintext; - auth_tag = plaintext + plaintext_pad_len; + ciphertext = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; } TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, 
tdata->ciphertext.len); @@ -4543,15 +4609,68 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_encryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_2); +} + +static int test_mb_AES_GCM_authenticated_decryption(const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext; - uint16_t ciphertext_pad_len; + uint8_t *plaintext; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4562,31 +4681,23 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - /* alloc mbuf and set payload */ - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GCM_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); - - ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - ciphertext_pad_len); - memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len); - - TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->ciphertext.len, ciphertext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); if (retval < 0) return retval; - rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); ut_params->op->sym->m_src = ut_params->ibuf; @@ -4602,7 +4713,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) plaintext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); else - plaintext = ciphertext; + plaintext = 
rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); @@ -4662,6 +4775,358 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_decryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_2); +} + +static int +test_AES_GCM_authenticated_encryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + 
tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_oop(void) +{ + return test_AES_GCM_authenticated_encryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_DECRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_oop(void) +{ + return test_AES_GCM_authenticated_decryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_encryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + uint8_t key[tdata->key.len + 1]; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto 
op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_sessionless(void) +{ + return test_AES_GCM_authenticated_encryption_sessionless( + &gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + uint8_t key[tdata->key.len + 1]; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_DECRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_sessionless(void) +{ + return test_AES_GCM_authenticated_decryption_sessionless( + &gcm_test_case_5); +} + +static int test_stats(void) { struct crypto_testsuite_params *ts_params = &testsuite_params; @@ -7102,6 +7567,86 @@ struct test_crypto_vector { TEST_CASE_ST(ut_setup, ut_teardown, test_mb_AES_GCM_authenticated_decryption_test_case_7), + /** AES GCM 
Authenticated Encryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_7), + + /** AES GCM Authenticated Decryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_7), + + /** AES GCM Authenticated Encryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_2), + + /** AES GCM Authenticated Decryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_2), + + /** AES GMAC Authentication */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_4), + + /** Negative tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_data_corrupt), + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_tag_corrupt), + + /** Out of place tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_oop), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_oop), + + /** Session-less tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_sessionless), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_sessionless), + + /** Scatter-Gather */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GCM_auth_encrypt_SGL_out_of_place_400B_1seg), + TEST_CASES_END() /**< NULL terminate unit test array */ } }; diff --git a/app/test/test_cryptodev_gcm_test_vectors.h b/app/test/test_cryptodev_gcm_test_vectors.h index 45ea3d4..5764edb 100644 --- a/app/test/test_cryptodev_gcm_test_vectors.h +++ b/app/test/test_cryptodev_gcm_test_vectors.h @@ -33,7 +33,17 @@ #ifndef TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ #define TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ -#define 
GMAC_LARGE_PLAINTEXT_LENGTH 65376 +#define GMAC_LARGE_PLAINTEXT_LENGTH 65344 +#define GCM_MAX_AAD_LENGTH 65536 +#define GCM_LARGE_AAD_LENGTH 65296 + +static uint8_t gcm_aad_zero_text[GCM_MAX_AAD_LENGTH] = { 0 }; + +static uint8_t gcm_aad_text[GCM_MAX_AAD_LENGTH] = { + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x97, + 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e, 0x1f }; struct gcm_test_data { @@ -48,7 +58,7 @@ struct gcm_test_data { } iv; struct { - uint8_t data[64]; + uint8_t *data; unsigned len; } aad; @@ -111,7 +121,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -148,7 +158,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -186,7 +196,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -238,8 +248,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .data = gcm_aad_zero_text, .len = 8 }, .plaintext = { @@ -294,8 +303,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef }, + .data = gcm_aad_text, .len = 8 }, .plaintext = { @@ -351,10 +359,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, - 0x00, 0x00, 0x00, 0x00 - }, + .data = gcm_aad_zero_text, .len = 12 }, .plaintext = { @@ -409,10 +414,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -466,10 +468,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -1003,6 +1002,450 @@ struct gmac_test_data { } }; +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_1 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { 0x00 }, + .len = 0 + }, + .ciphertext = { + .data = { 0x00 }, + .len = 0 + }, + .auth_tag = { + .data = { + 0x53, 0x0F, 0x8A, 0xFB, 0xC7, 0x45, 0x36, 0xB9, + 0xA9, 0x63, 0xB4, 0xF1, 0xC4, 0xCB, 0x73, 0x8B }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_2 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 16 + }, + .ciphertext = { + .data = { + 0xCE, 0xA7, 0x40, 0x3D, 
0x4D, 0x60, 0x6B, 0x6E, + 0x07, 0x4E, 0xC5, 0xD3, 0xBA, 0xF3, 0x9D, 0x18 }, + .len = 16 + }, + .auth_tag = { + .data = { + 0xD0, 0xD1, 0xC8, 0xA7, 0x99, 0x99, 0x6B, 0xF0, + 0x26, 0x5B, 0x98, 0xB5, 0xD4, 0x8A, 0xB9, 0x19 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_3 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E }, + .len = 64 + }, + .auth_tag = { + .data = { + 0x64, 0xAF, 0x1D, 0xFB, 0xE8, 0x0D, 0x37, 0xD8, + 0x92, 0xC3, 0xB9, 0x1D, 0xD3, 0x08, 0xAB, 0xFC }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_4 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x63, 0x16, 0x91, 0xAE, 0x17, 0x05, 0x5E, 0xA6, + 0x6D, 0x0A, 0x51, 0xE2, 0x50, 0x21, 0x85, 0x4A }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_5 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 
0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0xA7, 0x99, 0xAC, 0xB8, 0x27, 0xDA, 0xB1, 0x82, + 0x79, 0xFD, 0x83, 0x73, 0x52, 0x4D, 0xDB, 0xF1 }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_6 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x5D, 0xA5, 0x0E, 0x53, 0x64, 0x7F, 0x3F, 0xAE, + 0x1A, 0x1F, 0xC0, 0xB0, 0xD8, 0xBE, 0xF2, 0x64 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_7 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 
0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x4E, 0xD0, 0x91, 0x95, 0x83, 0xA9, 0x38, 0x72, + 0x09, 0xA9, 0xCE, 0x5F, 0x89, 0x06, 0x4E, 0xC8 }, + .len = 16 + } +}; + +/** variable AAD AES-128 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_1 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08 }, + .len = 16 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GCM_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x42, 0x83, 0x1E, 0xC2, 0x21, 0x77, 0x74, 0x24, + 0x4B, 0x72, 0x21, 0xB7, 0x84, 0xD0, 0xD4, 0x9C, + 0xE3, 0xAA, 0x21, 0x2F, 0x2C, 0x02, 0xA4, 0xE0, + 0x35, 0xC1, 0x7E, 0x23, 0x29, 0xAC, 0xA1, 0x2E, + 0x21, 0xD5, 0x14, 0xB2, 0x54, 0x66, 0x93, 0x1C, + 0x7D, 0x8F, 0x6A, 0x5A, 0xAC, 0x84, 0xAA, 0x05, + 0x1B, 0xA3, 0x0B, 0x39, 0x6A, 0x0A, 0xAC, 0x97, + 0x3D, 0x58, 0xE0, 0x91, 0x47, 0x3F, 0x59, 0x85 + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xCA, 0x70, 0xAF, 0x96, 0xA8, 0x5D, 0x40, 0x47, + 0x0C, 0x3C, 0x48, 0xF5, 0xF0, 0xF5, 0xA5, 0x7D + }, + .len = 16 + } +}; + +/** variable AAD AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_2 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GCM_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 
0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xBA, 0x06, 0xDA, 0xA1, 0x91, 0xE1, 0xFE, 0x22, + 0x59, 0xDA, 0x67, 0xAF, 0x9D, 0xA5, 0x43, 0x94 + }, + .len = 16 + } +}; + /** GMAC Test Vectors */ static uint8_t gmac_plaintext[GMAC_LARGE_PLAINTEXT_LENGTH] = { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, @@ -1781,8 +2224,8 @@ struct cryptodev_perf_test_data { }, .gmac_tag = { .data = { - 0x88, 0x82, 0xb4, 0x93, 0x8f, 0x04, 0xcd, 0x06, - 0xfd, 0xac, 0x6d, 0x8b, 0x9c, 0x9e, 0x8f, 0xec + 0x3f, 0x07, 0xcb, 0xb9, 0x86, 0x3a, 0xea, 0xc2, + 0x2f, 0x3a, 0x2a, 0x93, 0xd8, 0x09, 0x6b, 0xda }, .len = 16 } @@ -1802,7 +2245,7 @@ struct cryptodev_perf_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { diff --git a/devtools/test-build.sh b/devtools/test-build.sh index a979309..d0c1a34 100755 --- a/devtools/test-build.sh +++ b/devtools/test-build.sh @@ -44,6 +44,7 @@ default_path=$PATH # - DPDK_DEP_SSL (y/[n]) # - DPDK_DEP_SZE (y/[n]) # - DPDK_DEP_ZLIB (y/[n]) +# - DPDK_DEP_ISAL (y/[n]) # - DPDK_MAKE_JOBS (int) # - DPDK_NOTIFY (notify-send) # - LIBSSO_SNOW3G_PATH @@ -126,6 +127,7 @@ reset_env () unset DPDK_DEP_SSL unset DPDK_DEP_SZE unset DPDK_DEP_ZLIB + unset DPDK_DEP_ISAL unset AESNI_MULTI_BUFFER_LIB_PATH unset LIBSSO_SNOW3G_PATH unset LIBSSO_KASUMI_PATH @@ -176,7 +178,7 @@ config () # <directory> <target> <options> sed -ri 's,(PCAP=)n,\1y,' $1/.config test -z "$AESNI_MULTI_BUFFER_LIB_PATH" || \ sed -ri 's,(PMD_AESNI_MB=)n,\1y,' $1/.config - test -z "$AESNI_MULTI_BUFFER_LIB_PATH" || \ + test "$DPDK_DEP_ISAL" != y || \ sed -ri 's,(PMD_AESNI_GCM=)n,\1y,' $1/.config test -z "$LIBSSO_SNOW3G_PATH" || \ sed -ri 's,(PMD_SNOW3G=)n,\1y,' $1/.config diff --git a/doc/guides/cryptodevs/aesni_gcm.rst b/doc/guides/cryptodevs/aesni_gcm.rst index 04bf43c..e4b4108 100644 --- a/doc/guides/cryptodevs/aesni_gcm.rst +++ b/doc/guides/cryptodevs/aesni_gcm.rst @@ -32,10 +32,8 @@ AES-NI GCM Crypto Poll Mode Driver The AES-NI GCM PMD (**librte_pmd_aesni_gcm**) provides poll mode crypto driver -support for utilizing Intel multi buffer library (see AES-NI Multi-buffer PMD documentation -to learn more about it, including installation). - -The AES-NI GCM PMD has current only been tested on Fedora 21 64-bit with gcc. +support for utilizing Intel ISA-L crypto library, which provides operation acceleration +through the AES-NI instruction sets for AES-GCM authenticated cipher algorithm. Features -------- @@ -49,16 +47,21 @@ Cipher algorithms: Authentication algorithms: * RTE_CRYPTO_AUTH_AES_GCM +* RTE_CRYPTO_AUTH_AES_GMAC + +Installation +------------ + +To build DPDK with the AESNI_GCM_PMD the user is required to install +the ``libisal_crypto`` library in the build environment. +For download and more details please visit `<https://github.com/01org/isa-l_crypto>`_. Initialization -------------- In order to enable this virtual crypto PMD, user must: -* Export the environmental variable AESNI_MULTI_BUFFER_LIB_PATH with the path where - the library was extracted. - -* Build the multi buffer library (go to Installation section in AES-NI MB PMD documentation). +* Install the ISA-L crypto library (explained in Installation section). * Set CONFIG_RTE_LIBRTE_PMD_AESNI_GCM=y in config/common_base. @@ -86,9 +89,6 @@ Example: Limitations ----------- -* Chained mbufs are not supported. +* Chained mbufs are supported but only out-of-place (destination mbuf must be contiguous). * Hash only is not supported. * Cipher only is not supported. 
-* Only in-place is currently supported (destination address is the same as source address). -* Only supports session-oriented API implementation (session-less APIs are not supported). -* Not performance tuned. diff --git a/doc/guides/rel_notes/release_17_02.rst b/doc/guides/rel_notes/release_17_02.rst index 5ab7019..3b982b2 100644 --- a/doc/guides/rel_notes/release_17_02.rst +++ b/doc/guides/rel_notes/release_17_02.rst @@ -67,6 +67,18 @@ New Features * Support for single operations (cipher only and authentication only). +* **Updated the AES-NI GCM PMD.** + + The AES-NI GCM PMD was migrated from MB library to ISA-L library. + The migration entailed the following additional support for: + + * GMAC algorithm. + * 256-bit cipher key. + * Session-less mode. + * Out-of place processing + * Scatter-gatter support for chained mbufs (only out-of place and destination + mbuf must be contiguous) + Resolved Issues --------------- diff --git a/drivers/crypto/aesni_gcm/Makefile b/drivers/crypto/aesni_gcm/Makefile index 5898cae..fb17fbf 100644 --- a/drivers/crypto/aesni_gcm/Makefile +++ b/drivers/crypto/aesni_gcm/Makefile @@ -31,9 +31,6 @@ include $(RTE_SDK)/mk/rte.vars.mk ifneq ($(MAKECMDGOALS),clean) -ifeq ($(AESNI_MULTI_BUFFER_LIB_PATH),) -$(error "Please define AESNI_MULTI_BUFFER_LIB_PATH environment variable") -endif endif # library name @@ -50,10 +47,7 @@ LIBABIVER := 1 EXPORT_MAP := rte_pmd_aesni_gcm_version.map # external library dependencies -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH) -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH)/include -LDLIBS += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -LDLIBS += -lcrypto +LDLIBS += -lisal_crypto # library source files SRCS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += aesni_gcm_pmd.c diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h index c399068..e9de654 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h @@ -37,91 +37,26 @@ #define LINUX #endif -#include <gcm_defines.h> -#include <aux_funcs.h> +#include <isa-l_crypto/aes_gcm.h> -/** Supported vector modes */ -enum aesni_gcm_vector_mode { - RTE_AESNI_GCM_NOT_SUPPORTED = 0, - RTE_AESNI_GCM_SSE, - RTE_AESNI_GCM_AVX, - RTE_AESNI_GCM_AVX2 -}; - -typedef void (*aes_keyexp_128_enc_t)(void *key, void *enc_exp_keys); +typedef void (*aesni_gcm_init_t)(struct gcm_data *my_ctx_data, + uint8_t *iv, + uint8_t const *aad, + uint64_t aad_len); -typedef void (*aesni_gcm_t)(gcm_data *my_ctx_data, u8 *out, const u8 *in, - u64 plaintext_len, u8 *iv, const u8 *aad, u64 aad_len, - u8 *auth_tag, u64 auth_tag_len); +typedef void (*aesni_gcm_update_t)(struct gcm_data *my_ctx_data, + uint8_t *out, + const uint8_t *in, + uint64_t plaintext_len); -typedef void (*aesni_gcm_precomp_t)(gcm_data *my_ctx_data, u8 *hash_subkey); +typedef void (*aesni_gcm_finalize_t)(struct gcm_data *my_ctx_data, + uint8_t *auth_tag, + uint64_t auth_tag_len); -/** GCM library function pointer table */ struct aesni_gcm_ops { - struct { - struct { - aes_keyexp_128_enc_t aes128_enc; - /**< AES128 enc key expansion */ - } keyexp; - /**< Key expansion functions */ - } aux; /**< Auxiliary functions */ - - struct { - aesni_gcm_t enc; /**< GCM encode function pointer */ - aesni_gcm_t dec; /**< GCM decode function pointer */ - aesni_gcm_precomp_t precomp; /**< GCM pre-compute */ - } gcm; /**< GCM functions */ + aesni_gcm_init_t init; + aesni_gcm_update_t update; + aesni_gcm_finalize_t finalize; }; - -static const struct aesni_gcm_ops gcm_ops[] = { - 
[RTE_AESNI_GCM_NOT_SUPPORTED] = { - .aux = { - .keyexp = { - NULL - } - }, - .gcm = { - NULL - } - }, - [RTE_AESNI_GCM_SSE] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_sse - } - }, - .gcm = { - aesni_gcm_enc_sse, - aesni_gcm_dec_sse, - aesni_gcm_precomp_sse - } - }, - [RTE_AESNI_GCM_AVX] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen2, - aesni_gcm_dec_avx_gen2, - aesni_gcm_precomp_avx_gen2 - } - }, - [RTE_AESNI_GCM_AVX2] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx2, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen4, - aesni_gcm_dec_avx_gen4, - aesni_gcm_precomp_avx_gen4 - } - } -}; - - #endif /* _AESNI_GCM_OPS_H_ */ diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c index 5af22f7..8b2d792 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c @@ -30,8 +30,6 @@ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ -#include <openssl/aes.h> - #include <rte_common.h> #include <rte_config.h> #include <rte_hexdump.h> @@ -44,6 +42,34 @@ #include "aesni_gcm_pmd_private.h" +/** GCM encode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_enc[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_enc_update, + aesni_gcm128_enc_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_enc_update, + aesni_gcm256_enc_finalize + } +}; + +/** GCM decode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_dec[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_dec_update, + aesni_gcm128_dec_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_dec_update, + aesni_gcm256_dec_finalize + } +}; + /** * Global static parameter used to create a unique name for each AES-NI multi * buffer crypto device. 
@@ -65,112 +91,68 @@ return 0; } -static int -aesni_gcm_calculate_hash_sub_key(uint8_t *hsubkey, unsigned hsubkey_length, - uint8_t *aeskey, unsigned aeskey_length) -{ - uint8_t key[aeskey_length] __rte_aligned(16); - AES_KEY enc_key; - - if (hsubkey_length % 16 != 0 && aeskey_length % 16 != 0) - return -EFAULT; - - memcpy(key, aeskey, aeskey_length); - - if (AES_set_encrypt_key(key, aeskey_length << 3, &enc_key) != 0) - return -EFAULT; - - AES_encrypt(hsubkey, hsubkey, &enc_key); - - return 0; -} - -/** Get xform chain order */ -static int -aesni_gcm_get_mode(const struct rte_crypto_sym_xform *xform) -{ - /* - * GCM only supports authenticated encryption or authenticated - * decryption, all other options are invalid, so we must have exactly - * 2 xform structs chained together - */ - if (xform->next == NULL || xform->next->next != NULL) - return -1; - - if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && - xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { - return AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; - } - - if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && - xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { - return AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - } - - return -1; -} - /** Parse crypto xform chain and set private session parameters */ int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *gcm_ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform) { - const struct rte_crypto_sym_xform *auth_xform = NULL; - const struct rte_crypto_sym_xform *cipher_xform = NULL; + const struct rte_crypto_sym_xform *auth_xform; + const struct rte_crypto_sym_xform *cipher_xform; - uint8_t hsubkey[16] __rte_aligned(16) = { 0 }; - - /* Select Crypto operation - hash then cipher / cipher then hash */ - switch (aesni_gcm_get_mode(xform)) { - case AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + if (xform->next == NULL || xform->next->next != NULL) { + GCM_LOG_ERR("Two and only two chained xform required"); + return -EINVAL; + } - cipher_xform = xform; + if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && + xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { auth_xform = xform->next; - break; - case AESNI_GCM_OP_AUTHENTICATED_DECRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - + cipher_xform = xform; + } else if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && + xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { auth_xform = xform; cipher_xform = xform->next; - break; - default: - GCM_LOG_ERR("Unsupported operation chain order parameter"); + } else { + GCM_LOG_ERR("Cipher and auth xform required"); return -EINVAL; } - /* We only support AES GCM */ - if (cipher_xform->cipher.algo != RTE_CRYPTO_CIPHER_AES_GCM && - auth_xform->auth.algo != RTE_CRYPTO_AUTH_AES_GCM) + if (!(cipher_xform->cipher.algo == RTE_CRYPTO_CIPHER_AES_GCM && + (auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GCM || + auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GMAC))) { + GCM_LOG_ERR("We only support AES GCM and AES GMAC"); return -EINVAL; + } - /* Select cipher direction */ - if (sess->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_ENCRYPT) { - GCM_LOG_ERR("xform chain (CIPHER/AUTH) and cipher operation " - "(DECRYPT) specified are an invalid selection"); - return -EINVAL; - } else if (sess->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_DECRYPT) { - GCM_LOG_ERR("xform chain 
(AUTH/CIPHER) and cipher operation " - "(ENCRYPT) specified are an invalid selection"); + /* Select Crypto operation */ + if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_ENCRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_GENERATE) + sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + else if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_DECRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_VERIFY) + sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; + else { + GCM_LOG_ERR("Cipher/Auth operations: Encrypt/Generate or" + " Decrypt/Verify are valid only"); return -EINVAL; } - /* Expand GCM AES128 key */ - (*gcm_ops->aux.keyexp.aes128_enc)(cipher_xform->cipher.key.data, - sess->gdata.expanded_keys); + /* Check key length and calculate GCM pre-compute. */ + switch (cipher_xform->cipher.key.length) { + case 16: + aesni_gcm128_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_128; - /* Calculate hash sub key here */ - aesni_gcm_calculate_hash_sub_key(hsubkey, sizeof(hsubkey), - cipher_xform->cipher.key.data, - cipher_xform->cipher.key.length); + break; + case 32: + aesni_gcm256_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_256; - /* Calculate GCM pre-compute */ - (*gcm_ops->gcm.precomp)(&sess->gdata, hsubkey); + break; + default: + GCM_LOG_ERR("Unsupported cipher key length"); + return -EINVAL; + } return 0; } @@ -194,10 +176,10 @@ return sess; sess = (struct aesni_gcm_session *) - ((struct rte_cryptodev_session *)_sess)->_private; + ((struct rte_cryptodev_sym_session *)_sess)->_private; - if (unlikely(aesni_gcm_set_session_parameters(qp->ops, - sess, op->xform) != 0)) { + if (unlikely(aesni_gcm_set_session_parameters(sess, + op->xform) != 0)) { rte_mempool_put(qp->sess_mp, _sess); sess = NULL; } @@ -217,19 +199,45 @@ * */ static int -process_gcm_crypto_op(struct aesni_gcm_qp *qp, struct rte_crypto_sym_op *op, +process_gcm_crypto_op(struct rte_crypto_sym_op *op, struct aesni_gcm_session *session) { uint8_t *src, *dst; - struct rte_mbuf *m = op->m_src; + struct rte_mbuf *m_src = op->m_src; + uint32_t offset = op->cipher.data.offset; + uint32_t part_len, total_len, data_len; + + RTE_ASSERT(m_src != NULL); + + while (offset >= m_src->data_len) { + offset -= m_src->data_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + } + + data_len = m_src->data_len - offset; + part_len = (data_len < op->cipher.data.length) ? data_len : + op->cipher.data.length; + + /* Destination buffer is required when segmented source buffer */ + RTE_ASSERT((part_len == op->cipher.data.length) || + ((part_len != op->cipher.data.length) && + (op->m_dst != NULL))); + /* Segmented destination buffer is not supported */ + RTE_ASSERT((op->m_dst == NULL) || + ((op->m_dst != NULL) && + rte_pktmbuf_is_contiguous(op->m_dst))); + - src = rte_pktmbuf_mtod(m, uint8_t *) + op->cipher.data.offset; dst = op->m_dst ? 
rte_pktmbuf_mtod_offset(op->m_dst, uint8_t *, op->cipher.data.offset) : - rte_pktmbuf_mtod_offset(m, uint8_t *, + rte_pktmbuf_mtod_offset(op->m_src, uint8_t *, op->cipher.data.offset); + src = rte_pktmbuf_mtod_offset(m_src, uint8_t *, offset); + /* sanity checks */ if (op->cipher.iv.length != 16 && op->cipher.iv.length != 12 && op->cipher.iv.length != 0) { @@ -246,48 +254,81 @@ *iv_padd = rte_bswap32(1); } - if (op->auth.aad.length != 12 && op->auth.aad.length != 8 && - op->auth.aad.length != 0) { - GCM_LOG_ERR("iv"); - return -1; - } - if (op->auth.digest.length != 16 && op->auth.digest.length != 12 && - op->auth.digest.length != 8 && - op->auth.digest.length != 0) { - GCM_LOG_ERR("iv"); + op->auth.digest.length != 8) { + GCM_LOG_ERR("digest"); return -1; } if (session->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION) { - (*qp->ops->gcm.enc)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_enc[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_enc[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? + m_src->data_len : total_len; + + aesni_gcm_enc[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_enc[session->key].finalize(&session->gdata, op->auth.digest.data, (uint64_t)op->auth.digest.length); - } else if (session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION) { - uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(m, + } else { /* session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION */ + uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(op->m_dst ? + op->m_dst : op->m_src, op->auth.digest.length); if (!auth_tag) { - GCM_LOG_ERR("iv"); + GCM_LOG_ERR("auth_tag"); return -1; } - (*qp->ops->gcm.dec)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_dec[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_dec[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? 
+ m_src->data_len : total_len; + + aesni_gcm_dec[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_dec[session->key].finalize(&session->gdata, auth_tag, (uint64_t)op->auth.digest.length); - } else { - GCM_LOG_ERR("iv"); - return -1; } return 0; @@ -377,21 +418,7 @@ break; } -#ifdef RTE_LIBRTE_PMD_AESNI_GCM_DEBUG - if (!rte_pktmbuf_is_contiguous(ops[i]->sym->m_src) || - (ops[i]->sym->m_dst != NULL && - !rte_pktmbuf_is_contiguous( - ops[i]->sym->m_dst))) { - ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; - GCM_LOG_ERR("PMD supports only contiguous mbufs, " - "op (%p) provides noncontiguous mbuf as " - "source/destination buffer.\n", ops[i]); - qp->qp_stats.enqueue_err_count++; - break; - } -#endif - - retval = process_gcm_crypto_op(qp, ops[i]->sym, sess); + retval = process_gcm_crypto_op(ops[i]->sym, sess); if (retval < 0) { ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; qp->qp_stats.enqueue_err_count++; @@ -429,7 +456,6 @@ struct rte_cryptodev *dev; char crypto_dev_name[RTE_CRYPTODEV_NAME_MAX_LEN]; struct aesni_gcm_private *internals; - enum aesni_gcm_vector_mode vector_mode; /* Check CPU for support for AES instruction set */ if (!rte_cpu_get_flag_enabled(RTE_CPUFLAG_AES)) { @@ -437,18 +463,6 @@ return -EFAULT; } - /* Check CPU for supported vector instruction set */ - if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX2)) - vector_mode = RTE_AESNI_GCM_AVX2; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX)) - vector_mode = RTE_AESNI_GCM_AVX; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_SSE4_1)) - vector_mode = RTE_AESNI_GCM_SSE; - else { - GCM_LOG_ERR("Vector instructions are not supported by CPU"); - return -EFAULT; - } - /* create a unique device name */ if (create_unique_device_name(crypto_dev_name, RTE_CRYPTODEV_NAME_MAX_LEN) != 0) { @@ -473,27 +487,11 @@ dev->feature_flags = RTE_CRYPTODEV_FF_SYMMETRIC_CRYPTO | RTE_CRYPTODEV_FF_SYM_OPERATION_CHAINING | - RTE_CRYPTODEV_FF_CPU_AESNI; + RTE_CRYPTODEV_FF_CPU_AESNI | + RTE_CRYPTODEV_FF_MBUF_SCATTER_GATHER; - switch (vector_mode) { - case RTE_AESNI_GCM_SSE: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_SSE; - break; - case RTE_AESNI_GCM_AVX: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX; - break; - case RTE_AESNI_GCM_AVX2: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX2; - break; - default: - break; - } - - /* Set vector instructions mode supported */ internals = dev->data->dev_private; - internals->vector_mode = vector_mode; - internals->max_nb_queue_pairs = init_params->max_nb_queue_pairs; internals->max_nb_sessions = init_params->max_nb_sessions; diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c index c51f82a..2362006 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c @@ -39,17 +39,17 @@ #include "aesni_gcm_pmd_private.h" static const struct rte_cryptodev_capabilities aesni_gcm_pmd_capabilities[] = { - { /* AES GCM (AUTH) */ + { /* AES GMAC (AUTH) */ .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, {.sym = { .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, {.auth = { - .algo = RTE_CRYPTO_AUTH_AES_GCM, + .algo = RTE_CRYPTO_AUTH_AES_GMAC, .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .digest_size = { .min = 8, @@ -57,9 +57,34 @@ .increment = 4 }, .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 + } + }, } + }, } + }, + { /* AES GCM (AUTH) */ + .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, + {.sym = { + 
.xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, + {.auth = { + .algo = RTE_CRYPTO_AUTH_AES_GCM, + .block_size = 16, + .key_size = { + .min = 16, + .max = 32, + .increment = 16 + }, + .digest_size = { .min = 8, - .max = 12, + .max = 16, .increment = 4 + }, + .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 } }, } }, } @@ -73,8 +98,8 @@ .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .iv_size = { .min = 12, @@ -221,7 +246,6 @@ int socket_id) { struct aesni_gcm_qp *qp = NULL; - struct aesni_gcm_private *internals = dev->data->dev_private; /* Free memory prior to re-allocation if needed. */ if (dev->data->queue_pairs[qp_id] != NULL) @@ -239,8 +263,6 @@ if (aesni_gcm_pmd_qp_set_unique_name(dev, qp)) goto qp_setup_cleanup; - qp->ops = &gcm_ops[internals->vector_mode]; - qp->processed_pkts = aesni_gcm_pmd_qp_create_processed_pkts_ring(qp, qp_conf->nb_descriptors, socket_id); if (qp->processed_pkts == NULL) @@ -291,18 +313,15 @@ /** Configure a aesni gcm session from a crypto xform chain */ static void * -aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev, +aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev __rte_unused, struct rte_crypto_sym_xform *xform, void *sess) { - struct aesni_gcm_private *internals = dev->data->dev_private; - if (unlikely(sess == NULL)) { GCM_LOG_ERR("invalid session struct"); return NULL; } - if (aesni_gcm_set_session_parameters(&gcm_ops[internals->vector_mode], - sess, xform) != 0) { + if (aesni_gcm_set_session_parameters(sess, xform) != 0) { GCM_LOG_ERR("failed configure session parameters"); return NULL; } diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h index 9878d6e..0496b44 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h @@ -58,8 +58,6 @@ /** private data structure for each virtual AESNI GCM device */ struct aesni_gcm_private { - enum aesni_gcm_vector_mode vector_mode; - /**< Vector mode */ unsigned max_nb_queue_pairs; /**< Max number of queue pairs supported by device */ unsigned max_nb_sessions; @@ -71,8 +69,6 @@ struct aesni_gcm_qp { /**< Queue Pair Identifier */ char name[RTE_CRYPTODEV_NAME_LEN]; /**< Unique Queue Pair Name */ - const struct aesni_gcm_ops *ops; - /**< Architecture dependent function pointer table of the gcm APIs */ struct rte_ring *processed_pkts; /**< Ring for placing process packets */ struct rte_mempool *sess_mp; @@ -87,10 +83,17 @@ enum aesni_gcm_operation { AESNI_GCM_OP_AUTHENTICATED_DECRYPTION }; +enum aesni_gcm_key { + AESNI_GCM_KEY_128, + AESNI_GCM_KEY_256 +}; + /** AESNI GCM private session structure */ struct aesni_gcm_session { enum aesni_gcm_operation op; /**< GCM operation type */ + enum aesni_gcm_key key; + /**< GCM key type */ struct gcm_data gdata __rte_cache_aligned; /**< GCM parameters */ }; @@ -98,7 +101,6 @@ struct aesni_gcm_session { /** * Setup GCM session parameters - * @param ops gcm ops function pointer table * @param sess aesni gcm session structure * @param xform crypto transform chain * @@ -107,8 +109,7 @@ struct aesni_gcm_session { * - On failure returns error code < 0 */ extern int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform); diff --git a/mk/rte.app.mk b/mk/rte.app.mk index f75f0e2..ed3eab5 100644 --- a/mk/rte.app.mk +++ b/mk/rte.app.mk @@ -134,8 
+134,7 @@ _LDLIBS-$(CONFIG_RTE_LIBRTE_VMXNET3_PMD) += -lrte_pmd_vmxnet3_uio ifeq ($(CONFIG_RTE_LIBRTE_CRYPTODEV),y) _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -lrte_pmd_aesni_mb _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lcrypto -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB +_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lisal_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_OPENSSL) += -lrte_pmd_openssl -lcrypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_NULL_CRYPTO) += -lrte_pmd_null_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_QAT) += -lrte_pmd_qat -lcrypto -- 1.7.9.5 ^ permalink raw reply [flat|nested] 15+ messages in thread
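For reference, the scatter-gather path added to process_gcm_crypto_op() above boils down to an init/update/finalize sequence over the ISA-L GCM context, with one update call per mbuf segment. Below is a minimal sketch of that flow for the encrypt direction, assuming a chained source mbuf and a contiguous out-of-place destination buffer; it is an illustration distilled from the patch, not code taken from it, and it omits error handling, IV padding and the decrypt path.

#include <rte_mbuf.h>
#include <isa-l_crypto/aes_gcm.h>

/* Illustrative only: encrypt a possibly chained source mbuf into a
 * contiguous destination, feeding each segment through the incremental
 * ISA-L update API (AES-128 shown; the 256-bit variants are analogous). */
static void
gcm128_encrypt_chained_mbuf(struct gcm_data *gdata, struct rte_mbuf *m_src,
		uint32_t offset, uint32_t length, uint8_t *dst,
		uint8_t *iv, const uint8_t *aad, uint64_t aad_len,
		uint8_t *tag, uint64_t tag_len)
{
	uint8_t *src;
	uint32_t part_len, total_len;

	/* Walk the chain to the segment holding the start of the data */
	while (offset >= m_src->data_len) {
		offset -= m_src->data_len;
		m_src = m_src->next;
	}
	src = rte_pktmbuf_mtod_offset(m_src, uint8_t *, offset);
	part_len = m_src->data_len - offset;
	if (part_len > length)
		part_len = length;

	/* One init, one update per segment, one finalize for the tag */
	aesni_gcm128_init(gdata, iv, aad, aad_len);
	aesni_gcm128_enc_update(gdata, dst, src, part_len);

	total_len = length - part_len;
	while (total_len > 0) {
		dst += part_len;
		m_src = m_src->next;
		src = rte_pktmbuf_mtod(m_src, uint8_t *);
		part_len = (m_src->data_len < total_len) ?
				m_src->data_len : total_len;

		aesni_gcm128_enc_update(gdata, dst, src, part_len);
		total_len -= part_len;
	}

	aesni_gcm128_enc_finalize(gdata, tag, tag_len);
}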
* Re: [dpdk-dev] [PATCH v5] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-16 12:37 ` [dpdk-dev] [PATCH v5] " Piotr Azarewicz @ 2017-01-16 14:42 ` Thomas Monjalon 2017-01-16 15:20 ` Mcnamara, John 2017-01-16 14:43 ` Thomas Monjalon ` (2 subsequent siblings) 3 siblings, 1 reply; 15+ messages in thread From: Thomas Monjalon @ 2017-01-16 14:42 UTC (permalink / raw) To: Piotr Azarewicz; +Cc: dev, pablo.de.lara.guarch, declan.doherty 2017-01-16 13:37, Piotr Azarewicz: > v5 changes: > - rebase on top of dpdk-next-crypto > - remove the perftest output from commit message > - correction in aesni_gcm.rst > - fix typo Why have you removed the perftest output? Because it is not in favor of this patch? ^ permalink raw reply [flat|nested] 15+ messages in thread
* Re: [dpdk-dev] [PATCH v5] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-16 14:42 ` Thomas Monjalon @ 2017-01-16 15:20 ` Mcnamara, John 0 siblings, 0 replies; 15+ messages in thread From: Mcnamara, John @ 2017-01-16 15:20 UTC (permalink / raw) To: Thomas Monjalon, Azarewicz, PiotrX T Cc: dev, De Lara Guarch, Pablo, Doherty, Declan > -----Original Message----- > From: dev [mailto:dev-bounces@dpdk.org] On Behalf Of Thomas Monjalon > Sent: Monday, January 16, 2017 2:42 PM > To: Azarewicz, PiotrX T <piotrx.t.azarewicz@intel.com> > Cc: dev@dpdk.org; De Lara Guarch, Pablo <pablo.de.lara.guarch@intel.com>; > Doherty, Declan <declan.doherty@intel.com> > Subject: Re: [dpdk-dev] [PATCH v5] crypto/aesni_gcm: migration from MB > library to ISA-L > > 2017-01-16 13:37, Piotr Azarewicz: > > v5 changes: > > - rebase on top of dpdk-next-crypto > > - remove the perftest output from commit message > > - correction in aesni_gcm.rst > > - fix typo > > Why have you removed the perftest output? > Because it is not in favor of this patch? Hi, We removed the performance results because we are not allowed to publish them publicly without legal approval. Also, and more importantly, the perftest results aren't reliable, which is one of the reasons that we are developing the new dpdk-test-crypto-perf app (patch submitted). Using the newer app we are seeing more or less the same performance, slightly higher for some packet sizes, slightly lower for some others. John. ^ permalink raw reply [flat|nested] 15+ messages in thread
* Re: [dpdk-dev] [PATCH v5] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-16 12:37 ` [dpdk-dev] [PATCH v5] " Piotr Azarewicz 2017-01-16 14:42 ` Thomas Monjalon @ 2017-01-16 14:43 ` Thomas Monjalon 2017-01-17 7:42 ` Azarewicz, PiotrX T 2017-01-16 16:05 ` Declan Doherty 2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 0/2] " Piotr Azarewicz 3 siblings, 1 reply; 15+ messages in thread From: Thomas Monjalon @ 2017-01-16 14:43 UTC (permalink / raw) To: Piotr Azarewicz; +Cc: dev, pablo.de.lara.guarch, declan.doherty 2017-01-16 13:37, Piotr Azarewicz: > app/test/test_cryptodev.c | 753 +++++++++++++++++++--- > app/test/test_cryptodev_gcm_test_vectors.h | 491 +++++++++++++- > devtools/test-build.sh | 4 +- > doc/guides/cryptodevs/aesni_gcm.rst | 24 +- > doc/guides/rel_notes/release_17_02.rst | 12 + > drivers/crypto/aesni_gcm/Makefile | 8 +- > drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +-- > drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 324 +++++----- > drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +- > drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +- > mk/rte.app.mk | 3 +- > 11 files changed, 1363 insertions(+), 415 deletions(-) That's the kind of patch where test and implementation could be split. ^ permalink raw reply [flat|nested] 15+ messages in thread
* Re: [dpdk-dev] [PATCH v5] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-16 14:43 ` Thomas Monjalon @ 2017-01-17 7:42 ` Azarewicz, PiotrX T 0 siblings, 0 replies; 15+ messages in thread From: Azarewicz, PiotrX T @ 2017-01-17 7:42 UTC (permalink / raw) To: Thomas Monjalon; +Cc: dev, De Lara Guarch, Pablo, Doherty, Declan > 2017-01-16 13:37, Piotr Azarewicz: > > app/test/test_cryptodev.c | 753 +++++++++++++++++++--- > > app/test/test_cryptodev_gcm_test_vectors.h | 491 +++++++++++++- > > devtools/test-build.sh | 4 +- > > doc/guides/cryptodevs/aesni_gcm.rst | 24 +- > > doc/guides/rel_notes/release_17_02.rst | 12 + > > drivers/crypto/aesni_gcm/Makefile | 8 +- > > drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +-- > > drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 324 +++++----- > > drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +- > > drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +- > > mk/rte.app.mk | 3 +- > > 11 files changed, 1363 insertions(+), 415 deletions(-) > > That's the kind of patch where test and implementation could be split. Okay I will. ^ permalink raw reply [flat|nested] 15+ messages in thread
* Re: [dpdk-dev] [PATCH v5] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-16 12:37 ` [dpdk-dev] [PATCH v5] " Piotr Azarewicz 2017-01-16 14:42 ` Thomas Monjalon 2017-01-16 14:43 ` Thomas Monjalon @ 2017-01-16 16:05 ` Declan Doherty 2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 0/2] " Piotr Azarewicz 3 siblings, 0 replies; 15+ messages in thread From: Declan Doherty @ 2017-01-16 16:05 UTC (permalink / raw) To: Piotr Azarewicz, pablo.de.lara.guarch, dev On 16/01/17 12:37, Piotr Azarewicz wrote: > Current Cryptodev AES-NI GCM PMD is implemented using Multi Buffer > Crypto library.This patch reimplement the device using ISA-L Crypto > library: https://github.com/01org/isa-l_crypto. > > The migration entailed the following additional support for: > * GMAC algorithm. > * 256-bit cipher key. > * Session-less mode. > * Out-of place processing > * Scatter-gatter support for chained mbufs (only out-of place and > destination mbuf must be contiguous) > > Verified current unit tests and added new unit tests to verify new > functionalities. > > Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> > --- > ... > Acked-by: Declan Doherty <declan.doherty@intel.com> ^ permalink raw reply [flat|nested] 15+ messages in thread
* [dpdk-dev] [PATCH v6 0/2] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-16 12:37 ` [dpdk-dev] [PATCH v5] " Piotr Azarewicz ` (2 preceding siblings ...) 2017-01-16 16:05 ` Declan Doherty @ 2017-01-17 11:19 ` Piotr Azarewicz 2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 1/2] " Piotr Azarewicz ` (2 more replies) 3 siblings, 3 replies; 15+ messages in thread From: Piotr Azarewicz @ 2017-01-17 11:19 UTC (permalink / raw) To: pablo.de.lara.guarch, dev The current Cryptodev AES-NI GCM PMD is implemented using the Multi Buffer Crypto library. This patch reimplements the device using the ISA-L Crypto library: https://github.com/01org/isa-l_crypto. The migration entailed the following additional support for: * GMAC algorithm. * 256-bit cipher key. * Session-less mode. * Out-of-place processing. * Scatter-gather support for chained mbufs (only out-of-place, and the destination mbuf must be contiguous). Verified current unit tests and added new unit tests to verify new functionalities. Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> Acked-by: Declan Doherty <declan.doherty@intel.com> v6 changes: - rebase on top of dpdk-next-crypto - split driver implementation and tests v5 changes: - rebase on top of dpdk-next-crypto - remove the perftest output from commit message - correction in aesni_gcm.rst - fix typo v4 changes: - rebase on top of dpdk-next-crypto - update the script test-build.sh v3 changes: - rebase on top of dpdk-next-crypto v2 changes: - implement native scatter-gather support for chained mbufs (only out-of-place, and the destination mbuf must be contiguous) - write unit test for session-less mode - write unit test for out-of-place processing - add support for GMAC authentication algorithm Piotr Azarewicz (2): crypto/aesni_gcm: migration from MB library to ISA-L app/test: add GCM additional tests app/test/test_cryptodev.c | 753 +++++++++++++++++++--- app/test/test_cryptodev_gcm_test_vectors.h | 491 +++++++++++++- devtools/test-build.sh | 4 +- doc/guides/cryptodevs/aesni_gcm.rst | 24 +- doc/guides/rel_notes/release_17_02.rst | 12 + drivers/crypto/aesni_gcm/Makefile | 8 +- drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +-- drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 320 +++++---- drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +- drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +- mk/rte.app.mk | 3 +- 11 files changed, 1361 insertions(+), 413 deletions(-) -- 1.7.9.5 ^ permalink raw reply [flat|nested] 15+ messages in thread
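To illustrate the "Session-less mode" item in the cover letter above: in session-less operation the two-element cipher/auth transform chain is allocated directly on the crypto op rather than wrapped in a pre-created session, and the PMD derives the session state from op->sym->xform at enqueue time. The sketch below condenses what the new unit tests do; the helper name and its parameters are illustrative only, and error handling plus the iv/aad/digest mbuf setup and enqueue/dequeue are omitted.

#include <rte_crypto.h>
#include <rte_cryptodev.h>
#include <rte_mbuf.h>

/* Illustrative only: build a session-less AES-GCM encrypt op whose
 * cipher/auth transforms live on the op itself. */
static struct rte_crypto_op *
build_sessionless_gcm_op(struct rte_mempool *op_mpool, struct rte_mbuf *m,
		uint8_t *key, uint8_t key_len, uint8_t aad_len, uint8_t tag_len)
{
	struct rte_crypto_op *op = rte_crypto_op_alloc(op_mpool,
			RTE_CRYPTO_OP_TYPE_SYMMETRIC);
	struct rte_crypto_sym_xform *xf = rte_crypto_op_sym_xforms_alloc(op, 2);

	/* Cipher transform: AES-GCM encrypt with a 16- or 32-byte key */
	xf->type = RTE_CRYPTO_SYM_XFORM_CIPHER;
	xf->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM;
	xf->cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
	xf->cipher.key.data = key;
	xf->cipher.key.length = key_len;

	/* Auth transform: generate the GCM tag over aad_len bytes of AAD */
	xf->next->type = RTE_CRYPTO_SYM_XFORM_AUTH;
	xf->next->auth.algo = RTE_CRYPTO_AUTH_AES_GCM;
	xf->next->auth.op = RTE_CRYPTO_AUTH_OP_GENERATE;
	xf->next->auth.digest_length = tag_len;
	xf->next->auth.add_auth_data_length = aad_len;
	xf->next->auth.key.data = NULL;
	xf->next->auth.key.length = 0;
	xf->next->next = NULL;

	/* No rte_crypto_op_attach_sym_session() call: the op stays
	 * RTE_CRYPTO_SYM_OP_SESSIONLESS and the PMD builds its session
	 * from op->sym->xform when the op is enqueued. */
	op->sym->m_src = m;

	return op;
}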
* [dpdk-dev] [PATCH v6 1/2] crypto/aesni_gcm: migration from MB library to ISA-L 2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 0/2] " Piotr Azarewicz @ 2017-01-17 11:19 ` Piotr Azarewicz 2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 2/2] app/test: add GCM additional tests Piotr Azarewicz 2017-01-18 12:37 ` [dpdk-dev] [PATCH v6 0/2] crypto/aesni_gcm: migration from MB library to ISA-L De Lara Guarch, Pablo 2 siblings, 0 replies; 15+ messages in thread From: Piotr Azarewicz @ 2017-01-17 11:19 UTC (permalink / raw) To: pablo.de.lara.guarch, dev Current Cryptodev AES-NI GCM PMD is implemented using Multi Buffer Crypto library.This patch reimplement the device using ISA-L Crypto library: https://github.com/01org/isa-l_crypto. The migration entailed the following additional support for: * GMAC algorithm. * 256-bit cipher key. * Session-less mode. * Out-of place processing * Scatter-gatter support for chained mbufs (only out-of place and destination mbuf must be contiguous) Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> Acked-by: Declan Doherty <declan.doherty@intel.com> --- v6 changes: - rebase on top of dpdk-next-crypto v5 changes: - rebase on top of dpdk-next-crypto - correction in aesni_gcm.rst v4 changes: - rebase on top of dpdk-next-crypto - update the script test-build.sh v3 changes: - rebase on top of dpdk-next-crypto v2 changes: - implement native scatter-gatter support for chained mbufs (only out-of place and destination mbuf must be contiguous) - add support for GMAC authentication algorithm devtools/test-build.sh | 4 +- doc/guides/cryptodevs/aesni_gcm.rst | 24 +- doc/guides/rel_notes/release_17_02.rst | 12 + drivers/crypto/aesni_gcm/Makefile | 8 +- drivers/crypto/aesni_gcm/aesni_gcm_ops.h | 95 +------ drivers/crypto/aesni_gcm/aesni_gcm_pmd.c | 320 +++++++++++----------- drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c | 49 +++- drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h | 15 +- mk/rte.app.mk | 3 +- 9 files changed, 245 insertions(+), 285 deletions(-) diff --git a/devtools/test-build.sh b/devtools/test-build.sh index a979309..d0c1a34 100755 --- a/devtools/test-build.sh +++ b/devtools/test-build.sh @@ -44,6 +44,7 @@ default_path=$PATH # - DPDK_DEP_SSL (y/[n]) # - DPDK_DEP_SZE (y/[n]) # - DPDK_DEP_ZLIB (y/[n]) +# - DPDK_DEP_ISAL (y/[n]) # - DPDK_MAKE_JOBS (int) # - DPDK_NOTIFY (notify-send) # - LIBSSO_SNOW3G_PATH @@ -126,6 +127,7 @@ reset_env () unset DPDK_DEP_SSL unset DPDK_DEP_SZE unset DPDK_DEP_ZLIB + unset DPDK_DEP_ISAL unset AESNI_MULTI_BUFFER_LIB_PATH unset LIBSSO_SNOW3G_PATH unset LIBSSO_KASUMI_PATH @@ -176,7 +178,7 @@ config () # <directory> <target> <options> sed -ri 's,(PCAP=)n,\1y,' $1/.config test -z "$AESNI_MULTI_BUFFER_LIB_PATH" || \ sed -ri 's,(PMD_AESNI_MB=)n,\1y,' $1/.config - test -z "$AESNI_MULTI_BUFFER_LIB_PATH" || \ + test "$DPDK_DEP_ISAL" != y || \ sed -ri 's,(PMD_AESNI_GCM=)n,\1y,' $1/.config test -z "$LIBSSO_SNOW3G_PATH" || \ sed -ri 's,(PMD_SNOW3G=)n,\1y,' $1/.config diff --git a/doc/guides/cryptodevs/aesni_gcm.rst b/doc/guides/cryptodevs/aesni_gcm.rst index 04bf43c..e4b4108 100644 --- a/doc/guides/cryptodevs/aesni_gcm.rst +++ b/doc/guides/cryptodevs/aesni_gcm.rst @@ -32,10 +32,8 @@ AES-NI GCM Crypto Poll Mode Driver The AES-NI GCM PMD (**librte_pmd_aesni_gcm**) provides poll mode crypto driver -support for utilizing Intel multi buffer library (see AES-NI Multi-buffer PMD documentation -to learn more about it, including installation). - -The AES-NI GCM PMD has current only been tested on Fedora 21 64-bit with gcc. 
+support for utilizing Intel ISA-L crypto library, which provides operation acceleration +through the AES-NI instruction sets for AES-GCM authenticated cipher algorithm. Features -------- @@ -49,16 +47,21 @@ Cipher algorithms: Authentication algorithms: * RTE_CRYPTO_AUTH_AES_GCM +* RTE_CRYPTO_AUTH_AES_GMAC + +Installation +------------ + +To build DPDK with the AESNI_GCM_PMD the user is required to install +the ``libisal_crypto`` library in the build environment. +For download and more details please visit `<https://github.com/01org/isa-l_crypto>`_. Initialization -------------- In order to enable this virtual crypto PMD, user must: -* Export the environmental variable AESNI_MULTI_BUFFER_LIB_PATH with the path where - the library was extracted. - -* Build the multi buffer library (go to Installation section in AES-NI MB PMD documentation). +* Install the ISA-L crypto library (explained in Installation section). * Set CONFIG_RTE_LIBRTE_PMD_AESNI_GCM=y in config/common_base. @@ -86,9 +89,6 @@ Example: Limitations ----------- -* Chained mbufs are not supported. +* Chained mbufs are supported but only out-of-place (destination mbuf must be contiguous). * Hash only is not supported. * Cipher only is not supported. -* Only in-place is currently supported (destination address is the same as source address). -* Only supports session-oriented API implementation (session-less APIs are not supported). -* Not performance tuned. diff --git a/doc/guides/rel_notes/release_17_02.rst b/doc/guides/rel_notes/release_17_02.rst index d280daa..8b4ab7f 100644 --- a/doc/guides/rel_notes/release_17_02.rst +++ b/doc/guides/rel_notes/release_17_02.rst @@ -72,6 +72,18 @@ New Features * Support for single operations (cipher only and authentication only). +* **Updated the AES-NI GCM PMD.** + + The AES-NI GCM PMD was migrated from MB library to ISA-L library. + The migration entailed the following additional support for: + + * GMAC algorithm. + * 256-bit cipher key. + * Session-less mode. 
+ * Out-of place processing + * Scatter-gatter support for chained mbufs (only out-of place and destination + mbuf must be contiguous) + Resolved Issues --------------- diff --git a/drivers/crypto/aesni_gcm/Makefile b/drivers/crypto/aesni_gcm/Makefile index 5898cae..fb17fbf 100644 --- a/drivers/crypto/aesni_gcm/Makefile +++ b/drivers/crypto/aesni_gcm/Makefile @@ -31,9 +31,6 @@ include $(RTE_SDK)/mk/rte.vars.mk ifneq ($(MAKECMDGOALS),clean) -ifeq ($(AESNI_MULTI_BUFFER_LIB_PATH),) -$(error "Please define AESNI_MULTI_BUFFER_LIB_PATH environment variable") -endif endif # library name @@ -50,10 +47,7 @@ LIBABIVER := 1 EXPORT_MAP := rte_pmd_aesni_gcm_version.map # external library dependencies -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH) -CFLAGS += -I$(AESNI_MULTI_BUFFER_LIB_PATH)/include -LDLIBS += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -LDLIBS += -lcrypto +LDLIBS += -lisal_crypto # library source files SRCS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += aesni_gcm_pmd.c diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h index c399068..e9de654 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_ops.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_ops.h @@ -37,91 +37,26 @@ #define LINUX #endif -#include <gcm_defines.h> -#include <aux_funcs.h> +#include <isa-l_crypto/aes_gcm.h> -/** Supported vector modes */ -enum aesni_gcm_vector_mode { - RTE_AESNI_GCM_NOT_SUPPORTED = 0, - RTE_AESNI_GCM_SSE, - RTE_AESNI_GCM_AVX, - RTE_AESNI_GCM_AVX2 -}; - -typedef void (*aes_keyexp_128_enc_t)(void *key, void *enc_exp_keys); +typedef void (*aesni_gcm_init_t)(struct gcm_data *my_ctx_data, + uint8_t *iv, + uint8_t const *aad, + uint64_t aad_len); -typedef void (*aesni_gcm_t)(gcm_data *my_ctx_data, u8 *out, const u8 *in, - u64 plaintext_len, u8 *iv, const u8 *aad, u64 aad_len, - u8 *auth_tag, u64 auth_tag_len); +typedef void (*aesni_gcm_update_t)(struct gcm_data *my_ctx_data, + uint8_t *out, + const uint8_t *in, + uint64_t plaintext_len); -typedef void (*aesni_gcm_precomp_t)(gcm_data *my_ctx_data, u8 *hash_subkey); +typedef void (*aesni_gcm_finalize_t)(struct gcm_data *my_ctx_data, + uint8_t *auth_tag, + uint64_t auth_tag_len); -/** GCM library function pointer table */ struct aesni_gcm_ops { - struct { - struct { - aes_keyexp_128_enc_t aes128_enc; - /**< AES128 enc key expansion */ - } keyexp; - /**< Key expansion functions */ - } aux; /**< Auxiliary functions */ - - struct { - aesni_gcm_t enc; /**< GCM encode function pointer */ - aesni_gcm_t dec; /**< GCM decode function pointer */ - aesni_gcm_precomp_t precomp; /**< GCM pre-compute */ - } gcm; /**< GCM functions */ + aesni_gcm_init_t init; + aesni_gcm_update_t update; + aesni_gcm_finalize_t finalize; }; - -static const struct aesni_gcm_ops gcm_ops[] = { - [RTE_AESNI_GCM_NOT_SUPPORTED] = { - .aux = { - .keyexp = { - NULL - } - }, - .gcm = { - NULL - } - }, - [RTE_AESNI_GCM_SSE] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_sse - } - }, - .gcm = { - aesni_gcm_enc_sse, - aesni_gcm_dec_sse, - aesni_gcm_precomp_sse - } - }, - [RTE_AESNI_GCM_AVX] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen2, - aesni_gcm_dec_avx_gen2, - aesni_gcm_precomp_avx_gen2 - } - }, - [RTE_AESNI_GCM_AVX2] = { - .aux = { - .keyexp = { - aes_keyexp_128_enc_avx2, - } - }, - .gcm = { - aesni_gcm_enc_avx_gen4, - aesni_gcm_dec_avx_gen4, - aesni_gcm_precomp_avx_gen4 - } - } -}; - - #endif /* _AESNI_GCM_OPS_H_ */ diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c index 
796825a..a2d10a5 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd.c @@ -30,8 +30,6 @@ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ -#include <openssl/aes.h> - #include <rte_common.h> #include <rte_config.h> #include <rte_hexdump.h> @@ -44,112 +42,96 @@ #include "aesni_gcm_pmd_private.h" -static int -aesni_gcm_calculate_hash_sub_key(uint8_t *hsubkey, unsigned hsubkey_length, - uint8_t *aeskey, unsigned aeskey_length) -{ - uint8_t key[aeskey_length] __rte_aligned(16); - AES_KEY enc_key; - - if (hsubkey_length % 16 != 0 && aeskey_length % 16 != 0) - return -EFAULT; - - memcpy(key, aeskey, aeskey_length); - - if (AES_set_encrypt_key(key, aeskey_length << 3, &enc_key) != 0) - return -EFAULT; - - AES_encrypt(hsubkey, hsubkey, &enc_key); - - return 0; -} - -/** Get xform chain order */ -static int -aesni_gcm_get_mode(const struct rte_crypto_sym_xform *xform) -{ - /* - * GCM only supports authenticated encryption or authenticated - * decryption, all other options are invalid, so we must have exactly - * 2 xform structs chained together - */ - if (xform->next == NULL || xform->next->next != NULL) - return -1; - - if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && - xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { - return AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; - } - - if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && - xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { - return AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - } +/** GCM encode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_enc[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_enc_update, + aesni_gcm128_enc_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_enc_update, + aesni_gcm256_enc_finalize + } +}; - return -1; -} +/** GCM decode functions pointer table */ +static const struct aesni_gcm_ops aesni_gcm_dec[] = { + [AESNI_GCM_KEY_128] = { + aesni_gcm128_init, + aesni_gcm128_dec_update, + aesni_gcm128_dec_finalize + }, + [AESNI_GCM_KEY_256] = { + aesni_gcm256_init, + aesni_gcm256_dec_update, + aesni_gcm256_dec_finalize + } +}; /** Parse crypto xform chain and set private session parameters */ int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *gcm_ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform) { - const struct rte_crypto_sym_xform *auth_xform = NULL; - const struct rte_crypto_sym_xform *cipher_xform = NULL; + const struct rte_crypto_sym_xform *auth_xform; + const struct rte_crypto_sym_xform *cipher_xform; - uint8_t hsubkey[16] __rte_aligned(16) = { 0 }; - - /* Select Crypto operation - hash then cipher / cipher then hash */ - switch (aesni_gcm_get_mode(xform)) { - case AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + if (xform->next == NULL || xform->next->next != NULL) { + GCM_LOG_ERR("Two and only two chained xform required"); + return -EINVAL; + } - cipher_xform = xform; + if (xform->type == RTE_CRYPTO_SYM_XFORM_CIPHER && + xform->next->type == RTE_CRYPTO_SYM_XFORM_AUTH) { auth_xform = xform->next; - break; - case AESNI_GCM_OP_AUTHENTICATED_DECRYPTION: - sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; - + cipher_xform = xform; + } else if (xform->type == RTE_CRYPTO_SYM_XFORM_AUTH && + xform->next->type == RTE_CRYPTO_SYM_XFORM_CIPHER) { auth_xform = xform; cipher_xform = xform->next; - break; - default: - GCM_LOG_ERR("Unsupported 
operation chain order parameter"); + } else { + GCM_LOG_ERR("Cipher and auth xform required"); return -EINVAL; } - /* We only support AES GCM */ - if (cipher_xform->cipher.algo != RTE_CRYPTO_CIPHER_AES_GCM && - auth_xform->auth.algo != RTE_CRYPTO_AUTH_AES_GCM) + if (!(cipher_xform->cipher.algo == RTE_CRYPTO_CIPHER_AES_GCM && + (auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GCM || + auth_xform->auth.algo == RTE_CRYPTO_AUTH_AES_GMAC))) { + GCM_LOG_ERR("We only support AES GCM and AES GMAC"); return -EINVAL; + } - /* Select cipher direction */ - if (sess->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_ENCRYPT) { - GCM_LOG_ERR("xform chain (CIPHER/AUTH) and cipher operation " - "(DECRYPT) specified are an invalid selection"); - return -EINVAL; - } else if (sess->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION && - cipher_xform->cipher.op != - RTE_CRYPTO_CIPHER_OP_DECRYPT) { - GCM_LOG_ERR("xform chain (AUTH/CIPHER) and cipher operation " - "(ENCRYPT) specified are an invalid selection"); + /* Select Crypto operation */ + if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_ENCRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_GENERATE) + sess->op = AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION; + else if (cipher_xform->cipher.op == RTE_CRYPTO_CIPHER_OP_DECRYPT && + auth_xform->auth.op == RTE_CRYPTO_AUTH_OP_VERIFY) + sess->op = AESNI_GCM_OP_AUTHENTICATED_DECRYPTION; + else { + GCM_LOG_ERR("Cipher/Auth operations: Encrypt/Generate or" + " Decrypt/Verify are valid only"); return -EINVAL; } - /* Expand GCM AES128 key */ - (*gcm_ops->aux.keyexp.aes128_enc)(cipher_xform->cipher.key.data, - sess->gdata.expanded_keys); + /* Check key length and calculate GCM pre-compute. */ + switch (cipher_xform->cipher.key.length) { + case 16: + aesni_gcm128_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_128; - /* Calculate hash sub key here */ - aesni_gcm_calculate_hash_sub_key(hsubkey, sizeof(hsubkey), - cipher_xform->cipher.key.data, - cipher_xform->cipher.key.length); + break; + case 32: + aesni_gcm256_pre(cipher_xform->cipher.key.data, &sess->gdata); + sess->key = AESNI_GCM_KEY_256; - /* Calculate GCM pre-compute */ - (*gcm_ops->gcm.precomp)(&sess->gdata, hsubkey); + break; + default: + GCM_LOG_ERR("Unsupported cipher key length"); + return -EINVAL; + } return 0; } @@ -173,10 +155,10 @@ return sess; sess = (struct aesni_gcm_session *) - ((struct rte_cryptodev_session *)_sess)->_private; + ((struct rte_cryptodev_sym_session *)_sess)->_private; - if (unlikely(aesni_gcm_set_session_parameters(qp->ops, - sess, op->xform) != 0)) { + if (unlikely(aesni_gcm_set_session_parameters(sess, + op->xform) != 0)) { rte_mempool_put(qp->sess_mp, _sess); sess = NULL; } @@ -196,19 +178,45 @@ * */ static int -process_gcm_crypto_op(struct aesni_gcm_qp *qp, struct rte_crypto_sym_op *op, +process_gcm_crypto_op(struct rte_crypto_sym_op *op, struct aesni_gcm_session *session) { uint8_t *src, *dst; - struct rte_mbuf *m = op->m_src; + struct rte_mbuf *m_src = op->m_src; + uint32_t offset = op->cipher.data.offset; + uint32_t part_len, total_len, data_len; + + RTE_ASSERT(m_src != NULL); + + while (offset >= m_src->data_len) { + offset -= m_src->data_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + } + + data_len = m_src->data_len - offset; + part_len = (data_len < op->cipher.data.length) ? 
data_len : + op->cipher.data.length; + + /* Destination buffer is required when segmented source buffer */ + RTE_ASSERT((part_len == op->cipher.data.length) || + ((part_len != op->cipher.data.length) && + (op->m_dst != NULL))); + /* Segmented destination buffer is not supported */ + RTE_ASSERT((op->m_dst == NULL) || + ((op->m_dst != NULL) && + rte_pktmbuf_is_contiguous(op->m_dst))); + - src = rte_pktmbuf_mtod(m, uint8_t *) + op->cipher.data.offset; dst = op->m_dst ? rte_pktmbuf_mtod_offset(op->m_dst, uint8_t *, op->cipher.data.offset) : - rte_pktmbuf_mtod_offset(m, uint8_t *, + rte_pktmbuf_mtod_offset(op->m_src, uint8_t *, op->cipher.data.offset); + src = rte_pktmbuf_mtod_offset(m_src, uint8_t *, offset); + /* sanity checks */ if (op->cipher.iv.length != 16 && op->cipher.iv.length != 12 && op->cipher.iv.length != 0) { @@ -225,48 +233,81 @@ *iv_padd = rte_bswap32(1); } - if (op->auth.aad.length != 12 && op->auth.aad.length != 8 && - op->auth.aad.length != 0) { - GCM_LOG_ERR("iv"); - return -1; - } - if (op->auth.digest.length != 16 && op->auth.digest.length != 12 && - op->auth.digest.length != 8 && - op->auth.digest.length != 0) { - GCM_LOG_ERR("iv"); + op->auth.digest.length != 8) { + GCM_LOG_ERR("digest"); return -1; } if (session->op == AESNI_GCM_OP_AUTHENTICATED_ENCRYPTION) { - (*qp->ops->gcm.enc)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_enc[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_enc[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? + m_src->data_len : total_len; + + aesni_gcm_enc[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_enc[session->key].finalize(&session->gdata, op->auth.digest.data, (uint64_t)op->auth.digest.length); - } else if (session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION) { - uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(m, + } else { /* session->op == AESNI_GCM_OP_AUTHENTICATED_DECRYPTION */ + uint8_t *auth_tag = (uint8_t *)rte_pktmbuf_append(op->m_dst ? + op->m_dst : op->m_src, op->auth.digest.length); if (!auth_tag) { - GCM_LOG_ERR("iv"); + GCM_LOG_ERR("auth_tag"); return -1; } - (*qp->ops->gcm.dec)(&session->gdata, dst, src, - (uint64_t)op->cipher.data.length, + aesni_gcm_dec[session->key].init(&session->gdata, op->cipher.iv.data, op->auth.aad.data, - (uint64_t)op->auth.aad.length, + (uint64_t)op->auth.aad.length); + + aesni_gcm_dec[session->key].update(&session->gdata, dst, src, + (uint64_t)part_len); + total_len = op->cipher.data.length - part_len; + + while (total_len) { + dst += part_len; + m_src = m_src->next; + + RTE_ASSERT(m_src != NULL); + + src = rte_pktmbuf_mtod(m_src, uint8_t *); + part_len = (m_src->data_len < total_len) ? 
+ m_src->data_len : total_len; + + aesni_gcm_dec[session->key].update(&session->gdata, + dst, src, + (uint64_t)part_len); + total_len -= part_len; + } + + aesni_gcm_dec[session->key].finalize(&session->gdata, auth_tag, (uint64_t)op->auth.digest.length); - } else { - GCM_LOG_ERR("iv"); - return -1; } return 0; @@ -356,21 +397,7 @@ break; } -#ifdef RTE_LIBRTE_PMD_AESNI_GCM_DEBUG - if (!rte_pktmbuf_is_contiguous(ops[i]->sym->m_src) || - (ops[i]->sym->m_dst != NULL && - !rte_pktmbuf_is_contiguous( - ops[i]->sym->m_dst))) { - ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; - GCM_LOG_ERR("PMD supports only contiguous mbufs, " - "op (%p) provides noncontiguous mbuf as " - "source/destination buffer.\n", ops[i]); - qp->qp_stats.enqueue_err_count++; - break; - } -#endif - - retval = process_gcm_crypto_op(qp, ops[i]->sym, sess); + retval = process_gcm_crypto_op(ops[i]->sym, sess); if (retval < 0) { ops[i]->status = RTE_CRYPTO_OP_STATUS_INVALID_ARGS; qp->qp_stats.enqueue_err_count++; @@ -406,7 +433,6 @@ { struct rte_cryptodev *dev; struct aesni_gcm_private *internals; - enum aesni_gcm_vector_mode vector_mode; if (init_params->name[0] == '\0') { int ret = rte_cryptodev_pmd_create_dev_name( @@ -425,18 +451,6 @@ return -EFAULT; } - /* Check CPU for supported vector instruction set */ - if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX2)) - vector_mode = RTE_AESNI_GCM_AVX2; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_AVX)) - vector_mode = RTE_AESNI_GCM_AVX; - else if (rte_cpu_get_flag_enabled(RTE_CPUFLAG_SSE4_1)) - vector_mode = RTE_AESNI_GCM_SSE; - else { - GCM_LOG_ERR("Vector instructions are not supported by CPU"); - return -EFAULT; - } - dev = rte_cryptodev_pmd_virtual_dev_init(init_params->name, sizeof(struct aesni_gcm_private), init_params->socket_id); if (dev == NULL) { @@ -453,27 +467,11 @@ dev->feature_flags = RTE_CRYPTODEV_FF_SYMMETRIC_CRYPTO | RTE_CRYPTODEV_FF_SYM_OPERATION_CHAINING | - RTE_CRYPTODEV_FF_CPU_AESNI; + RTE_CRYPTODEV_FF_CPU_AESNI | + RTE_CRYPTODEV_FF_MBUF_SCATTER_GATHER; - switch (vector_mode) { - case RTE_AESNI_GCM_SSE: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_SSE; - break; - case RTE_AESNI_GCM_AVX: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX; - break; - case RTE_AESNI_GCM_AVX2: - dev->feature_flags |= RTE_CRYPTODEV_FF_CPU_AVX2; - break; - default: - break; - } - - /* Set vector instructions mode supported */ internals = dev->data->dev_private; - internals->vector_mode = vector_mode; - internals->max_nb_queue_pairs = init_params->max_nb_queue_pairs; internals->max_nb_sessions = init_params->max_nb_sessions; diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c index c51f82a..2362006 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_ops.c @@ -39,17 +39,17 @@ #include "aesni_gcm_pmd_private.h" static const struct rte_cryptodev_capabilities aesni_gcm_pmd_capabilities[] = { - { /* AES GCM (AUTH) */ + { /* AES GMAC (AUTH) */ .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, {.sym = { .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, {.auth = { - .algo = RTE_CRYPTO_AUTH_AES_GCM, + .algo = RTE_CRYPTO_AUTH_AES_GMAC, .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .digest_size = { .min = 8, @@ -57,9 +57,34 @@ .increment = 4 }, .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 + } + }, } + }, } + }, + { /* AES GCM (AUTH) */ + .op = RTE_CRYPTO_OP_TYPE_SYMMETRIC, + {.sym = { + .xform_type = RTE_CRYPTO_SYM_XFORM_AUTH, + {.auth = { + 
.algo = RTE_CRYPTO_AUTH_AES_GCM, + .block_size = 16, + .key_size = { + .min = 16, + .max = 32, + .increment = 16 + }, + .digest_size = { .min = 8, - .max = 12, + .max = 16, .increment = 4 + }, + .aad_size = { + .min = 0, + .max = 65535, + .increment = 1 } }, } }, } @@ -73,8 +98,8 @@ .block_size = 16, .key_size = { .min = 16, - .max = 16, - .increment = 0 + .max = 32, + .increment = 16 }, .iv_size = { .min = 12, @@ -221,7 +246,6 @@ int socket_id) { struct aesni_gcm_qp *qp = NULL; - struct aesni_gcm_private *internals = dev->data->dev_private; /* Free memory prior to re-allocation if needed. */ if (dev->data->queue_pairs[qp_id] != NULL) @@ -239,8 +263,6 @@ if (aesni_gcm_pmd_qp_set_unique_name(dev, qp)) goto qp_setup_cleanup; - qp->ops = &gcm_ops[internals->vector_mode]; - qp->processed_pkts = aesni_gcm_pmd_qp_create_processed_pkts_ring(qp, qp_conf->nb_descriptors, socket_id); if (qp->processed_pkts == NULL) @@ -291,18 +313,15 @@ /** Configure a aesni gcm session from a crypto xform chain */ static void * -aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev, +aesni_gcm_pmd_session_configure(struct rte_cryptodev *dev __rte_unused, struct rte_crypto_sym_xform *xform, void *sess) { - struct aesni_gcm_private *internals = dev->data->dev_private; - if (unlikely(sess == NULL)) { GCM_LOG_ERR("invalid session struct"); return NULL; } - if (aesni_gcm_set_session_parameters(&gcm_ops[internals->vector_mode], - sess, xform) != 0) { + if (aesni_gcm_set_session_parameters(sess, xform) != 0) { GCM_LOG_ERR("failed configure session parameters"); return NULL; } diff --git a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h index 9878d6e..0496b44 100644 --- a/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h +++ b/drivers/crypto/aesni_gcm/aesni_gcm_pmd_private.h @@ -58,8 +58,6 @@ /** private data structure for each virtual AESNI GCM device */ struct aesni_gcm_private { - enum aesni_gcm_vector_mode vector_mode; - /**< Vector mode */ unsigned max_nb_queue_pairs; /**< Max number of queue pairs supported by device */ unsigned max_nb_sessions; @@ -71,8 +69,6 @@ struct aesni_gcm_qp { /**< Queue Pair Identifier */ char name[RTE_CRYPTODEV_NAME_LEN]; /**< Unique Queue Pair Name */ - const struct aesni_gcm_ops *ops; - /**< Architecture dependent function pointer table of the gcm APIs */ struct rte_ring *processed_pkts; /**< Ring for placing process packets */ struct rte_mempool *sess_mp; @@ -87,10 +83,17 @@ enum aesni_gcm_operation { AESNI_GCM_OP_AUTHENTICATED_DECRYPTION }; +enum aesni_gcm_key { + AESNI_GCM_KEY_128, + AESNI_GCM_KEY_256 +}; + /** AESNI GCM private session structure */ struct aesni_gcm_session { enum aesni_gcm_operation op; /**< GCM operation type */ + enum aesni_gcm_key key; + /**< GCM key type */ struct gcm_data gdata __rte_cache_aligned; /**< GCM parameters */ }; @@ -98,7 +101,6 @@ struct aesni_gcm_session { /** * Setup GCM session parameters - * @param ops gcm ops function pointer table * @param sess aesni gcm session structure * @param xform crypto transform chain * @@ -107,8 +109,7 @@ struct aesni_gcm_session { * - On failure returns error code < 0 */ extern int -aesni_gcm_set_session_parameters(const struct aesni_gcm_ops *ops, - struct aesni_gcm_session *sess, +aesni_gcm_set_session_parameters(struct aesni_gcm_session *sess, const struct rte_crypto_sym_xform *xform); diff --git a/mk/rte.app.mk b/mk/rte.app.mk index f75f0e2..ed3eab5 100644 --- a/mk/rte.app.mk +++ b/mk/rte.app.mk @@ -134,8 +134,7 @@ _LDLIBS-$(CONFIG_RTE_LIBRTE_VMXNET3_PMD) += 
-lrte_pmd_vmxnet3_uio ifeq ($(CONFIG_RTE_LIBRTE_CRYPTODEV),y) _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -lrte_pmd_aesni_mb _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_MB) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lcrypto -_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -L$(AESNI_MULTI_BUFFER_LIB_PATH) -lIPSec_MB +_LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_AESNI_GCM) += -lrte_pmd_aesni_gcm -lisal_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_OPENSSL) += -lrte_pmd_openssl -lcrypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_NULL_CRYPTO) += -lrte_pmd_null_crypto _LDLIBS-$(CONFIG_RTE_LIBRTE_PMD_QAT) += -lrte_pmd_qat -lcrypto -- 1.7.9.5 ^ permalink raw reply [flat|nested] 15+ messages in thread
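The second patch in the series, below, drives these new code paths through the standard cryptodev enqueue/dequeue interface. As a rough sketch of that flow (roughly what the tests' process_crypto_request() helper does), assuming device and queue pair 0 are already configured and the op has been prepared as above:

#include <rte_crypto.h>
#include <rte_cryptodev.h>

/*
 * Submit one prepared crypto op to the AES-NI GCM PMD and poll for its
 * completion. Returns 0 when the op completed successfully.
 */
static int
run_gcm_op(uint8_t dev_id, uint16_t qp_id, struct rte_crypto_op *op)
{
	struct rte_crypto_op *deq = NULL;

	if (rte_cryptodev_enqueue_burst(dev_id, qp_id, &op, 1) != 1)
		return -1;	/* PMD rejected the op (e.g. invalid arguments) */

	/* Busy-poll until the op comes back from the queue pair. */
	while (rte_cryptodev_dequeue_burst(dev_id, qp_id, &deq, 1) == 0)
		;

	return deq->status == RTE_CRYPTO_OP_STATUS_SUCCESS ? 0 : -1;
}

For the decryption cases the unit tests then check op->status in the same way to confirm that tag verification succeeded, reporting "GCM authentication failed" otherwise.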
* [dpdk-dev] [PATCH v6 2/2] app/test: add GCM additional tests 2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 0/2] " Piotr Azarewicz 2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 1/2] " Piotr Azarewicz @ 2017-01-17 11:19 ` Piotr Azarewicz 2017-01-18 12:37 ` [dpdk-dev] [PATCH v6 0/2] crypto/aesni_gcm: migration from MB library to ISA-L De Lara Guarch, Pablo 2 siblings, 0 replies; 15+ messages in thread From: Piotr Azarewicz @ 2017-01-17 11:19 UTC (permalink / raw) To: pablo.de.lara.guarch, dev Added new unit tests for AES-NI GCM PMD to verify new functionalities. Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com> Acked-by: Declan Doherty <declan.doherty@intel.com> --- v5 changes: - rebase on top of dpdk-next-crypto - fix typo v4 changes: - rebase on top of dpdk-next-crypto v3 changes: - rebase on top of dpdk-next-crypto v2 changes: - write unit test for session-less mode - write unit test for out-of place processing app/test/test_cryptodev.c | 753 ++++++++++++++++++++++++---- app/test/test_cryptodev_gcm_test_vectors.h | 491 +++++++++++++++++- 2 files changed, 1116 insertions(+), 128 deletions(-) diff --git a/app/test/test_cryptodev.c b/app/test/test_cryptodev.c index 5786fde..e8d1eae 100644 --- a/app/test/test_cryptodev.c +++ b/app/test/test_cryptodev.c @@ -4317,16 +4317,48 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +create_gcm_xforms(struct rte_crypto_op *op, + enum rte_crypto_cipher_operation cipher_op, + uint8_t *key, const uint8_t key_len, + const uint8_t aad_len, const uint8_t auth_len, + enum rte_crypto_auth_operation auth_op) +{ + TEST_ASSERT_NOT_NULL(rte_crypto_op_sym_xforms_alloc(op, 2), + "failed to allocate space for crypto transforms"); + + struct rte_crypto_sym_op *sym_op = op->sym; + + /* Setup Cipher Parameters */ + sym_op->xform->type = RTE_CRYPTO_SYM_XFORM_CIPHER; + sym_op->xform->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM; + sym_op->xform->cipher.op = cipher_op; + sym_op->xform->cipher.key.data = key; + sym_op->xform->cipher.key.length = key_len; + + TEST_HEXDUMP(stdout, "key:", key, key_len); + + /* Setup Authentication Parameters */ + sym_op->xform->next->type = RTE_CRYPTO_SYM_XFORM_AUTH; + sym_op->xform->next->auth.algo = RTE_CRYPTO_AUTH_AES_GCM; + sym_op->xform->next->auth.op = auth_op; + sym_op->xform->next->auth.digest_length = auth_len; + sym_op->xform->next->auth.add_auth_data_length = aad_len; + sym_op->xform->next->auth.key.length = 0; + sym_op->xform->next->auth.key.data = NULL; + sym_op->xform->next->next = NULL; + + return 0; +} + +static int create_gcm_operation(enum rte_crypto_cipher_operation op, - const uint8_t *auth_tag, const unsigned auth_tag_len, - const uint8_t *iv, const unsigned iv_len, - const uint8_t *aad, const unsigned aad_len, - const unsigned data_len, unsigned data_pad_len) + const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; - unsigned iv_pad_len = 0, aad_buffer_len; + uint8_t *plaintext, *ciphertext; + unsigned int iv_pad_len, aad_pad_len, plaintext_pad_len; /* Generate Crypto op data structure */ ut_params->op = rte_crypto_op_alloc(ts_params->op_mpool, @@ -4336,77 +4368,118 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct rte_crypto_sym_op *sym_op = ut_params->op->sym; - if (ut_params->obuf) { - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->obuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to 
append digest"); - sym_op->auth.digest.phys_addr = pktmbuf_mtophys_offset( - ut_params->obuf, data_pad_len); - } else { - sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( - ut_params->ibuf, auth_tag_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, - "no room to append digest"); - sym_op->auth.digest.phys_addr = pktmbuf_mtophys_offset( - ut_params->ibuf, data_pad_len); - } - sym_op->auth.digest.length = auth_tag_len; - - if (op == RTE_CRYPTO_CIPHER_OP_DECRYPT) { - rte_memcpy(sym_op->auth.digest.data, auth_tag, auth_tag_len); - TEST_HEXDUMP(stdout, "digest:", - sym_op->auth.digest.data, - sym_op->auth.digest.length); - } + /* Append aad data */ + aad_pad_len = RTE_ALIGN_CEIL(tdata->aad.len, 16); + sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + aad_pad_len); + TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, + "no room to append aad"); - /* iv */ - iv_pad_len = RTE_ALIGN_CEIL(iv_len, 16); + sym_op->auth.aad.length = tdata->aad.len; + sym_op->auth.aad.phys_addr = + rte_pktmbuf_mtophys(ut_params->ibuf); + memcpy(sym_op->auth.aad.data, tdata->aad.data, tdata->aad.len); + TEST_HEXDUMP(stdout, "aad:", sym_op->auth.aad.data, + sym_op->auth.aad.length); + /* Prepend iv */ + iv_pad_len = RTE_ALIGN_CEIL(tdata->iv.len, 16); sym_op->cipher.iv.data = (uint8_t *)rte_pktmbuf_prepend( ut_params->ibuf, iv_pad_len); TEST_ASSERT_NOT_NULL(sym_op->cipher.iv.data, "no room to prepend iv"); memset(sym_op->cipher.iv.data, 0, iv_pad_len); sym_op->cipher.iv.phys_addr = rte_pktmbuf_mtophys(ut_params->ibuf); - sym_op->cipher.iv.length = iv_len; + sym_op->cipher.iv.length = tdata->iv.len; - rte_memcpy(sym_op->cipher.iv.data, iv, iv_len); + rte_memcpy(sym_op->cipher.iv.data, tdata->iv.data, tdata->iv.len); + TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, + sym_op->cipher.iv.length); - /* - * Always allocate the aad up to the block size. - * The cryptodev API calls out - - * - the array must be big enough to hold the AAD, plus any - * space to round this up to the nearest multiple of the - * block size (16 bytes). 
- */ - aad_buffer_len = ALIGN_POW2_ROUNDUP(aad_len, 16); + /* Append plaintext/ciphertext */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, "no room to append plaintext"); - sym_op->auth.aad.data = (uint8_t *)rte_pktmbuf_prepend( - ut_params->ibuf, aad_buffer_len); - TEST_ASSERT_NOT_NULL(sym_op->auth.aad.data, - "no room to prepend aad"); - sym_op->auth.aad.phys_addr = rte_pktmbuf_mtophys( - ut_params->ibuf); - sym_op->auth.aad.length = aad_len; + memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); + TEST_HEXDUMP(stdout, "plaintext:", plaintext, + tdata->plaintext.len); - memset(sym_op->auth.aad.data, 0, aad_buffer_len); - rte_memcpy(sym_op->auth.aad.data, aad, aad_len); + if (ut_params->obuf) { + ciphertext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - TEST_HEXDUMP(stdout, "iv:", sym_op->cipher.iv.data, iv_pad_len); - TEST_HEXDUMP(stdout, "aad:", - sym_op->auth.aad.data, aad_len); + memset(ciphertext + aad_pad_len + iv_pad_len, 0, + tdata->ciphertext.len); + } + } else { + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); + ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, + plaintext_pad_len); + TEST_ASSERT_NOT_NULL(ciphertext, + "no room to append ciphertext"); - if (ut_params->obuf) { - rte_pktmbuf_prepend(ut_params->obuf, iv_pad_len); - rte_pktmbuf_prepend(ut_params->obuf, aad_buffer_len); + memcpy(ciphertext, tdata->ciphertext.data, + tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, + tdata->ciphertext.len); + + if (ut_params->obuf) { + plaintext = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf, + plaintext_pad_len + aad_pad_len + + iv_pad_len); + TEST_ASSERT_NOT_NULL(plaintext, + "no room to append plaintext"); + + memset(plaintext + aad_pad_len + iv_pad_len, 0, + tdata->plaintext.len); + } } - sym_op->cipher.data.length = data_len; - sym_op->cipher.data.offset = aad_buffer_len + iv_pad_len; + /* Append digest data */ + if (op == RTE_CRYPTO_CIPHER_OP_ENCRYPT) { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->obuf ? ut_params->obuf : + ut_params->ibuf, + tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + memset(sym_op->auth.digest.data, 0, tdata->auth_tag.len); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->obuf ? 
ut_params->obuf : + ut_params->ibuf, + plaintext_pad_len + + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; + } else { + sym_op->auth.digest.data = (uint8_t *)rte_pktmbuf_append( + ut_params->ibuf, tdata->auth_tag.len); + TEST_ASSERT_NOT_NULL(sym_op->auth.digest.data, + "no room to append digest"); + sym_op->auth.digest.phys_addr = rte_pktmbuf_mtophys_offset( + ut_params->ibuf, + plaintext_pad_len + aad_pad_len + iv_pad_len); + sym_op->auth.digest.length = tdata->auth_tag.len; - sym_op->auth.data.offset = aad_buffer_len + iv_pad_len; - sym_op->auth.data.length = data_len; + rte_memcpy(sym_op->auth.digest.data, tdata->auth_tag.data, + tdata->auth_tag.len); + TEST_HEXDUMP(stdout, "digest:", + sym_op->auth.digest.data, + sym_op->auth.digest.length); + } + + sym_op->cipher.data.length = tdata->plaintext.len; + sym_op->cipher.data.offset = aad_pad_len + iv_pad_len; + + sym_op->auth.data.length = tdata->plaintext.len; + sym_op->auth.data.offset = aad_pad_len + iv_pad_len; return 0; } @@ -4418,9 +4491,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext, *auth_tag; + uint8_t *ciphertext, *auth_tag; uint16_t plaintext_pad_len; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4431,31 +4504,20 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GCM_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); /* clear mbuf payload */ memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - /* - * Append data which is padded to a multiple - * of the algorithms block size - */ - plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); - - plaintext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - plaintext_pad_len); - memcpy(plaintext, tdata->plaintext.data, tdata->plaintext.len); - - TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->plaintext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->plaintext.len, plaintext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); if (retval < 0) return retval; @@ -4470,14 +4532,18 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, "crypto op processing failed"); + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + if (ut_params->op->sym->m_dst) { ciphertext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); auth_tag = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_dst, uint8_t *, plaintext_pad_len); } else { - ciphertext = plaintext; - auth_tag = plaintext + plaintext_pad_len; + ciphertext = rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; } TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, 
tdata->ciphertext.len); @@ -4543,15 +4609,68 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_encryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_encryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_encryption(&gcm_test_case_aad_2); +} + +static int test_mb_AES_GCM_authenticated_decryption(const struct gcm_test_data *tdata) { struct crypto_testsuite_params *ts_params = &testsuite_params; struct crypto_unittest_params *ut_params = &unittest_params; int retval; - - uint8_t *plaintext, *ciphertext; - uint16_t ciphertext_pad_len; + uint8_t *plaintext; + uint32_t i; /* Create GCM session */ retval = create_gcm_session(ts_params->valid_devs[0], @@ -4562,31 +4681,23 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) if (retval < 0) return retval; - /* alloc mbuf and set payload */ - ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + if (tdata->aad.len > MBUF_SIZE) { + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->large_mbuf_pool); + /* Populate full size of add data */ + for (i = 32; i < GCM_MAX_AAD_LENGTH; i += 32) + memcpy(&tdata->aad.data[i], &tdata->aad.data[0], 32); + } else + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, rte_pktmbuf_tailroom(ut_params->ibuf)); - ciphertext_pad_len = RTE_ALIGN_CEIL(tdata->ciphertext.len, 16); - - ciphertext = (uint8_t *)rte_pktmbuf_append(ut_params->ibuf, - ciphertext_pad_len); - memcpy(ciphertext, tdata->ciphertext.data, tdata->ciphertext.len); - - TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); - - /* Create GCM opertaion */ - retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, - tdata->auth_tag.data, tdata->auth_tag.len, - tdata->iv.data, tdata->iv.len, - tdata->aad.data, tdata->aad.len, - tdata->ciphertext.len, ciphertext_pad_len); + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); if (retval < 0) return retval; - rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); ut_params->op->sym->m_src = ut_params->ibuf; @@ -4602,7 +4713,9 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) plaintext = rte_pktmbuf_mtod(ut_params->op->sym->m_dst, uint8_t *); else - plaintext = ciphertext; + plaintext = 
rte_pktmbuf_mtod_offset(ut_params->op->sym->m_src, + uint8_t *, + ut_params->op->sym->cipher.data.offset); TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); @@ -4662,6 +4775,358 @@ static int test_snow3g_decryption_oop(const struct snow3g_test_data *tdata) } static int +test_mb_AES_GCM_auth_decryption_test_case_256_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_2); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_3(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_3); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_4(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_4); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_5(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_5); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_6(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_6); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_256_7(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_256_7); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_1(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_1); +} + +static int +test_mb_AES_GCM_auth_decryption_test_case_aad_2(void) +{ + return test_mb_AES_GCM_authenticated_decryption(&gcm_test_case_aad_2); +} + +static int +test_AES_GCM_authenticated_encryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + 
tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_oop(void) +{ + return test_AES_GCM_authenticated_encryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_oop(const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + + /* Create GCM session */ + retval = create_gcm_session(ts_params->valid_devs[0], + RTE_CRYPTO_CIPHER_OP_DECRYPT, + tdata->key.data, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + ut_params->obuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + memset(rte_pktmbuf_mtod(ut_params->obuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->obuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + rte_crypto_op_attach_sym_session(ut_params->op, ut_params->sess); + + ut_params->op->sym->m_src = ut_params->ibuf; + ut_params->op->sym->m_dst = ut_params->obuf; + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op processing failed"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->obuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_oop(void) +{ + return test_AES_GCM_authenticated_decryption_oop(&gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_encryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *ciphertext, *auth_tag; + uint16_t plaintext_pad_len; + uint8_t key[tdata->key.len + 1]; + + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + /* clear mbuf payload */ + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_ENCRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_ENCRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_GENERATE); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto 
op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext_pad_len = RTE_ALIGN_CEIL(tdata->plaintext.len, 16); + + ciphertext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + auth_tag = ciphertext + plaintext_pad_len; + + TEST_HEXDUMP(stdout, "ciphertext:", ciphertext, tdata->ciphertext.len); + TEST_HEXDUMP(stdout, "auth tag:", auth_tag, tdata->auth_tag.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + ciphertext, + tdata->ciphertext.data, + tdata->ciphertext.len, + "GCM Ciphertext data not as expected"); + + TEST_ASSERT_BUFFERS_ARE_EQUAL( + auth_tag, + tdata->auth_tag.data, + tdata->auth_tag.len, + "GCM Generated auth tag not as expected"); + + return 0; + +} + +static int +test_mb_AES_GCM_authenticated_encryption_sessionless(void) +{ + return test_AES_GCM_authenticated_encryption_sessionless( + &gcm_test_case_5); +} + +static int +test_AES_GCM_authenticated_decryption_sessionless( + const struct gcm_test_data *tdata) +{ + struct crypto_testsuite_params *ts_params = &testsuite_params; + struct crypto_unittest_params *ut_params = &unittest_params; + + int retval; + uint8_t *plaintext; + uint8_t key[tdata->key.len + 1]; + + /* alloc mbuf and set payload */ + ut_params->ibuf = rte_pktmbuf_alloc(ts_params->mbuf_pool); + + memset(rte_pktmbuf_mtod(ut_params->ibuf, uint8_t *), 0, + rte_pktmbuf_tailroom(ut_params->ibuf)); + + /* Create GCM operation */ + retval = create_gcm_operation(RTE_CRYPTO_CIPHER_OP_DECRYPT, tdata); + if (retval < 0) + return retval; + + /* Create GCM xforms */ + memcpy(key, tdata->key.data, tdata->key.len); + retval = create_gcm_xforms(ut_params->op, + RTE_CRYPTO_CIPHER_OP_DECRYPT, + key, tdata->key.len, + tdata->aad.len, tdata->auth_tag.len, + RTE_CRYPTO_AUTH_OP_VERIFY); + if (retval < 0) + return retval; + + ut_params->op->sym->m_src = ut_params->ibuf; + + TEST_ASSERT_EQUAL(ut_params->op->sym->sess_type, + RTE_CRYPTO_SYM_OP_SESSIONLESS, + "crypto op session type not sessionless"); + + /* Process crypto operation */ + TEST_ASSERT_NOT_NULL(process_crypto_request(ts_params->valid_devs[0], + ut_params->op), "failed to process sym crypto op"); + + TEST_ASSERT_NOT_NULL(ut_params->op, "failed crypto process"); + + TEST_ASSERT_EQUAL(ut_params->op->status, RTE_CRYPTO_OP_STATUS_SUCCESS, + "crypto op status not success"); + + plaintext = rte_pktmbuf_mtod_offset(ut_params->ibuf, uint8_t *, + ut_params->op->sym->cipher.data.offset); + + TEST_HEXDUMP(stdout, "plaintext:", plaintext, tdata->ciphertext.len); + + /* Validate obuf */ + TEST_ASSERT_BUFFERS_ARE_EQUAL( + plaintext, + tdata->plaintext.data, + tdata->plaintext.len, + "GCM plaintext data not as expected"); + + TEST_ASSERT_EQUAL(ut_params->op->status, + RTE_CRYPTO_OP_STATUS_SUCCESS, + "GCM authentication failed"); + return 0; +} + +static int +test_mb_AES_GCM_authenticated_decryption_sessionless(void) +{ + return test_AES_GCM_authenticated_decryption_sessionless( + &gcm_test_case_5); +} + +static int test_stats(void) { struct crypto_testsuite_params *ts_params = &testsuite_params; @@ -7102,6 +7567,86 @@ struct test_crypto_vector { TEST_CASE_ST(ut_setup, ut_teardown, test_mb_AES_GCM_authenticated_decryption_test_case_7), + /** AES GCM 
Authenticated Encryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_256_7), + + /** AES GCM Authenticated Decryption 256 bits key */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_2), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_5), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_6), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_256_7), + + /** AES GCM Authenticated Encryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_encryption_test_case_aad_2), + + /** AES GCM Authenticated Decryption big aad size */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_auth_decryption_test_case_aad_2), + + /** AES GMAC Authentication */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_1), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_3), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_test_case_4), + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GMAC_authentication_verify_test_case_4), + + /** Negative tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_data_corrupt), + TEST_CASE_ST(ut_setup, ut_teardown, + authentication_verify_AES128_GMAC_fail_tag_corrupt), + + /** Out of place tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_oop), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_oop), + + /** Session-less tests */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_encryption_sessionless), + TEST_CASE_ST(ut_setup, ut_teardown, + test_mb_AES_GCM_authenticated_decryption_sessionless), + + /** Scatter-Gather */ + TEST_CASE_ST(ut_setup, ut_teardown, + test_AES_GCM_auth_encrypt_SGL_out_of_place_400B_1seg), + TEST_CASES_END() /**< NULL terminate unit test array */ } }; diff --git a/app/test/test_cryptodev_gcm_test_vectors.h b/app/test/test_cryptodev_gcm_test_vectors.h index 45ea3d4..5764edb 100644 --- a/app/test/test_cryptodev_gcm_test_vectors.h +++ b/app/test/test_cryptodev_gcm_test_vectors.h @@ -33,7 +33,17 @@ #ifndef TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ #define TEST_CRYPTODEV_GCM_TEST_VECTORS_H_ -#define 
GMAC_LARGE_PLAINTEXT_LENGTH 65376 +#define GMAC_LARGE_PLAINTEXT_LENGTH 65344 +#define GCM_MAX_AAD_LENGTH 65536 +#define GCM_LARGE_AAD_LENGTH 65296 + +static uint8_t gcm_aad_zero_text[GCM_MAX_AAD_LENGTH] = { 0 }; + +static uint8_t gcm_aad_text[GCM_MAX_AAD_LENGTH] = { + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, + 0x00, 0xf1, 0xe2, 0xd3, 0xc4, 0xb5, 0xa6, 0x97, + 0x88, 0x79, 0x6a, 0x5b, 0x4c, 0x3d, 0x2e, 0x1f }; struct gcm_test_data { @@ -48,7 +58,7 @@ struct gcm_test_data { } iv; struct { - uint8_t data[64]; + uint8_t *data; unsigned len; } aad; @@ -111,7 +121,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -148,7 +158,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -186,7 +196,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { @@ -238,8 +248,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .data = gcm_aad_zero_text, .len = 8 }, .plaintext = { @@ -294,8 +303,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef }, + .data = gcm_aad_text, .len = 8 }, .plaintext = { @@ -351,10 +359,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, - 0x00, 0x00, 0x00, 0x00 - }, + .data = gcm_aad_zero_text, .len = 12 }, .plaintext = { @@ -409,10 +414,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -466,10 +468,7 @@ struct gmac_test_data { .len = 12 }, .aad = { - .data = { - 0xfe, 0xed, 0xfa, 0xce, 0xde, 0xad, 0xbe, 0xef, - 0xfe, 0xed, 0xfa, 0xce - }, + .data = gcm_aad_text, .len = 12 }, .plaintext = { @@ -1003,6 +1002,450 @@ struct gmac_test_data { } }; +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_1 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { 0x00 }, + .len = 0 + }, + .ciphertext = { + .data = { 0x00 }, + .len = 0 + }, + .auth_tag = { + .data = { + 0x53, 0x0F, 0x8A, 0xFB, 0xC7, 0x45, 0x36, 0xB9, + 0xA9, 0x63, 0xB4, 0xF1, 0xC4, 0xCB, 0x73, 0x8B }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_2 = { + .key = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 32 + }, + .iv = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, + 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 }, + .len = 16 + }, + .ciphertext = { + .data = { + 0xCE, 0xA7, 0x40, 0x3D, 
0x4D, 0x60, 0x6B, 0x6E, + 0x07, 0x4E, 0xC5, 0xD3, 0xBA, 0xF3, 0x9D, 0x18 }, + .len = 16 + }, + .auth_tag = { + .data = { + 0xD0, 0xD1, 0xC8, 0xA7, 0x99, 0x99, 0x6B, 0xF0, + 0x26, 0x5B, 0x98, 0xB5, 0xD4, 0x8A, 0xB9, 0x19 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_3 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 0 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E }, + .len = 64 + }, + .auth_tag = { + .data = { + 0x64, 0xAF, 0x1D, 0xFB, 0xE8, 0x0D, 0x37, 0xD8, + 0x92, 0xC3, 0xB9, 0x1D, 0xD3, 0x08, 0xAB, 0xFC }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_4 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x63, 0x16, 0x91, 0xAE, 0x17, 0x05, 0x5E, 0xA6, + 0x6D, 0x0A, 0x51, 0xE2, 0x50, 0x21, 0x85, 0x4A }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_5 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 
0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 8 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0xA7, 0x99, 0xAC, 0xB8, 0x27, 0xDA, 0xB1, 0x82, + 0x79, 0xFD, 0x83, 0x73, 0x52, 0x4D, 0xDB, 0xF1 }, + .len = 16 + } + +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_6 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_zero_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x5D, 0xA5, 0x0E, 0x53, 0x64, 0x7F, 0x3F, 0xAE, + 0x1A, 0x1F, 0xC0, 0xB0, 0xD8, 0xBE, 0xF2, 0x64 }, + .len = 16 + } +}; + +/** AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_256_7 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = 12 + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 
0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39 }, + .len = 60 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 0x47, 0xC8, 0x2E }, + .len = 60 + }, + .auth_tag = { + .data = { + 0x4E, 0xD0, 0x91, 0x95, 0x83, 0xA9, 0x38, 0x72, + 0x09, 0xA9, 0xCE, 0x5F, 0x89, 0x06, 0x4E, 0xC8 }, + .len = 16 + } +}; + +/** variable AAD AES-128 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_1 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08 }, + .len = 16 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GCM_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x42, 0x83, 0x1E, 0xC2, 0x21, 0x77, 0x74, 0x24, + 0x4B, 0x72, 0x21, 0xB7, 0x84, 0xD0, 0xD4, 0x9C, + 0xE3, 0xAA, 0x21, 0x2F, 0x2C, 0x02, 0xA4, 0xE0, + 0x35, 0xC1, 0x7E, 0x23, 0x29, 0xAC, 0xA1, 0x2E, + 0x21, 0xD5, 0x14, 0xB2, 0x54, 0x66, 0x93, 0x1C, + 0x7D, 0x8F, 0x6A, 0x5A, 0xAC, 0x84, 0xAA, 0x05, + 0x1B, 0xA3, 0x0B, 0x39, 0x6A, 0x0A, 0xAC, 0x97, + 0x3D, 0x58, 0xE0, 0x91, 0x47, 0x3F, 0x59, 0x85 + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xCA, 0x70, 0xAF, 0x96, 0xA8, 0x5D, 0x40, 0x47, + 0x0C, 0x3C, 0x48, 0xF5, 0xF0, 0xF5, 0xA5, 0x7D + }, + .len = 16 + } +}; + +/** variable AAD AES-256 Test Vectors */ +static const struct gcm_test_data gcm_test_case_aad_2 = { + .key = { + .data = { + 0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c, + 0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08, + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a }, + .len = 32 + }, + .iv = { + .data = { + 0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad, + 0xde, 0xca, 0xf8, 0x88 }, + .len = 12 + }, + .aad = { + .data = gcm_aad_text, + .len = GCM_LARGE_AAD_LENGTH + }, + .plaintext = { + .data = { + 0xd9, 0x31, 0x32, 0x25, 0xf8, 0x84, 0x06, 0xe5, + 0xa5, 0x59, 0x09, 0xc5, 0xaf, 0xf5, 0x26, 0x9a, + 0x86, 0xa7, 0xa9, 0x53, 0x15, 0x34, 0xf7, 0xda, + 0x2e, 0x4c, 0x30, 0x3d, 0x8a, 0x31, 0x8a, 0x72, + 0x1c, 0x3c, 0x0c, 0x95, 0x95, 0x68, 0x09, 0x53, + 0x2f, 0xcf, 0x0e, 0x24, 0x49, 0xa6, 0xb5, 0x25, + 0xb1, 0x6a, 0xed, 0xf5, 0xaa, 0x0d, 0xe6, 0x57, + 0xba, 0x63, 0x7b, 0x39, 0x1a, 0xaf, 0xd2, 0x55 }, + .len = 64 + }, + .ciphertext = { + .data = { + 0x05, 0xA2, 0x39, 0xA5, 0xE1, 0x1A, 0x74, 0xEA, + 0x6B, 0x2A, 0x55, 0xF6, 0xD7, 0x88, 0x44, 0x7E, + 0x93, 0x7E, 0x23, 0x64, 0x8D, 0xF8, 0xD4, 0x04, + 0x3B, 0x40, 0xEF, 0x6D, 0x7C, 0x6B, 0xF3, 0xB9, + 0x50, 0x15, 0x97, 0x5D, 0xB8, 0x28, 0xA1, 0xD5, + 0x22, 0xDE, 0x36, 0x26, 0xD0, 0x6A, 0x7A, 0xC0, + 0xB5, 0x14, 0x36, 0xAF, 0x3A, 0xC6, 0x50, 0xAB, + 0xFA, 
0x47, 0xC8, 0x2E, 0xF0, 0x68, 0xE1, 0x3E + }, + .len = 64 + }, + .auth_tag = { + .data = { + 0xBA, 0x06, 0xDA, 0xA1, 0x91, 0xE1, 0xFE, 0x22, + 0x59, 0xDA, 0x67, 0xAF, 0x9D, 0xA5, 0x43, 0x94 + }, + .len = 16 + } +}; + /** GMAC Test Vectors */ static uint8_t gmac_plaintext[GMAC_LARGE_PLAINTEXT_LENGTH] = { 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, @@ -1781,8 +2224,8 @@ struct cryptodev_perf_test_data { }, .gmac_tag = { .data = { - 0x88, 0x82, 0xb4, 0x93, 0x8f, 0x04, 0xcd, 0x06, - 0xfd, 0xac, 0x6d, 0x8b, 0x9c, 0x9e, 0x8f, 0xec + 0x3f, 0x07, 0xcb, 0xb9, 0x86, 0x3a, 0xea, 0xc2, + 0x2f, 0x3a, 0x2a, 0x93, 0xd8, 0x09, 0x6b, 0xda }, .len = 16 } @@ -1802,7 +2245,7 @@ struct cryptodev_perf_test_data { .len = 12 }, .aad = { - .data = { 0 }, + .data = gcm_aad_zero_text, .len = 0 }, .plaintext = { -- 1.7.9.5 ^ permalink raw reply [flat|nested] 15+ messages in thread
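The test-vector header change above switches gcm_test_data.aad.data from a fixed 64-byte array to a plain pointer, so large-AAD cases can reference the shared static gcm_aad_text / gcm_aad_zero_text buffers instead of embedding AAD bytes in every vector. As a rough illustration of that pattern (not part of the patch; the tag bytes below are placeholders and would have to be produced by a reference GCM implementation before such a vector could actually be used), a further vector could be declared like this:

/*
 * Hypothetical example only -- not part of the patch. Shows how an extra
 * large-AAD vector would reference the shared gcm_aad_text buffer now that
 * gcm_test_data.aad.data is a pointer rather than a fixed 64-byte array.
 * The auth_tag bytes are placeholders, not verified values.
 */
static const struct gcm_test_data gcm_test_case_aad_example = {
	.key = {
		.data = {
			0xfe, 0xff, 0xe9, 0x92, 0x86, 0x65, 0x73, 0x1c,
			0x6d, 0x6a, 0x8f, 0x94, 0x67, 0x30, 0x83, 0x08 },
		.len = 16
	},
	.iv = {
		.data = {
			0xca, 0xfe, 0xba, 0xbe, 0xfa, 0xce, 0xdb, 0xad,
			0xde, 0xca, 0xf8, 0x88 },
		.len = 12
	},
	.aad = {
		.data = gcm_aad_text,		/* shared buffer, no per-vector copy */
		.len = GCM_LARGE_AAD_LENGTH	/* 65296-byte AAD */
	},
	.plaintext = { .data = { 0x00 }, .len = 0 },
	.ciphertext = { .data = { 0x00 }, .len = 0 },
	.auth_tag = {
		/* placeholder tag -- must come from a reference implementation */
		.data = {
			0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
			0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
		.len = 16
	}
};

Such a vector would then be exercised by adding a matching TEST_CASE_ST(ut_setup, ut_teardown, ...) entry to the test suite array shown earlier in the diff.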
* Re: [dpdk-dev] [PATCH v6 0/2] crypto/aesni_gcm: migration from MB library to ISA-L
  2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 0/2] " Piotr Azarewicz
  2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 1/2] " Piotr Azarewicz
  2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 2/2] app/test: add GCM additional tests Piotr Azarewicz
@ 2017-01-18 12:37 ` De Lara Guarch, Pablo
  2 siblings, 0 replies; 15+ messages in thread
From: De Lara Guarch, Pablo @ 2017-01-18 12:37 UTC (permalink / raw)
To: Azarewicz, PiotrX T, dev

> -----Original Message-----
> From: Azarewicz, PiotrX T
> Sent: Tuesday, January 17, 2017 11:19 AM
> To: De Lara Guarch, Pablo; dev@dpdk.org
> Subject: [PATCH v6 0/2] crypto/aesni_gcm: migration from MB library to ISA-L
>
> The current Cryptodev AES-NI GCM PMD is implemented using the Multi Buffer
> Crypto library. This patch reimplements the device using the ISA-L Crypto
> library: https://github.com/01org/isa-l_crypto.
>
> The migration entailed adding support for:
> * GMAC algorithm.
> * 256-bit cipher keys.
> * Session-less mode.
> * Out-of-place processing.
> * Scatter-gather support for chained mbufs (out-of-place only; the
>   destination mbuf must be contiguous).
>
> Verified the existing unit tests and added new unit tests to verify the new
> functionality.
>
> Signed-off-by: Piotr Azarewicz <piotrx.t.azarewicz@intel.com>
> Acked-by: Declan Doherty <declan.doherty@intel.com>

Applied to dpdk-next-crypto.

Thanks,
Pablo

^ permalink raw reply [flat|nested] 15+ messages in thread
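For context on the feature list in the cover letter quoted above, the sketch below shows in rough terms how an application could request AES-256-GCM encryption through the symmetric-crypto xform fields exercised by the unit tests in this patch set. It is illustrative only: session creation, mbuf setup and enqueue/dequeue are omitted, and the field names follow the 17.02-era rte_crypto_sym_xform API used in the test code.

#include <rte_cryptodev.h>

/*
 * Illustrative sketch only: chain a cipher transform and an auth transform to
 * request AES-GCM with a 256-bit key (newly supported by the ISA-L based PMD),
 * a 16-byte tag and a caller-chosen AAD length.
 */
static void
setup_aes256_gcm_xforms(struct rte_crypto_sym_xform *cipher_xform,
		struct rte_crypto_sym_xform *auth_xform,
		uint8_t *key_256, uint8_t aad_len)
{
	cipher_xform->type = RTE_CRYPTO_SYM_XFORM_CIPHER;
	cipher_xform->next = auth_xform;
	cipher_xform->cipher.algo = RTE_CRYPTO_CIPHER_AES_GCM;
	cipher_xform->cipher.op = RTE_CRYPTO_CIPHER_OP_ENCRYPT;
	cipher_xform->cipher.key.data = key_256;
	cipher_xform->cipher.key.length = 32;		/* 256-bit key */

	auth_xform->type = RTE_CRYPTO_SYM_XFORM_AUTH;
	auth_xform->next = NULL;
	auth_xform->auth.algo = RTE_CRYPTO_AUTH_AES_GCM;
	auth_xform->auth.op = RTE_CRYPTO_AUTH_OP_GENERATE;
	auth_xform->auth.digest_length = 16;		/* GCM tag length */
	auth_xform->auth.add_auth_data_length = aad_len; /* variable AAD length */
	auth_xform->auth.key.data = NULL;		/* GCM reuses the cipher key */
	auth_xform->auth.key.length = 0;
}

A session created from cipher_xform (the head of the chain) would then be attached to crypto operations in the usual way; decryption and tag verification use the same chain with the cipher and auth operation fields reversed.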
Thread overview: 15+ messages (thread ended 2017-01-18 12:37 UTC)
2016-12-27 18:53 [dpdk-dev] [PATCH v2] crypto/aesni_gcm: migration from MB library to ISA-L Michal Jastrzebski
2017-01-03 13:02 ` [dpdk-dev] [PATCH v3] " Piotr Azarewicz
2017-01-03 14:14 ` Thomas Monjalon
2017-01-05 13:51 ` [dpdk-dev] [PATCH v4] " Piotr Azarewicz
2017-01-05 15:12 ` Thomas Monjalon
2017-01-16 12:37 ` [dpdk-dev] [PATCH v5] " Piotr Azarewicz
2017-01-16 14:42 ` Thomas Monjalon
2017-01-16 15:20 ` Mcnamara, John
2017-01-16 14:43 ` Thomas Monjalon
2017-01-17  7:42 ` Azarewicz, PiotrX T
2017-01-16 16:05 ` Declan Doherty
2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 0/2] " Piotr Azarewicz
2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 1/2] " Piotr Azarewicz
2017-01-17 11:19 ` [dpdk-dev] [PATCH v6 2/2] app/test: add GCM additional tests Piotr Azarewicz
2017-01-18 12:37 ` [dpdk-dev] [PATCH v6 0/2] crypto/aesni_gcm: migration from MB library to ISA-L De Lara Guarch, Pablo