Encrypted data ingestion
You can ingest encrypted data files to ÃÛ¶¹ÊÓƵ Experience Platform using cloud storage batch sources. With encrypted data ingestion, you can leverage asymmetric encryption mechanisms to securely transfer batch data into Experience Platform. Currently, the supported asymmetric encryption mechanisms are PGP and GPG.
The encrypted data ingestion process is as follows:
- Create an encryption key pair using Experience Platform APIs. The encryption key pair consists of a private key and a public key. Once created, you can copy or download the public key, alongside its corresponding public key ID and expiry time. During this process, the private key is stored by Experience Platform in a secure vault. NOTE: The public key in the response is Base64-encoded and must be decoded prior to use.
- Use the public key to encrypt the data file that you want to ingest.
- Place your encrypted file in your cloud storage.
- Once the encrypted file is ready, create a source connection and a dataflow for your cloud storage source. During the flow creation step, you must provide an `encryption` parameter and include your public key ID.
- Experience Platform retrieves the private key from the secure vault to decrypt the data at the time of ingestion.
This document provides steps on how to generate an encryption key pair to encrypt your data, and how to ingest that encrypted data to Experience Platform using cloud storage sources.
Get started get-started
This tutorial requires you to have a working understanding of the following components of ÃÛ¶¹ÊÓƵ Experience Platform:
- Sources: Experience Platform allows data to be ingested from various sources while providing you with the ability to structure, label, and enhance incoming data using Platform services.
- Cloud storage sources: Create a dataflow to bring batch data from your cloud storage source to Experience Platform.
- Sandboxes: Experience Platform provides virtual sandboxes which partition a single Platform instance into separate virtual environments to help develop and evolve digital experience applications.
Using Platform APIs
For information on how to successfully make calls to Platform APIs, see the guide on getting started with Platform APIs.
Supported file extensions for encrypted files supported-file-extensions-for-encrypted-files
The supported file extensions for encrypted files are:
- .csv
- .tsv
- .json
- .parquet
- .csv.gpg
- .tsv.gpg
- .json.gpg
- .parquet.gpg
- .csv.pgp
- .tsv.pgp
- .json.pgp
- .parquet.pgp
- .gpg
- .pgp
Create encryption key pair create-encryption-key-pair
The first step in ingesting encrypted data to Experience Platform is to create your encryption key pair by making a POST request to the `/encryption/keys` endpoint of the Connectors API.
API format
POST /data/foundation/connectors/encryption/keys
Request
The following request generates an encryption key pair using the PGP encryption algorithm.
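The sketch below illustrates the shape such a request can take. The gateway host and header values are standard Platform API placeholders, and the key pair name and passphrase values are illustrative assumptions; the body parameters are described in the table that follows.

```shell
curl -X POST \
  'https://platform.adobe.io/data/foundation/connectors/encryption/keys' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
    "name": "acme-encryption-key-pair",
    "encryptionAlgorithm": "PGP",
    "params": {
      "passPhrase": "{PASSPHRASE}"
    }
  }'
```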
| Parameter | Description |
| --- | --- |
| `name` | The name of your encryption key pair. |
| `encryptionAlgorithm` | The type of encryption algorithm that you are using. The supported encryption types are `PGP` and `GPG`. |
| `params.passPhrase` | The passphrase provides an additional layer of protection for your encryption keys. Upon creation, Experience Platform stores the passphrase in a different secure vault from the public key. You must provide a non-empty string as a passphrase. |
Response
A successful response returns your Base64-encoded public key, public key ID, and the expiry time of your keys. The expiry time is automatically set to 180 days after the date of key generation and is currently not configurable.
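The response shape below is a sketch; the key, ID, and timestamp values are illustrative placeholders.

```json
{
  "publicKey": "LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0t...",
  "publicKeyId": "e31ae895-7896-469a-8e06-eb9207ddf1c2",
  "expiryTime": "1684843168"
}
```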
| Property | Description |
| --- | --- |
| `publicKey` | The public key is used to encrypt the data in your cloud storage. This key corresponds with the private key that was also created during this step. The private key, however, is stored by Experience Platform in a secure vault. |
| `publicKeyId` | The public key ID is used to create a dataflow and ingest your encrypted cloud storage data to Experience Platform. |
| `expiryTime` | The expiry time defines the expiration date of your encryption key pair. This date is automatically set to 180 days after the date of key generation and is displayed in unix timestamp format. |
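With the key pair created, you can decode the public key and use it to encrypt the file that you want to ingest before placing it in your cloud storage. The commands below are a minimal sketch using GnuPG; the file names are placeholders, and it assumes the imported key's user ID matches the name you gave the key pair.

```shell
# Decode the Base64-encoded public key returned by Experience Platform
base64 --decode publicKey.b64 > publicKey.asc

# Import the decoded public key into your local GnuPG keyring
gpg --import publicKey.asc

# Encrypt the file to ingest, addressing the imported key as the recipient
gpg --output File1.csv.gpg --encrypt --recipient "acme-encryption-key-pair" File1.csv
```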
Retrieve encryption keys retrieve-encryption-keys
To retrieve all encryption keys in your organization, make a GET request to the `/encryption/keys` endpoint.
API format
GET /data/foundation/connectors/encryption/keys
Request
The following request retrieves all encryption keys in your organization.
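A sketch of the request, using standard Platform API header placeholders:

```shell
curl -X GET \
  'https://platform.adobe.io/data/foundation/connectors/encryption/keys' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'
```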
Response
A successful response returns your encryption algorithm, name, public key, public key ID, key type, and the corresponding expiry time of your keys.
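The shape below is an illustrative sketch only; the exact response envelope may differ, and all values are placeholders.

```json
[
  {
    "name": "acme-encryption-key-pair",
    "encryptionAlgorithm": "PGP",
    "publicKey": "LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0t...",
    "publicKeyId": "e31ae895-7896-469a-8e06-eb9207ddf1c2",
    "keyType": "{KEY_TYPE}",
    "expiryTime": "1684843168"
  }
]
```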
Retrieve encryption keys by ID retrieve-encryption-keys-by-id
To retrieve a specific set of encryption keys, make a GET request to the `/encryption/keys` endpoint and append your public key ID to the request path.
API format
GET /data/foundation/connectors/encryption/keys/{PUBLIC_KEY_ID}
Request
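A sketch of the request; `{PUBLIC_KEY_ID}` is the ID returned when the key pair was created.

```shell
curl -X GET \
  'https://platform.adobe.io/data/foundation/connectors/encryption/keys/{PUBLIC_KEY_ID}' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'
```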
Response
A successful response returns your encryption algorithm, name, public key, public key ID, key type, and the corresponding expiry time of your keys.
Create customer managed key pair create-customer-managed-key-pair
You can optionally create a sign verification key pair to sign and ingest your encrypted data.
During this stage, you must generate your own private key and public key combination and then use your private key to sign your encrypted data. Next, you must encode your public key in Base64 and share it with Experience Platform so that Platform can verify your signature.
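The GnuPG commands below sketch this workflow under the assumption that you manage your keys with gpg; the user IDs and file names are placeholders.

```shell
# Generate your own key pair (follow the interactive prompts)
gpg --full-generate-key

# Sign the file with your private key and encrypt it with the Platform public key
gpg --output File1.csv.gpg --sign --encrypt \
  --local-user "your-signing-key" --recipient "acme-encryption-key-pair" File1.csv

# Export your public key and Base64-encode it for sharing with Experience Platform
gpg --armor --export "your-signing-key" | base64 > customer-public-key.b64
```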
Share your public key to Experience Platform
To share your public key, make a POST request to the `/customer-keys` endpoint while providing your encryption algorithm and your Base64-encoded public key.
API format
POST /data/foundation/connectors/encryption/customer-keys
Request
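A sketch of the request, with placeholder header and key values; the body parameters are described in the table that follows.

```shell
curl -X POST \
  'https://platform.adobe.io/data/foundation/connectors/encryption/customer-keys' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
    "encryptionAlgorithm": "PGP",
    "publicKey": "{BASE64_ENCODED_PUBLIC_KEY}"
  }'
```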
| Parameter | Description |
| --- | --- |
| `encryptionAlgorithm` | The type of encryption algorithm that you are using. The supported encryption types are `PGP` and `GPG`. |
| `publicKey` | The public key that corresponds to your customer managed keys used for signing your encrypted data. This key must be Base64-encoded. |
Response
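An illustrative sketch of the response; the ID value is a placeholder.

```json
{
  "publicKeyId": "3c5ab304-e753-4434-ba05-a578d2aa0e2f"
}
```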
| Property | Description |
| --- | --- |
| `publicKeyId` | This public key ID is returned in response to sharing your customer managed key with Experience Platform. You can provide this public key ID as the sign verification key ID when creating a dataflow for signed and encrypted data. |
Retrieve customer managed key pair
To retrieve your customer managed keys, make a GET request to the `/customer-keys` endpoint.
API format
GET /data/foundation/connectors/encryption/customer-keys
Request
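A sketch of the request:

```shell
curl -X GET \
  'https://platform.adobe.io/data/foundation/connectors/encryption/customer-keys' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'
```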
Response
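An illustrative sketch of the response shape; the exact envelope may differ, and all values are placeholders.

```json
[
  {
    "encryptionAlgorithm": "PGP",
    "publicKey": "LS0tLS1CRUdJTiBQR1AgUFVCTElDIEtFWSBCTE9DSy0tLS0t...",
    "publicKeyId": "3c5ab304-e753-4434-ba05-a578d2aa0e2f"
  }
]
```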
Connect your cloud storage source to Experience Platform using the Flow Service API
Once you have retrieved your encryption key pair, you can create a source connection for your cloud storage source and bring your encrypted data to Platform.
First, you must create a base connection to authenticate your source against Platform. To create a base connection and authenticate your source, follow the base connection tutorial for the cloud storage source that you want to use.
After creating a base connection, you must then follow the steps outlined in the tutorial for creating a source connection for a cloud storage source in order to create a source connection, a target connection, and a mapping.
Create a dataflow for encrypted data create-a-dataflow-for-encrypted-data
To create a dataflow, make a POST request to the `/flows` endpoint of the Flow Service API. To ingest encrypted data, you must add an `encryption` section to the `transformations` property and include the `publicKeyId` that was created in an earlier step.
API format
POST /flows
Request
The following request creates a dataflow to ingest encrypted data for a cloud storage source.
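The request below is a sketch only. The flow spec ID, connection IDs, mapping details, and schedule values are placeholders, and the exact name and placement of the encryption transformation keys are assumptions based on the description above; consult the Flow Service API reference for the authoritative structure.

```shell
curl -X POST \
  'https://platform.adobe.io/data/foundation/flowservice/flows' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
    "name": "ACME encrypted data dataflow",
    "flowSpec": {
      "id": "{FLOW_SPEC_ID}",
      "version": "1.0"
    },
    "sourceConnectionIds": ["{SOURCE_CONNECTION_ID}"],
    "targetConnectionIds": ["{TARGET_CONNECTION_ID}"],
    "transformations": [
      {
        "name": "Mapping",
        "params": {
          "mappingId": "{MAPPING_ID}",
          "mappingVersion": 0
        }
      },
      {
        "name": "Encryption",
        "params": {
          "publicKeyId": "{PUBLIC_KEY_ID}"
        }
      }
    ],
    "scheduleParams": {
      "startTime": "{START_TIME}",
      "frequency": "once"
    }
  }'
```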
Response
A successful response returns the ID (`id`) of the newly created dataflow of your encrypted data.
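A sketch of the response shape; the values are placeholders, and the presence of an `etag` property is an assumption.

```json
{
  "id": "f9377f50-607a-4818-b6c1-d08d547e2cd5",
  "etag": "\"2f0052a4-0000-0200-0000-637500330000\""
}
```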
If you signed your encrypted data with a customer managed key, you must also reference that key when creating the dataflow. Provide the public key ID that was returned when you shared your customer managed key with Experience Platform as the sign verification key ID in the `encryption` section of the request, as sketched below.
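The fragment below shows only the encryption entry of the `transformations` array for such a request; the `signVerificationKeyId` parameter name is an assumption and may differ in the Flow Service API reference.

```json
{
  "name": "Encryption",
  "params": {
    "publicKeyId": "{PUBLIC_KEY_ID}",
    "signVerificationKeyId": "{SIGN_VERIFICATION_KEY_ID}"
  }
}
```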
Delete encryption keys delete-encryption-keys
To delete your encryption keys, make a DELETE request to the `/encryption/keys` endpoint and append your public key ID to the request path.
API format
DELETE /data/foundation/connectors/encryption/keys/{PUBLIC_KEY_ID}
Request
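A sketch of the request:

```shell
curl -X DELETE \
  'https://platform.adobe.io/data/foundation/connectors/encryption/keys/{PUBLIC_KEY_ID}' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'
```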
Response
A successful response returns HTTP status 204 (No Content) and a blank body.
Validate encryption keys validate-encryption-keys
To validate your encryption keys, make a GET request to the `/encryption/keys/validate` endpoint and append the public key ID that you want to validate to the request path.
API format
GET /data/foundation/connectors/encryption/keys/validate/{PUBLIC_KEY_ID}
Request
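A sketch of the request:

```shell
curl -X GET \
  'https://platform.adobe.io/data/foundation/connectors/encryption/keys/validate/{PUBLIC_KEY_ID}' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'
```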
Response
A successful response returns a confirmation of whether your public key ID is valid or invalid.
A valid public key ID returns a status of `Active` along with your public key ID.
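An illustrative sketch of the response; the values are placeholders, and the property names follow the description above rather than a verified schema.

```json
{
  "publicKeyId": "e31ae895-7896-469a-8e06-eb9207ddf1c2",
  "status": "Active"
}
```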
An invalid public key ID returns a status of `Expired` along with your public key ID.
Restrictions on recurring ingestion restrictions-on-recurring-ingestion
Encrypted data ingestion does not support recursive or multi-level folder structures in sources. All encrypted files must be contained in a single folder. Wildcards with multiple folders in a single source path are also not supported.
The following is an example of a supported folder structure, where the source path is `/ACME-customers/*.csv.gpg`.
In this scenario, the files in bold are ingested into Experience Platform.
- ACME-customers
  - **File1.csv.gpg**
  - File2.json.gpg
  - **File3.csv.gpg**
  - File4.json
  - **File5.csv.gpg**
The following is an example of an unsupported folder structure, where the source path is `/ACME-customers/*`.
In this scenario, the flow run will fail and return an error message indicating that data cannot be copied from the source.
- ACME-customers
  - File1.csv.gpg
  - File2.json.gpg
  - Subfolder1
    - File3.csv.gpg
    - File4.json.gpg
    - File5.csv.gpg
- ACME-loyalty
  - File6.csv.gpg
Next steps
By following this tutorial, you have created an encryption key pair for your cloud storage data, and a dataflow to ingest your encrypted data using the Flow Service API. For status updates on your dataflow's completeness, errors, and metrics, read the guide on monitoring your dataflow using the Flow Service API.