Azure Data Lake Storage Gen2 connection

Overview

Read this page to learn how to create a live outbound connection to your Azure Data Lake Storage Gen2 (ADLS Gen2) data lake to periodically export data files from Experience Platform.

Connect to your ADLS Gen2 storage through API or UI
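
You can set up the connection either through the Platform UI, as described in the sections below, or programmatically through the Flow Service API. The sketch below is a minimal, hypothetical example of the API route: the connection spec ID, auth spec name, and parameter keys are placeholders, so look up the actual values in the Flow Service API reference before using it.

```python
# Hypothetical sketch: create an ADLS Gen2 base connection via the Flow Service API.
# The connection spec ID, auth spec name, and parameter keys are PLACEHOLDERS.
import requests

FLOW_SERVICE = "https://platform.adobe.io/data/foundation/flowservice"

headers = {
    "Authorization": "Bearer {ACCESS_TOKEN}",   # IMS access token
    "x-api-key": "{API_KEY}",
    "x-gw-ims-org-id": "{ORG_ID}",
    "x-sandbox-name": "{SANDBOX_NAME}",
    "Content-Type": "application/json",
}

payload = {
    "name": "My ADLS Gen2 destination connection",
    "auth": {
        "specName": "<auth-spec-name>",          # placeholder: see the connection spec for the real name
        "params": {
            "url": "abfss://<container>@<accountname>.dfs.core.windows.net",
            "tenant": "<tenant-id>",
            "servicePrincipalId": "<client-id>",
            "servicePrincipalKey": "<client-secret>",
        },
    },
    "connectionSpec": {
        "id": "<adls-gen2-connection-spec-id>",  # placeholder: list specs via GET /connectionSpecs
        "version": "1.0",
    },
}

response = requests.post(f"{FLOW_SERVICE}/connections", headers=headers, json=payload)
response.raise_for_status()
print(response.json())  # contains the ID of the newly created base connection
```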

Supported audiences

This section describes which types of audiences you can export to this destination.

| Audience origin | Supported | Description |
| --- | --- | --- |
| Segmentation Service | ✓ | Audiences generated through the Experience Platform Segmentation Service. |
| Custom uploads | ✓ | Audiences imported into Experience Platform from CSV files. |

Export type and frequency

Refer to the table below for information about the destination export type and frequency.

| Item | Type | Notes |
| --- | --- | --- |
| Export type | Profile-based | You are exporting all members of a segment, together with the applicable schema fields (for example your PPID), as chosen in the select profile attributes screen of the destination activation workflow. |
| Export frequency | Batch | Batch destinations export files to downstream platforms in increments of three, six, eight, twelve, or twenty-four hours. Read more about batch file-based destinations. |

Export datasets

This destination supports dataset exports. For complete information on how to set up dataset exports, read the tutorials on exporting datasets using the Platform UI and the Flow Service API.

File format of the exported data

When exporting audience data, Platform creates a .csv, .parquet, or .json file in the storage location that you provided. For more information about the files, see the supported file formats for export section in the audience activation tutorial.

When exporting datasets, Platform creates a .parquet or .json file in the storage location that you provided. For more information about the files, see the verify successful dataset export section in the export datasets tutorial.
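
To make the formats concrete, the following is an illustrative sketch (not part of the product workflow) of reading exported files back with pandas. The file names are placeholders, and the newline-delimited JSON assumption should be checked against your actual exports.

```python
# Illustrative only: read back exported files with pandas (file names are placeholders).
import pandas as pd

audience_csv = pd.read_csv("export.csv")              # audience exports can be CSV...
audience_parquet = pd.read_parquet("export.parquet")  # ...or Parquet (also used for dataset exports)
# Assumption: JSON exports are newline-delimited; drop lines=True for a single JSON document.
records_json = pd.read_json("export.json", lines=True)

print(len(audience_csv), len(audience_parquet), len(records_json))
```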

Connect to the destination

IMPORTANT
To connect to the destination, you need the View Destinations and Manage Destinations access control permissions. Read the access control overview or contact your product administrator to obtain the required permissions.

To connect to this destination, follow the steps described in the destination configuration tutorial. In the destination configuration workflow, fill in the fields listed in the two sections below.

Authenticate to destination

To authenticate to the destination, fill in the required fields and select Connect to destination. A sketch for verifying these credentials outside of Platform follows the field list.

  • URL: The endpoint for Azure Data Lake Storage Gen2. The endpoint pattern is: abfss://<container>@<accountname>.dfs.core.windows.net.

  • Tenant: The tenant information that contains your application.

  • Service principal ID: The application’s client ID.

  • Service principal key: The application’s key.

  • Encryption key: Optionally, you can attach your RSA-formatted public key to add encryption to your exported files. View an example of a correctly formatted encryption key in the image below.

    Image showing an example of a correctly formatted encryption key in the UI
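
Before entering these values in the UI, you can optionally sanity-check the service principal with the Azure SDK for Python. This is a hedged sketch with placeholder names; it assumes the azure-identity and azure-storage-file-datalake packages are installed and that the service principal has a data-plane role (for example, Storage Blob Data Contributor) on the storage account.

```python
# Sketch: verify the service principal can reach the ADLS Gen2 account (all values are placeholders).
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",                  # "Tenant" field in the UI
    client_id="<service-principal-id>",       # "Service principal ID" field
    client_secret="<service-principal-key>",  # "Service principal key" field
)

service = DataLakeServiceClient(
    account_url="https://<accountname>.dfs.core.windows.net",
    credential=credential,
)

# Listing paths in the target container confirms that the endpoint and credentials work.
file_system = service.get_file_system_client(file_system="<container>")
for path in file_system.get_paths():
    print(path.name)
```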

Fill in destination details

To configure details for the destination, fill in the required and optional fields below. An asterisk next to a field in the UI indicates that the field is required.

  • Name: Fill in the preferred name for this destination.

  • Description: Optional. For example, you can mention which campaign you are using this destination for.

  • Folder path: Enter the path to the destination folder that will host the exported files.

  • File type: Select the format Experience Platform should use for the exported files. When selecting the CSV option, you can also configure the file formatting options.

  • Compression format: Select the compression type that Experience Platform should use for the exported files.

  • Include manifest file: Toggle this option on if you’d like the exports to include a manifest JSON file that contains information about the export location, export size, and more. The manifest is named using the format manifest-<<destinationId>>-<<dataflowRunId>>.json. View a sample manifest file and a parsing sketch after this list. The manifest file includes the following fields:

    • flowRunId: The dataflow run which generated the exported file.
    • scheduledTime: The time in UTC when the file was exported.
    • exportResults.sinkPath: The path in your storage location where the exported file is deposited.
    • exportResults.name: The name of the exported file.
    • size: The size of the exported file, in bytes.
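
For illustration, the sketch below shows what such a manifest might look like and how to read it. The values are invented for the example, and it assumes that exportResults is an array with one entry per exported file; check an actual manifest from your exports for the exact shape.

```python
# Illustrative sketch: parse a manifest file (the sample values are invented).
import json

sample = """
{
  "flowRunId": "<dataflow-run-id>",
  "scheduledTime": "2023-01-01T00:00:00Z",
  "exportResults": [
    {"sinkPath": "/your/folder/path", "name": "<exported-file>.csv", "size": 123456}
  ]
}
"""

manifest = json.loads(sample)
print("Dataflow run:", manifest["flowRunId"])
print("Exported at (UTC):", manifest["scheduledTime"])
# Assumption: exportResults holds one entry per exported file.
for result in manifest["exportResults"]:
    print(result["sinkPath"], result["name"], result["size"], "bytes")
```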

Enable alerts

You can enable alerts to receive notifications on the status of the dataflow to your destination. Select an alert from the list to subscribe. For more information on alerts, see the guide on subscribing to destinations alerts using the UI.

When you are finished providing details for your destination connection, select Next.

Activate audiences to this destination

IMPORTANT

See Activate audience data to batch profile export destinations for instructions on activating audiences to this destination.

Scheduling

In the Scheduling step, you can set up the export schedule for your Azure Data Lake Storage Gen2 destination and configure the name of your exported files.

Map attributes and identities

In the Mapping step, you can select which attribute and identity fields to export for your profiles. You can also choose to rename the headers in the exported file to any friendly name that you wish. For more information, view the mapping step in the activate batch destinations UI tutorial.

Validate successful data export

To verify that data has been exported successfully, check your Azure Data Lake Storage Gen2 storage and make sure that the exported files contain the expected profile populations.
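
As a convenience, the hedged sketch below (placeholder names throughout) lists the folder path you configured in the destination details and spot-checks the row count of one CSV export with the Azure SDK and pandas.

```python
# Sketch: list exported files and spot-check one export (all names are placeholders).
import io

import pandas as pd
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential("<tenant-id>", "<service-principal-id>", "<service-principal-key>")
service = DataLakeServiceClient("https://<accountname>.dfs.core.windows.net", credential=credential)
file_system = service.get_file_system_client("<container>")

# List everything under the folder path configured in the destination details.
names = [p.name for p in file_system.get_paths(path="<folder-path>")]
print("\n".join(names))

# Download one CSV export and confirm it contains the expected profile population.
csv_name = next(n for n in names if n.endswith(".csv"))
data = file_system.get_file_client(csv_name).download_file().readall()
df = pd.read_csv(io.BytesIO(data))
print(f"{csv_name}: {len(df)} profiles exported")
```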
