[Beta]{class="badge informative"}
Create an Acxiom Data Ingestion source connection and dataflow in the UI
Use the Acxiom Data Ingestion source to ingest Acxiom data into Real-Time Customer Data Platform and enrich first-party profiles. Then, you can use your Acxiom-enriched first-party profiles to improve audiences and activate across marketing channels.
Read this tutorial to learn how to create an Acxiom Data Ingestion source connection and dataflow using the ÃÛ¶¹ÊÓƵ Experience Platform user interface. The Acxiom Data Ingestion source is used to retrieve and map responses from the Acxiom enhancement service, using Amazon S3 as a drop point.
Prerequisites
This tutorial requires a working understanding of the following components of Experience Platform:
- Experience Data Model (XDM) System: The standardized framework by which Experience Platform organizes customer experience data.
  - Basics of schema composition: Learn about the basic building blocks of XDM schemas, including key principles and best practices in schema composition.
  - Schema Editor tutorial: Learn how to create custom schemas using the Schema Editor UI.
- Real-Time Customer Profile: Provides a unified, real-time consumer profile based on aggregated data from multiple sources.
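For context, the sketch below shows what a minimal schema-creation payload for the Schema Registry API might look like when basing a schema on the XDM Individual Profile class. The title and description are illustrative assumptions, not values from this tutorial.

```python
# A minimal sketch of a schema-creation payload for the Schema Registry API
# (POST /data/foundation/schemaregistry/tenant/schemas). The title and
# description are illustrative; the $ref bases the schema on the
# XDM Individual Profile class.
schema_payload = {
    "title": "Acxiom Enrichment Schema",
    "description": "Profile schema for Acxiom-enriched attributes.",
    "type": "object",
    "allOf": [
        {"$ref": "https://ns.adobe.com/xdm/context/profile"},
    ],
}
```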
Gather required credentials
To access your bucket on Experience Platform, you must provide valid values for your Acxiom credentials.
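As a hedged illustration only, the sketch below shows the kind of values involved. Because the source uses Amazon S3 as a drop point, S3-style fields are assumed here; consult the Acxiom Data Ingestion overview for the authoritative credential names.

```python
# Hypothetical credential fields for an Acxiom Data Ingestion account.
# The field names below are assumptions based on the source using an
# Amazon S3 drop point; check the source documentation for the real list.
acxiom_credentials = {
    "accessKey": "<YOUR_ACCESS_KEY>",   # assumed: S3-style access key
    "secretKey": "<YOUR_SECRET_KEY>",   # assumed: S3-style secret key
    "bucketName": "<YOUR_BUCKET>",      # assumed: the S3 drop-point bucket
}
```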
Connect your Acxiom account
In the Platform UI, select Sources from the left navigation bar to access the Sources workspace. The Catalog screen displays a variety of sources that you can create an account with.
You can select the appropriate category from the catalog on the left-hand side of your screen. Alternatively, you can find the specific source you wish to work with using the search option.
Under the Data & Identity Partners category, select Acxiom Data Ingestion and then select Set up.
Create a new account
If you are using new credentials, select New account. On the input form that appears, provide a name, an optional description, and your Acxiom credentials. When finished, select Connect to source and then allow some time for the new connection to establish.
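If you prefer to script this step, the same account can be created through the Flow Service API. Below is a minimal Python sketch, assuming the documented POST /connections endpoint; the connection spec ID, auth spec name, and credential field names are placeholders rather than confirmed Acxiom values.

```python
import requests

# Sketch of creating a base connection via the Flow Service API
# (POST /connections). Spec IDs can be looked up with GET /connectionSpecs.
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<ORG_ID>",
    "x-sandbox-name": "prod",
    "Content-Type": "application/json",
}

payload = {
    "name": "Acxiom Data Ingestion account",
    "description": "Base connection for Acxiom enrichment files",
    "connectionSpec": {"id": "<ACXIOM_CONNECTION_SPEC_ID>", "version": "1.0"},
    "auth": {
        "specName": "<AUTH_SPEC_NAME>",  # placeholder auth spec name
        "params": {
            "accessKey": "<YOUR_ACCESS_KEY>",   # assumed field names
            "secretKey": "<YOUR_SECRET_KEY>",
            "bucketName": "<YOUR_BUCKET>",
        },
    },
}

resp = requests.post(
    "https://platform.adobe.io/data/foundation/flowservice/connections",
    headers=headers,
    json=payload,
)
resp.raise_for_status()
base_connection_id = resp.json()["id"]  # referenced by later steps
print(base_connection_id)
```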
Use an existing account
To use an existing account, select Existing account.
Select an account from the list to view details on that account. Once you have selected an account, select Next to proceed.
Select data
Select the file that you want to ingest from the desired bucket and sub-directory. A preview of the data is provided once the delimiter and compression type are defined. Once you have selected your file, select Next to proceed.
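To illustrate what the preview does with these two settings, the local Python sketch below reads the first few rows of a hypothetical gzip-compressed, pipe-delimited response file. The file name and delimiter are assumptions for demonstration only.

```python
import csv
import gzip

# Local illustration of the preview step once delimiter and compression
# type are set: read the first few rows of a (hypothetical)
# gzip-compressed, pipe-delimited response file.
with gzip.open("acxiom_enhancement_response.csv.gz", "rt", newline="") as f:
    reader = csv.reader(f, delimiter="|")
    for row_number, row in enumerate(reader):
        print(row)
        if row_number >= 4:  # header plus four data rows
            break
```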
Provide dataset and dataflow details
Next, you must provide information regarding your dataset and your dataflow.
Dataset details
A dataset is a storage and management construct for a collection of data, typically a table, that contains a schema (columns) and fields (rows). Data that is successfully ingested into Experience Platform is persisted within the data lake as datasets. To use a new dataset, select New dataset.
| New dataset details | Description |
| --- | --- |
| Output dataset name | The name of the new dataset. |
| Description | (Optional) A brief explanation of the purpose of the dataset. |
| Schema | A dropdown list of schemas that exist in your organization. You can also create your own schema prior to the source configuration process. For more information, read the guide on creating a schema in the UI. |
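For reference, dataset creation can also be scripted against the Catalog Service API. The sketch below is a minimal Python example; the tenant ID and schema ID inside schemaRef are placeholders for values from your own organization.

```python
import requests

# Sketch of dataset creation via the Catalog Service API
# (POST /data/foundation/catalog/dataSets).
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<ORG_ID>",
    "x-sandbox-name": "prod",
    "Content-Type": "application/json",
}

dataset_payload = {
    "name": "Acxiom Enrichment Dataset",
    "schemaRef": {
        "id": "https://ns.adobe.com/<TENANT_ID>/schemas/<SCHEMA_ID>",
        "contentType": "application/vnd.adobe.xed-full+json;version=1",
    },
}

resp = requests.post(
    "https://platform.adobe.io/data/foundation/catalog/dataSets",
    headers=headers,
    json=dataset_payload,
)
resp.raise_for_status()
print(resp.json())  # the response contains the new dataset ID
```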
To use an existing dataset, select Existing dataset.
You can select Advanced search to view a window of all datasets in your organization, including their respective details, such as whether they are enabled for ingestion to Real-Time Customer Profile.
If your dataset is enabled for Real-Time Customer Profile, then during this step, you can toggle Profile dataset to enable your data for Profile ingestion. You can also use this step to enable Error diagnostics and Partial ingestion.
- Error diagnostics: Select Error diagnostics to instruct the source to produce error diagnostics that you can later reference when monitoring your dataset activity and dataflow status.
- Partial ingestion: Partial batch ingestion is the ability to ingest data containing errors, up to a certain configurable threshold. This feature allows you to successfully ingest all of your accurate data into Experience Platform, while all of your incorrect data is batched separately with information on why it is invalid, as sketched below.
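For context, the sketch below shows how these two toggles surface as fields when a batch is created directly through the Batch Ingestion API. The dataset ID, input format, and 5 percent threshold are illustrative assumptions.

```python
# Sketch of the same two toggles as fields on a batch created through
# the Batch Ingestion API (POST /data/foundation/import/batches).
# With a 5 percent threshold, the batch fails only if more than
# 5 percent of its rows are invalid.
batch_payload = {
    "datasetId": "<DATASET_ID>",
    "inputFormat": {"format": "parquet"},
    "enableErrorDiagnostics": True,   # error diagnostics toggle
    "partialIngestionPercent": 5,     # partial ingestion threshold
}
```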
Dataflow details
Once your dataset is configured, you must then provide details on your dataflow, including a name, an optional description, and alert configurations.
Experience Platform can produce event-based alerts that users can subscribe to. These options allow a running dataflow to trigger them. For more information, read the alerts overview.
- Sources Dataflow Run Start: Select this alert to receive a notification when your dataflow run begins.
- Sources Dataflow Run Success: Select this alert to receive a notification if your dataflow ends without any errors.
- Sources Dataflow Run Failure: Select this alert to receive a notification if your dataflow run ends with any errors.
Mapping
Use the mapping interface to map your source data to the appropriate schema fields before ingesting data to Experience Platform. For more information, read the mapping guide in the UI.
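For reference, the same mappings can be expressed as a mapping set for the Data Prep API. The sketch below is illustrative; the source column names and XDM destination paths are assumptions, not fields from the actual Acxiom response.

```python
# Sketch of a mapping set payload for the Data Prep API
# (POST /data/foundation/conversion/mappingSets). Each mapping pairs a
# source column with an XDM destination path.
mapping_set_payload = {
    "xdmSchema": "https://ns.adobe.com/<TENANT_ID>/schemas/<SCHEMA_ID>",
    "xdmVersion": "1.0",
    "mappings": [
        {
            "sourceType": "ATTRIBUTE",
            "source": "first_name",                  # assumed source column
            "destination": "person.name.firstName",  # assumed XDM path
        },
        {
            "sourceType": "ATTRIBUTE",
            "source": "email",
            "destination": "personalEmail.address",
        },
    ],
}
```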
Schedule your dataflow ingestion
Next, use the scheduling interface to define the ingestion schedule of your dataflow.
Configure frequency to indicate how often the dataflow should run. You can set your frequency to:
- Once: Set your frequency to `once` to create a one-time ingestion. Configurations for interval and backfill are unavailable when creating a one-time ingestion dataflow. By default, the scheduling frequency is set to once.
- Minute: Set your frequency to `minute` to schedule your dataflow to ingest data on a per-minute basis.
- Hour: Set your frequency to `hour` to schedule your dataflow to ingest data on a per-hour basis.
- Day: Set your frequency to `day` to schedule your dataflow to ingest data on a per-day basis.
- Week: Set your frequency to `week` to schedule your dataflow to ingest data on a per-week basis.
Once you select a frequency, you can then configure the interval setting to establish the time frame between every ingestion. For example, if you set your frequency to day and configure the interval to 15, then your dataflow will run every 15 days. You cannot set the interval to zero. The minimum accepted interval value for each frequency is as follows (a sample schedule configuration is sketched after the list):
- Once: n/a
- Minute: 15
- Hour: 1
- Day: 1
- Week: 1
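As referenced above, here is a minimal sketch of the scheduleParams object that the Flow Service API accepts for a dataflow, assuming the documented startTime, frequency, and interval fields.

```python
import time

# Sketch of the scheduleParams object the Flow Service API accepts when
# a dataflow is created. startTime is in Unix epoch seconds; frequency
# and interval mirror the UI options described above.
schedule_params = {
    "startTime": int(time.time()),  # begin ingesting now
    "frequency": "day",
    "interval": 15,                 # run every 15 days
}
```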
Review your dataflow
Use the review page for a summary of your dataflow prior to ingestion. Details are grouped in the following categories:
- Connection - Shows the source type, the relevant path of the chosen source file, and the number of columns within that source file.
- Assign dataset & map fields - Shows which dataset the source data is being ingested into, including the schema that the dataset adheres to.
- Scheduling - Shows the active period, frequency, and interval of the ingestion schedule.
Once you have reviewed your dataflow, select Finish and allow some time for the dataflow to be created.
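Behind the scenes, selecting Finish amounts to a dataflow creation request. The Python sketch below approximates that call against the Flow Service API; the flow spec ID and connection IDs are placeholders gathered from the earlier steps, and the exact payload shape for the Acxiom source may differ.

```python
import requests

# Sketch of a dataflow creation call (POST /flows) combining the source
# connection, target connection, and schedule from the earlier steps.
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<ORG_ID>",
    "x-sandbox-name": "prod",
    "Content-Type": "application/json",
}

flow_payload = {
    "name": "Acxiom enrichment dataflow",
    "flowSpec": {"id": "<FLOW_SPEC_ID>", "version": "1.0"},
    "sourceConnectionIds": ["<SOURCE_CONNECTION_ID>"],
    "targetConnectionIds": ["<TARGET_CONNECTION_ID>"],
    "scheduleParams": {
        "startTime": "<EPOCH_SECONDS>",
        "frequency": "day",
        "interval": 1,
    },
}

resp = requests.post(
    "https://platform.adobe.io/data/foundation/flowservice/flows",
    headers=headers,
    json=flow_payload,
)
resp.raise_for_status()
flow_id = resp.json()["id"]
print(flow_id)
```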
Next steps
By following this tutorial, you have successfully created a dataflow to bring batch data from your Acxiom source to Experience Platform. For additional resources, visit the documentation outlined below.
Monitor your dataflow
Once your dataflow has been created, you can monitor the data that is being ingested through it to view information on ingestion rates, success, and errors. For more information on how to monitor dataflows, visit the tutorial on monitoring accounts and dataflows in the UI.
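If you prefer to script monitoring, run history can also be retrieved from the Flow Service API. The sketch below assumes the documented GET /runs endpoint filtered by flow ID; the response parsing is a best-effort assumption.

```python
import requests

# Sketch of listing run history for a dataflow via the Flow Service API
# (GET /runs filtered by flow ID), printing each run's status.
headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<ORG_ID>",
    "x-sandbox-name": "prod",
}

resp = requests.get(
    "https://platform.adobe.io/data/foundation/flowservice/runs",
    headers=headers,
    params={"property": "flowId==<FLOW_ID>"},
)
resp.raise_for_status()
for run in resp.json().get("items", []):
    print(run["id"], run.get("status"))
```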
Update your dataflow
To update configurations for your dataflow's scheduling, mapping, and general information, visit the tutorial on updating sources dataflows in the UI.
Delete your dataflow
You can use the Delete function available in the Dataflows workspace to delete dataflows that are no longer necessary or were incorrectly created. For more information on how to delete dataflows, visit the tutorial on deleting dataflows in the UI.
Additional resources
For more information, read the .