
Advanced Data Lifecycle Management in Adobe Experience Platform

Adobe Experience Platform provides a robust set of tools to manage large, complicated data operations in order to orchestrate consumer experiences. As data is ingested into the system over time, it becomes increasingly important to manage your data stores so that data is used as expected, is updated when incorrect data needs correcting, and is deleted when organizational policies deem it necessary.

These activities can be performed using the Data Lifecycle UI workspace or the Data Hygiene API. When a data lifecycle job executes, the system provides transparency updates at each step of the process. See the section on timelines and transparency for more information on how each job type is represented in the system.

NOTE
Advanced Data Lifecycle Management supports dataset deletions through the dataset expiration endpoint and ID deletions (row-level data) using primary identities via the workorder endpoint. You can also manage dataset expirations and record deletions through the Platform UI. See the linked documentation for more information. Note that Data Lifecycle does not support batch deletion.
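
As an illustration of the row-level (ID) deletion mentioned in the note above, the following sketch submits a record delete request for a set of primary identities using Python's `requests` library. It is a minimal example only: the `/workorder` path, header names, and payload field names shown here are assumptions based on typical Platform API conventions, so confirm the exact contract in the Data Hygiene API guide before relying on it.

```python
import requests

# Assumed base path for the Data Hygiene API; verify in the API guide.
HYGIENE_BASE = "https://platform.adobe.io/data/core/hygiene"

# Standard Experience Platform API headers (values are placeholders).
headers = {
    "Authorization": "Bearer {ACCESS_TOKEN}",
    "x-api-key": "{API_KEY}",
    "x-gw-ims-org-id": "{ORG_ID}",
    "x-sandbox-name": "{SANDBOX_NAME}",
    "Content-Type": "application/json",
}

# Hypothetical payload: delete rows keyed by a primary identity (email here)
# from a single dataset. Field names are illustrative, not authoritative.
payload = {
    "action": "delete_identity",
    "datasetId": "{DATASET_ID}",
    "namespacesIdentities": [
        {
            "namespace": {"code": "email"},
            "IDs": ["user1@example.com", "user2@example.com"],
        }
    ],
}

response = requests.post(f"{HYGIENE_BASE}/workorder", json=payload, headers=headers)
response.raise_for_status()
print(response.json())  # The created work order, including its ID and current status
```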

Data Lifecycle UI workspace

The Data Lifecycle workspace in the Platform UI allows you to configure and schedule data lifecycle operations, helping to ensure that your records are being maintained as expected.

For detailed steps on managing data lifecycle tasks in the UI, see the data lifecycle UI guide.

Data Hygiene API

The Data Lifecycle UI is built on top of the Data Hygiene API, whose endpoints are available for you to use directly if you prefer to automate your data lifecycle activities. See the Data Hygiene API guide for more information.
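
As a sketch of what direct automation against the API might look like, the example below schedules a dataset expiration with Python's `requests` library. The `/ttl` path and the request body fields are assumptions drawn from common Platform API patterns rather than a verified contract, so treat the snippet as illustrative and check the Data Hygiene API guide for the exact endpoint and schema.

```python
import requests

HYGIENE_BASE = "https://platform.adobe.io/data/core/hygiene"  # assumed base path

headers = {
    "Authorization": "Bearer {ACCESS_TOKEN}",
    "x-api-key": "{API_KEY}",
    "x-gw-ims-org-id": "{ORG_ID}",
    "x-sandbox-name": "{SANDBOX_NAME}",
    "Content-Type": "application/json",
}

# Hypothetical request: expire (delete) the dataset at the given UTC timestamp.
# The path and body fields are illustrative; confirm them in the API guide.
payload = {
    "datasetId": "{DATASET_ID}",
    "expiry": "2025-12-31T23:59:59Z",
}

response = requests.post(f"{HYGIENE_BASE}/ttl", json=payload, headers=headers)
response.raise_for_status()
expiration = response.json()
print(expiration)  # The pending expiration, also visible in the Data Lifecycle UI
```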

Timelines and transparency

Record delete and dataset expiration requests each have their own processing timelines and provide transparency updates at key points in their respective workflows.

The following takes place when a dataset expiration request is created:

| Stage | Time after scheduled expiration | Description |
| --- | --- | --- |
| Request is submitted | 0 hours | A data steward or privacy analyst submits a request for a dataset to expire at a given time. The request is visible in the Data Lifecycle UI after it has been submitted, and remains in a pending status until the scheduled expiration time, after which it executes. |
| Dataset is dropped | 1 hour | The dataset is dropped from the dataset inventory page in the UI. The data within the data lake is only soft deleted, and remains so until the end of the process, when it is hard deleted. |
| Profile count updated | 30 hours | Depending on the contents of the deleted dataset, some profiles may be removed from the system if all of their component attributes were tied to that dataset. 30 hours after the dataset is deleted, any resulting changes in overall profile counts are reflected in dashboard widgets and other reports. |
| Audiences updated | 48 hours | Once all affected profiles are updated, all related audiences are updated to reflect their new size. Depending on the dataset that was removed and the attributes you are segmenting on, the size of each audience could increase or decrease as a result of the deletion. |
| Journeys and destinations updated | 50 hours | Journeys, campaigns, and destinations are updated according to changes in related segments. |
| Hard deletion complete | 15 days | All data related to the dataset is hard deleted from the data lake. The status of the data lifecycle job that deleted the dataset is updated to reflect this. |
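
Because hard deletion is not complete until the final stage above, automated workflows typically poll the job's status rather than assuming immediate removal. The sketch below shows one way to do that; the `/workorder/{id}` status path and the status values it checks are assumptions, so verify them against the Data Hygiene API guide.

```python
import time

import requests

HYGIENE_BASE = "https://platform.adobe.io/data/core/hygiene"  # assumed base path


def wait_for_completion(work_order_id: str, headers: dict, poll_seconds: int = 3600) -> dict:
    """Poll a data lifecycle job until it reports a terminal status.

    The /workorder/{id} path and the status values checked here are
    illustrative assumptions; consult the Data Hygiene API guide for the
    actual resource paths and status vocabulary.
    """
    while True:
        response = requests.get(f"{HYGIENE_BASE}/workorder/{work_order_id}", headers=headers)
        response.raise_for_status()
        job = response.json()
        if job.get("status") in ("completed", "failed"):
            return job
        time.sleep(poll_seconds)  # stages can take hours to days, so poll slowly
```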

Next steps

This document provided an overview of Platform's data lifecycle capabilities. To get started making data hygiene requests in the UI, refer to the UI guide. To learn how to create data lifecycle jobs programmatically, refer to the Data Hygiene API guide.
