[Integration]{class="badge positive"}
Integrate Campaign v8 with Real-Time CDP
Connect Campaign to Experience Platform as a destination
Learn how to activate an ÃÛ¶¹ÊÓƵ Experience Platform segment to a destination using the Amazon S3 connection type.
Transcript
Hello, and welcome to creating an ÃÛ¶¹ÊÓƵ Campaign destination in Platform. To use ÃÛ¶¹ÊÓƵ Experience Platform segments to enrich our audiences within Campaign, we need to connect Campaign as a Platform destination. This is done using the Platform user interface. In this video, we will walk through the steps necessary to connect Campaign to Platform as a destination. Experience Platform allows data to be activated or exported to multiple external sources, such as ÃÛ¶¹ÊÓƵ applications, cloud-based storage, databases, and more. This video follows a workflow for creating a Campaign destination using an Amazon S3 bucket, creating a dataflow to export a Platform segment to the bucket, and then scheduling the dataflow to happen multiple times per day. This video will not cover the steps for creating segments within Platform. For detailed steps on how to create a Platform segment, please refer to the Experience League videos and documentation. You will need to have a segment created before creating a destination. In the Platform UI, select Destinations from the left navigation, then select the catalog to view all available destinations. You can browse the catalog or use the search functionality to search for ÃÛ¶¹ÊÓƵ Campaign. Once you have located the Campaign destination, select Configure to begin configuring the destination. If your organization has already connected to this destination, you will see the option to Activate instead of Configure on the destination card. To learn more about the difference between activating and configuring, please refer to the Experience Platform destinations documentation. For this example, we will select Configure to configure a new destination from scratch. Selecting Configure enters the Activate destination workflow, where the first step is to connect our Campaign account. We select New Account, and then for the connection type we can select an option from the dropdown. 
The recommended method to send data to Campaign is through Amazon S3 or Azure Blob. Depending on the connection type you select, you must enter the appropriate details. We will be using Amazon S3, which requires an access key ID and a secret access key. Optionally, you can add your RSA-formatted public key under the Key section to add PGP/GPG encryption to your exported files. Your public key must be provided as a Base64-encoded string. For this example, we will not select this option. To learn more about adding encryption, refer to the destinations documentation. Once the information has been entered, select Connect to destination. After confirming that you are connected, select Next to continue with the workflow. The next step in the destination activation workflow is to provide basic destination information. This includes the name, where we provide a relevant name for the destination. The description, where we enter an optional description for the destination; this is useful if your organization is creating multiple destinations or has multiple people working with destinations. The bucket name: for S3 connections, enter the location of the S3 bucket where Platform will deposit your export data. The folder path: add the path in your storage location to the folder that should hold your export data. The file format: for this example, we will be exporting a CSV file. And marketing actions: marketing actions indicate the intent for which data will be exported to the destination. You can select from ÃÛ¶¹ÊÓƵ-defined marketing actions, or you can create your own marketing action. For more information about marketing actions, see the Experience Platform Data Governance documentation. We will select Email Targeting as our marketing action because this export will be used for sending an email campaign. After completing this information, select Create Destination to create your destination. 
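As noted above, the Key section expects the RSA public key as a Base64-encoded string. A minimal sketch of preparing that value in Python (the key text below is a placeholder, not a real key):

```python
import base64

def encode_public_key(armored_key: str) -> str:
    """Base64-encode an ASCII-armored public key for the destination's Key field."""
    return base64.b64encode(armored_key.encode("utf-8")).decode("ascii")

# Placeholder key text for illustration only; use your real exported public key.
key_text = "-----BEGIN PGP PUBLIC KEY BLOCK-----\n...\n-----END PGP PUBLIC KEY BLOCK-----"
encoded = encode_public_key(key_text)
```

Decoding the result with `base64.b64decode` should return the original armored key, which is a quick way to verify the value before pasting it into the UI.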
Once your destination is confirmed, select Next to choose the Experience Platform segments that you would like to activate to Campaign. For this example, we have selected a segment of customers who are predicted to make an order, as we want to target them with an email offering a discount code on a future purchase. With our segment selected, we can select Next to proceed with scheduling our export. On the export screen, we can see the name of the segment we selected, scheduling information, as well as the file name. For file-based destinations, a unique file name is generated per segment. You can use the file name editor to create and edit a unique file name, or keep the default name. We will keep the default name. When we select Create Schedule, the scheduling dialog opens, allowing us to configure our schedule. Under File Export Options, we can choose to export full files or export incremental files. Choosing to export full files will export a complete snapshot of all the profiles that qualify for the segment. Choosing to export incremental files will export only the profiles that qualified for the segment since the last export. The first incremental file export includes all profiles that qualify for the segment, acting as a backfill. Future incremental files include only the profiles that qualified for the segment since the previous export. We will select incremental files to help limit the load on our instance. For frequency, we will have the export happen every three hours and indicate the time we would like the first export to happen. For the date, we can select a single day or a date range during which we would like the export to take place. After creating our schedule, we return to the workflow and can see that the frequency and export type are now filled in. We then select Next to proceed to selecting attributes. On the attribute selection screen, Experience Platform provides recommended attributes to help us get started. 
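The full versus incremental export behavior described above can be pictured as a simple set operation. This is an illustration of the behavior, not Platform's implementation, and the profile IDs are made up:

```python
def incremental_export(qualified, previously_exported):
    """Return the profiles to include in an incremental export file.

    The first run (no previous export) acts as a backfill and includes
    every qualifying profile; later runs include only the profiles that
    qualified since the last export.
    """
    if previously_exported is None:
        return set(qualified)
    return set(qualified) - set(previously_exported)

# First export backfills all qualifying profiles.
first = incremental_export({"p1", "p2", "p3"}, None)
# A later export picks up only the newly qualified profile.
second = incremental_export({"p1", "p2", "p3", "p4"}, first)
```

A full-file export, by contrast, would simply return the whole qualifying set on every run.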
You can use the minus sign to remove any recommended attributes that you don’t want to use. Attributes can be added using the Add new field button and can be marked as mandatory or deduplication keys using the checkboxes. A mandatory attribute requires that all exported profiles include that attribute. In other words, profiles without the mandatory attribute are not exported. Leaving all mandatory checkboxes blank exports all qualified profiles regardless of their attributes. A deduplication key eliminates multiple records of the same profile in the export files. You can select a single namespace or up to two schema fields as a deduplication key. Not selecting a deduplication key may lead to duplicate profile entries in the export files. Since this segment will be used for an email campaign, we will set email address as a mandatory field, thereby not exporting any profile that does not contain an email address. We’ll be using the mobile number and email address to deduplicate our records. With our attributes selected, we can select Next to proceed to the final review stage. The review screen provides an overview of the destination that we just created, including the name that we provided, the destination type (ÃÛ¶¹ÊÓƵ Campaign), and any marketing actions that were selected. We can also review the number of segments and attributes that were selected in the previous steps. After confirming this information is accurate, we select Finish to complete our destination setup and begin our scheduled exports. After our export has run, we can select the Browse tab under Destinations to view our configured destinations. Selecting the name of the destination opens details regarding dataflows and activations. The Dataflow runs tab shows our dataflows running every three hours, as specified. 
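The mandatory-attribute and deduplication-key rules above amount to a filter followed by a key-based dedupe. A rough sketch under those assumptions (the field names `email` and `mobile` are illustrative, not the actual schema paths):

```python
def prepare_export(profiles, mandatory=("email",), dedup_keys=("email", "mobile")):
    """Drop profiles missing a mandatory attribute, then deduplicate on the keys."""
    seen = set()
    exported = []
    for profile in profiles:
        # Profiles without a mandatory attribute are not exported.
        if any(not profile.get(field) for field in mandatory):
            continue
        # The deduplication key eliminates repeated records of the same profile.
        key = tuple(profile.get(k) for k in dedup_keys)
        if key in seen:
            continue
        seen.add(key)
        exported.append(profile)
    return exported
```

With no mandatory fields and no deduplication keys, every qualifying profile would pass through unchanged, matching the "leave all checkboxes blank" behavior described above.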
Since our destination was set up for incremental export, we can see that the first dataflow included all profiles in order to backfill the data. Since no profiles were added between the first dataflow and the next, the subsequent dataflow runs show zero new profiles received. The Activation data tab shows the segments that have been activated to ÃÛ¶¹ÊÓƵ Campaign, as well as their status, related marketing actions, and more. Here, we can see the customers predicted to order segment that we selected during the destination configuration. After watching this video, you’ve learned how to activate an Experience Platform segment to a destination using an Amazon S3 bucket. The exported data is now available within S3, and updated files will add only the newly qualified segment members at regular intervals. Thanks for watching.
Import recipient data from Experience Platform and send an email
Learn how to configure an external account in ÃÛ¶¹ÊÓƵ Campaign to import recipient data from ÃÛ¶¹ÊÓƵ Experience Platform to Campaign. Understand how to create a workflow to upload and target the recipients received from the Experience Platform.
Transcript
In this video, we’re going to create a workflow to import our ÃÛ¶¹ÊÓƵ Experience Platform recipient segment data that was uploaded to a Campaign destination via an S3 bucket. Additionally, we will send a targeted delivery to our newly imported recipients. Let’s get started. First, in order to pull in our S3 data, we need to configure an external account in ÃÛ¶¹ÊÓƵ Campaign. Adding an external account is easy. Under Administration, find External Accounts. In this instance, External Accounts is located under Platform. Next, we need to add a new external account. Select the New icon, then provide a label and internal name. Then, select AWS S3 from the Type dropdown. This populates a new section asking for your AWS S3 account credentials, including the server, access key, secret key, and region. Upon filling in the required account credentials, remember to save. Upon saving our AWS S3 account, we are ready to begin a new targeting workflow. Navigate to Profiles and Targets, then Jobs, and select Targeting Workflows. Then, create a new workflow. Let’s name this workflow ÃÛ¶¹ÊÓƵ Experience Platform Data Import Delivery. The first component we want to add to our new workflow is the File Transfer component, located in the Events tab. Drag and drop File Transfer onto the workflow, then double-click it to open the File Transfer popover. Optionally, we can give this component a new label. I’ll call mine Platform S3 Data Download. Set the Action to File Download from the dropdown, then tick the checkbox for Use an External Account. The account dropdown allows us to select the account we added at the beginning of this video. If you have a server folder within your S3 container where you saved your data, make sure to provide the folder name. Once complete, select OK to continue. Now that we are downloading the files from our S3 account, we need to add a Data Loading component for our CSV file. 
Within the Actions tab, select the Data Loading (File) component and add it to the workflow, then double-click it to open the Data Loading popover. The Data Loading component expects a sample CSV file. After providing a sample of how our data is structured, we can use the preview to confirm the imported data will be in the format we expect. If we recall the previous video on uploading data from Platform to Campaign via a destination, we can see that the fields we exported to our S3 bucket are present in our preview. Once we’re happy with the preview, double-check that the Specified in Transition and Default target database radio buttons are selected, then select OK to continue. At this point, if we run our workflow, we should see results equal to the number of exported profiles. After adding a Start component, I expect the workflow to return 47 results, because my latest CSV file that was uploaded to the S3 bucket has 47 different profiles. I can right-click the result and select Display the Target to preview the output. Now that we have data loading, we need to map the CSV data fields to Campaign recipient fields. To do this, let’s select the Targeting tab and drag and drop the Enrichment component. Upon double-clicking the component, the Add Additional Data popover appears. Our primary set should have automatically populated to our data loading file. We are interested in mapping our data, so we want to select the Reconciliation tab. After ticking the checkbox, we are provided a UI that expects a targeting dimension and reconciliation conditions. Since we want to add recipients, select the recipients targeting dimension (nms:recipient). We can leave Use a Simple Reconciliation Key selected and skip straight to adding our source and destination expressions. Select the Add button, and the Edit Expression button should become available. Upon selecting it, we can see our data loader fields. We want to pass the first name, last name, and email. 
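As an aside, the Data Loading preview is essentially parsing the exported CSV and showing its first rows. A rough stand-in using Python's `csv` module (the column names are assumptions based on the fields exported in the previous video):

```python
import csv
import io

# A tiny sample mimicking the exported file; a real export has one row per profile.
sample = "first_name,last_name,email\nAda,Lovelace,ada@example.com\n"

# Parse the CSV into one dict per row, keyed by the header columns.
rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows[:5]:  # preview only the first few rows
    print(row["first_name"], row["email"])
```

If the preview columns do not line up like this, the sample file's delimiter or header row is usually the culprit.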
Let’s start by selecting Email. Next, we want to map this expression to its counterpart in the recipient schema. Following the same workflow, we can select Email from the available fields in our recipient schema. After adding all our fields, select OK to continue. Now, because we mapped our data using the recipient schema as the target, we might see some duplication if these recipients were already uploaded. As a best practice, we should drop in a Deduplication component before uploading our data and sending an email delivery. After adding the component, double-click it and select Edit Configuration. We only want our enrichment data to be deduplicated, so select the Temporary Schema radio button, followed by selecting Enrichment from the Schema dropdown. Then select Next to continue. We need to define the field we want to filter duplicates on. Add a field by selecting the Add button, followed by selecting the Email expression. We can also rename the expression by giving it a label. Select Next to continue. We can keep the default values of 1 and Choose for me, and then select Finish, followed by OK to continue. Now, when I run my workflow, I still receive 47 results, since there were no duplicates to remove. However, as you can see, we now have the additional name and email fields in the Targeting Dimension column. This means that we can use this targeting dimension in an email directly within the workflow. However, we also want to upload and save all these recipients to a folder, so let’s select Flow Control and drag and drop the Fork component. First, let’s upload our data to a folder. To upload our data, we need to drag and drop the Update Data component from the Targeting tab. Upon opening the Update Data popover, select the Insert or Update operation type. This means we can override previous recipients with new information and add new ones. Additionally, we want to select the recipient dimension again as our target. Similar to the enrichment, we need to define our destination and source expression mapping. 
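The source-to-destination expression mapping built in the Enrichment and Update Data steps can be pictured as a simple field rename. The CSV column names and recipient field names below are assumptions for illustration, not the actual schema expressions:

```python
# Hypothetical mapping from CSV columns to recipient schema fields.
FIELD_MAP = {
    "first_name": "firstName",
    "last_name": "lastName",
    "email": "email",
}

def reconcile(row):
    """Map a loaded CSV row onto recipient schema field names."""
    return {dest: row[src] for src, dest in FIELD_MAP.items()}

recipient = reconcile({"first_name": "Ada", "last_name": "Lovelace", "email": "ada@example.com"})
```

In the workflow UI, each entry of this mapping corresponds to one source expression (the data loader field) paired with one destination expression (the recipient schema field).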
However, this time we also want to add our recipients to a folder, which means we need to create a new folder. To create a new folder for recipients, select Profiles and Targets, then right-click, hover over Add New Folder, then Database, and select Recipients. This creates a new recipients folder for us. I’m going to rename my folder Platform Profile Recipients. Navigating back to our workflow, we can go back to the Update Data component, and in the Fields to Update section, set the destination to Folder, with the source being our new recipients folder. This means that all the recipient data will be uploaded and saved to our new recipients folder. Select OK to continue. Now that we are updating our data, we can simultaneously send an email delivery to all the recipients who were predicted to order from Luma. Because we used a segment from a machine learning model in ÃÛ¶¹ÊÓƵ Experience Platform, we already know these recipients have a high propensity to purchase an item. Let’s target these recipients with an exclusive discount code email offer. To do this, start by selecting the Actions tab, then drag and drop the Delivery component onto our other fork. Upon double-clicking, we can select an email template by selecting the New, created from a template option, followed by selecting our Discount Offer template. If we wanted to customize the email further, we could select the magnifying glass and edit the content. We do not need to specify a target population, because our recipients are already specified by the inbound transition. The content is specified in the delivery, and if we want the email delivery to auto-send, we can select Prepare and Start. If you wish to sign off before starting the email delivery, you can leave the default Prepare selected. Select OK to continue. With our Delivery and Update Data components complete, all that’s left to do is add the End flow control. Upon running the workflow, two things should happen. 
First, we should see a list of 47 recipients in our recipients folder. Second, an email delivery should run and send our discount offer email to the recipients that were imported. As we can see here, I received the email delivery offer. You should now know how to import data from an external source and map it to your Campaign recipient schema. Additionally, we were able to send an email discount offer via Campaign, which generates delivery logs. This allows us to send these delivery logs back to Platform. The workflow for exporting our delivery logs back to ÃÛ¶¹ÊÓƵ Experience Platform will be covered in subsequent videos. Thanks for watching.