
[Integration]{class="badge positive"}

Ingest data using the ÃÛ¶¹ÊÓƵ Analytics source connector

The ÃÛ¶¹ÊÓƵ Analytics source connector allows you to easily stream, map, and filter data from ÃÛ¶¹ÊÓƵ Analytics into ÃÛ¶¹ÊÓƵ Experience Platform's Real-Time Customer Profile and Experience data lake. Learn how to use data prep features to create semantic and calculated fields to improve the user experience in Segment Builder and Platform applications. Cross-regional report suites are supported for this connector. For more detailed product documentation, see Create an ÃÛ¶¹ÊÓƵ Analytics source connector in the UI.

Transcript
In this video, I'll explain how users can ingest their data from ÃÛ¶¹ÊÓƵ Analytics into ÃÛ¶¹ÊÓƵ Experience Platform and enable the data for Platform's Real-Time Customer Profile. These are the areas I'll cover in this video: the Analytics Source Connector, workflow options for using a standard or custom schema, use cases for a custom schema, data prep functions in the custom schema workflow, filter rules to selectively include or exclude data from ingestion to the Profile Store, configuring Analytics data objects for the Real-Time Customer Profile, and monitoring the Analytics data flow. The Analytics Source Connector isn't the only way to get your Analytics data into Platform, but it's the fastest method, requiring the least effort. If you have use cases that require real-time segmentation based on Analytics attributes, review the tutorials and documentation about streaming ingestion and Web SDK.

To set up the ÃÛ¶¹ÊÓƵ Analytics Source Connector, log in to Experience Platform and navigate to Sources to view the Sources catalog. Under ÃÛ¶¹ÊÓƵ Applications, look for ÃÛ¶¹ÊÓƵ Analytics. Select the ÃÛ¶¹ÊÓƵ Analytics Source Connector to add data. In the Analytics Source Add Data step, you can choose to source data from any of your organization's report suites, each of which is treated as a unique dataset in Platform. Notice that my organization includes report suites from multiple regions. This is possible as long as they're mapped to the same organization as the Experience Platform sandbox instance in which the connection is being created. Grayed-out report suites indicate that a data flow has already been created. Select the report suite you want to ingest data from in the list, and then select Next at the top. There are two target schema choices, Default and Custom, and there's also an option to enable the data for the Profile service.

Before we go further in the workflow, I'll spend some time reviewing the differences between the default and custom schema. First, what is a schema? It's a set of rules that validate the structure and format of data, and Platform uses it to ensure the consistency and quality of incoming data. Selecting the Analytics default schema in the Source Connector workflow automatically maps your report suite data to the default schema without any additional effort on your end. You don't need to create a new schema for this option; everything comes over as is. Selecting a custom schema in the Sources workflow does require you to set up a new schema, because you'll be mapping standard Analytics attributes to new attributes. Let me show you what I mean. There are two field groups in this schema. The first is the ÃÛ¶¹ÊÓƵ Analytics Experience Event template, which contains the standard Analytics attributes. The second field group is Reservation Details, a user-defined field group added to this schema. Here we see fields with descriptive names like transaction, cancellation, and confirmation number. Later, I'll map Analytics variables to some of these new attributes in the Source Connector workflow.

Before we go back to the workflow, here are the main use cases for a custom Analytics schema. First, you may want to see better semantic or descriptive attribute names in things like the Segmentation Service and Customer Journey Analytics, but without changing the labels in the Analytics report suite settings. You can do that using data prep features.
Second, if you want a more standardized way of referencing the same data that might be captured differently across report suites, using custom attributes is the way to go, as illustrated in this table. Third, you may have data in Analytics that is stored in pipe-delimited format, or maybe you want to join two values together in a single attribute; the data prep features accomplish this. And last, let's say you want more flexibility for defining identities for your Analytics data beyond the Experience Cloud ID. You can do that by setting up a new attribute in your custom field group and marking it as an identity field.

Back in the UI, I'll choose Enable Data to Profile Service. Later in the flow, I'll show you how to apply filtering rules and conditions to selectively include or exclude data from ingestion to the Profile service. Next, I'll choose Custom Schema. From the schema list, I'll select Travel Reservations. The Map standard fields section gives you details about the default mapping that occurs from your report suite to the Analytics Experience Event field group in the schema. If there are descriptor conflicts when mapping your report suite to a pre-existing schema, they'll appear here.

Now I'll create some mapper functions, also known as data prep. First, I'll set up a pass-through mapping. I'll select the Add new mapping button under Custom. In the Source field, which draws from my report suite descriptors, I'm going to select eVar5. My report suite doesn't have a label or descriptor for this variable, but it contains confirmation number values. I want to map it to the semantic field created in the Vehicle Reservations custom field group that's part of the schema, so in the Target field, I'll select the Confirmation Number field.

Next, I'll set up a calculated field. I'll select the Add calculated field button. This opens an editor that contains the functions, fields, and operators on the left, as well as a text editor and a preview section in the middle. I'll type TRIM in the search box and click the plus sign to add it to the editor, and I'll do the same for LOWER. Next, I'll select Field at the top, type eVar2, and click the plus sign again. Let's say eVar2 contains a transaction ID. What I want to do is trim any spaces and ensure the value is lowercase. Last, I'll make sure to rearrange the formula so that the syntax is correct. There's a preview button to see a sample, and a green checkmark indicates proper syntax. I'll click the Save button in the upper right corner, which adds the calculated field on the left. Now I'll configure the Target field on the right: I'll select Transaction ID in the Transaction object, and then I'll click Select at the bottom. Now that all the mappings are addressed, I can click Next in the upper right corner.

This is the filtering step in the flow, and it only applies if Enable for Profile was selected earlier. Here's a quick review of the Analytics-to-Platform architecture when using the Analytics Source Connector. All Analytics data automatically goes to the data lake, the repository used by Query Service, Customer Journey Analytics, and other applications that use Platform data. The Profile Store, on the other hand, is a separate repository used to create customer profiles and is used by things like the Segmentation Service. Customers who use Experience Platform are discerning about the volume and nature of data they want to send to the Profile Store.
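
Before moving on to filtering, here is a minimal Python sketch of what the two data prep mappings above accomplish: the eVar5 pass-through and the trim/lowercase calculated field, whose data prep formula would read roughly `trim(lower(eVar2))`. The simplified field names in the sketch are assumptions for illustration only, not actual Platform mapper syntax or XDM paths.

```python
# Illustrative sketch (not actual Platform mapper code) of the two
# custom mappings: a pass-through and a calculated field.

def map_hit(hit: dict) -> dict:
    """Apply both mappings to a single Analytics hit."""
    mapped = {}

    # Pass-through mapping: relabel the raw eVar5 value as a
    # semantic confirmation-number field (hypothetical path).
    if "eVar5" in hit:
        mapped["reservation.confirmationNumber"] = hit["eVar5"]

    # Calculated field: trim spaces and lowercase the eVar2 value,
    # mirroring trim(lower(eVar2)) from the demo.
    if "eVar2" in hit:
        mapped["transaction.transactionID"] = hit["eVar2"].strip().lower()

    return mapped

# Example: messy captured values become clean, consistent identifiers.
print(map_hit({"eVar5": "ABC123", "eVar2": "  TXN-48213 "}))
# {'reservation.confirmationNumber': 'ABC123', 'transaction.transactionID': 'txn-48213'}
```
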
Analytics data can be filtered only before it enters Profile, which means you need to set up filtering during this initial ingestion workflow if you're enabling Analytics data for Profile at that time. There are two types of filter rules available: row-level filtering allows you to apply conditions that dictate which data to include for Profile ingestion, whereas column-level filtering allows you to specify which data to exclude.

I'll show you row-level filtering first. Under Row Filter, I'll enter Country to filter for that column. Let's say my Analytics report suite contains reservations from multiple countries, but I only want to send data to the Profile Store for reservations made in the United States. I'll drag and drop the Country attribute onto the filtering canvas. There are many operators available, such as Starts With, Exists, and so on, but I'm going to keep this set to Equals. I'll enter United States in the text box and press Enter. Now I'll click on the column filter. Let's say I want to exclude some mobile application events. I'll expand the hierarchy for Application, then select Application Closes and all of the Boolean-type attributes. On the Dataflow detail page, I'll provide a name, and then I'll click Next in the upper right corner. This takes us to the review step, where we want to make sure everything looks good before clicking the Finish button in the upper right corner.

Last, I'll show you some other validations and configurations you can do once your Analytics data has started to ingest. I'll demonstrate using a different data flow. I'll click Dataflows at the top here in the Sources section, filter for ÃÛ¶¹ÊÓƵ Analytics by clicking the filter icon, and then select the last data flow in the list. This opens the Dataset activity page. Under Dataset activity, there's a quick summary of ingested batches and failed batches during a specific time window. As I scroll, I see ingested batch IDs. Each batch represents data ingested, along with some metadata about the records that were successfully ingested or failed. I'll select Preview dataset in the upper right corner to show you what the last processed batch looks like. If the dataset is enabled for Real-Time Customer Profile, the toggle for this setting appears green in the Properties panel on the right. Notice the link to the schema below the Profile toggle. If this data should be sent to the Profile Store, confirm the setting for the schema as well. I'll open the schema in a new window and check it out. In this example, the schema is configured for Profile, but the dataset isn't, so data won't be sent to the Profile Store.

This concludes the demonstration of using the Source Connector to ingest data from Analytics into Experience Platform. You should now understand the configuration options available, as well as how to enable this data for the Real-Time Customer Profile. Good luck!
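
To recap the filtering step from the demonstration, here is a minimal Python sketch of how the two Profile filter types behave, using hypothetical attribute names (country, application.closes) rather than real XDM paths. Remember that filtering affects only the Profile-bound copy of the data; everything still lands in the data lake.

```python
# Conceptual sketch of row-level (include) and column-level (exclude)
# filtering for Profile ingestion. Attribute names are hypothetical.

EXCLUDED_COLUMNS = {"application.closes"}  # e.g., a mobile app event to drop

def include_row(hit: dict) -> bool:
    """Row filter from the demo: Country equals United States."""
    return hit.get("country") == "United States"

def drop_columns(hit: dict) -> dict:
    """Column filter: remove excluded attributes before Profile."""
    return {k: v for k, v in hit.items() if k not in EXCLUDED_COLUMNS}

hits = [
    {"country": "United States", "application.closes": True, "txn": "txn-1"},
    {"country": "France", "txn": "txn-2"},
]

# Only the United States hit is kept, and its excluded column is dropped.
profile_bound = [drop_columns(h) for h in hits if include_row(h)]
print(profile_bound)  # [{'country': 'United States', 'txn': 'txn-1'}]
```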