
Expedia and Leading Brands Evolve their Organization from ÃÛ¶¹ÊÓƵ Analytics to Customer Journey Analytics

Join us for an exciting webinar as we explore the next evolution of analytics, featuring Jake Winter, Principal Lead at Adswerve; Erika Ulmer, Senior Manager, Data Product Management at Expedia; Ben Gaines, Director of Product Management at ÃÛ¶¹ÊÓƵ; and Trevor Paulsen, Director of Product Management at ÃÛ¶¹ÊÓƵ, who share their first-hand experience of leveling up their organizations with ÃÛ¶¹ÊÓƵ Customer Journey Analytics.

We’ll discuss the latest trends and best practices in analytics, as well as the challenges and opportunities that come with leveraging Customer Journey Analytics. Whether you’re an analytics professional or a business leader looking to improve your organization’s analytics capabilities, this webinar is not to be missed.

Transcript
Hello everyone, and welcome to our webinar. Thank you for joining us for today's presentation, Evolving Your Organization from ÃÛ¶¹ÊÓƵ Analytics to Customer Journey Analytics. I'd like to hand this session over to Ben Gaines, a Director of Product Management for the Digital Experience business here at ÃÛ¶¹ÊÓƵ, who will be our host today and will get us started. Hello, Ben, welcome; the stage is now yours. Thank you, Emily, and thank you everyone for joining us today. We are excited to be hosting this webinar with some true rock-star talent, talking about how you can evolve your organization from ÃÛ¶¹ÊÓƵ Analytics to Customer Journey Analytics. That rock-star talent includes Erika Ulmer, who leads Data Product Management at Expedia Group. You'll also hear from Jake Winter, who is the principal lead for Customer Journey Analytics at Adswerve; Jake has also done this client-side in a previous role at Best Buy, so we'll hear from him on multiple fronts. And many of you know Trevor Paulsen, my peer on the product management team for Customer Journey Analytics. I'm Ben, and I'm thrilled to host the session. We're going to keep this light, and it's going to be a great discussion. Please do ask questions as the presenters you see on the screen share their thoughts and experiences; we'll answer as many of those as we can, and for the ones we can't get to live, we'll try to find ways to answer them offline. With that, I want to turn it over to Trevor to give a little background on what Customer Journey Analytics is and why we're here today, why we're even talking about this idea of evolving from ÃÛ¶¹ÊÓƵ Analytics to Customer Journey Analytics in organizations. Trevor: Hey, thanks, Ben. Just to introduce myself again, my name is Trevor Paulsen. I'm one of the first people who ever worked on Customer Journey Analytics, and it's been a fun project. As part of my role at ÃÛ¶¹ÊÓƵ I talk with a lot of companies from all around the globe, and in reality I haven't talked to anyone, or any company, that isn't trying to better understand their users and their customers so they can craft a better experience, whether that's online or offline. In those conversations over the years, a few challenges have popped up again and again; three of the most common ones are on this slide. First, companies are looking for a lot more flexibility when it comes to data. While ÃÛ¶¹ÊÓƵ Analytics has traditionally been able to give folks insights really fast, the need for a more modern system that can flex with the unique analytics needs of their business and integrate tightly with other parts of their business has come through loud and clear to us over the years. The second thing I hear all the time is the puzzle of trying to piece together customer data from every customer touchpoint. It's a beast to bring all of that together into a single customer view, especially when it's spread across different devices or different channels, and that often means wrangling with complex SQL or slamming into the limits of what's possible with traditional analytics tools. That's a persistent challenge that I hear.
And the third one I hear all the time is that even with insights in hand, those insights are only as good as the actions they can drive for your company. Businesses need to weave those insights into every interaction point they have with their customers, whether it's a first-time site visit or a loyalty email program, and breaking down those silos between marketing and product is not an easy challenge. So CJA is designed from the ground up to address those problems in a bigger, better way than ÃÛ¶¹ÊÓƵ Analytics was ever designed to do. I don't know how new it is anymore; it's been out for a few years now, but it really gives us the ability to be more adaptable and address these sorts of challenges in ways that fundamentally change the game compared to what you might have been able to do in the past. It enables you to merge data from various sources, streamline some of the identity stitching challenges you have, and use Analysis Workspace, which you might already be super comfortable with from ÃÛ¶¹ÊÓƵ Analytics. On top of that, it's tightly integrated with ÃÛ¶¹ÊÓƵ Experience Platform, which helps companies streamline the process of activating the insights they get through things like ÃÛ¶¹ÊÓƵ's customer data platform (CDP) or Journey Optimizer, making for a smoother transition from analysis to action. So that's really why we built CJA. Ben, maybe I'll turn it back over to you. Thank you, Trevor. CJA has been a wild ride and a really exciting innovation, and it's been awesome to be a part of it over the last few years. Thanks for setting the foundation. The reason this webinar exists is that back in September I was talking to one of the ÃÛ¶¹ÊÓƵ sales reps, or maybe it was a customer success manager, and he was saying: my customer is on CJA, they're coming from ÃÛ¶¹ÊÓƵ Analytics, but they're just not sure how to make that move. They want to standardize on Customer Journey Analytics and have it be the system that their analysts, their marketers, and others who touch the customer experience use, but they're not sure how to get there. What content do you have on that front? And I said, well, I think Erika Ulmer touched on that at Summit. I was wrong; she touched on a different CJA-related topic. So I said, we've got to get Erika and Jake together with Trevor and me and talk through how customers make this journey. It's not always an easy journey. CJA, as Trevor eloquently covered, is significantly more powerful; you have a lot more options and a lot more flexibility. So it is itself a journey to get onto Customer Journey Analytics, and in this webinar we want to take advantage of the wisdom of the folks in this room to answer some of these questions and address some of the challenges that make it hard for organizations to make that move and take full advantage of the power of Customer Journey Analytics. The first of those reasons really is about that flexibility. You have choices to make around what datasets you want to bring into CJA; it's not just digital data.
What schema are you going to use? You're not locked into a certain schema. There are multiple data collection options for digital channels. There's a lot more that we're going to talk about, and we're going to help you think through how that should work in your organization. Number two, familiarizing your organization with AEP. Relatedly, there's a new set of applications; it's not just integrating with ÃÛ¶¹ÊÓƵ Target or ÃÛ¶¹ÊÓƵ Campaign anymore. There's Journey Optimizer, there's the CDP, and there's the nature of AEP itself. We've taken the opportunity to rethink some of the things that were set in stone over the last 20 years in the digital analytics space, in a really positive way, but that also introduces challenges, because it's a new language and a new set of capabilities. And the third one, which in some ways may be the biggest, is the actual change management of bringing hundreds or thousands of ÃÛ¶¹ÊÓƵ Analytics users over into a new system that looks very similar and has a lot of the same data, but might have more data and lets you do some different things with it. How do you think about bringing over all of the Workspace projects you've saved over the years, all of your segments? How are you going to mark for your stakeholders that this was the day we cut over, or that certain carryovers happened on different days? Those are the three big areas of challenge that we see. Now let's get into the experiences of Erika and Jake, and Trevor, who has spent more time with customers than anybody on the planet, I think. Getting to our first discussion topic, let's start with Erika. Erika, why did Expedia want to evolve fully onto CJA? Sure, and thank you, Ben. I'll introduce myself quickly: Erika Ulmer, Senior Manager of Data Product Management at Expedia Group; you'll hear me shorten that to EG. I'm based here in San Francisco, if you can guess from the clue behind me. The reason we were interested in evolving to CJA as an organization was really where we were as a company at the time. A few years ago we were in a period of convergence. We own multiple brands, and we were operating in a really fractured data landscape where each brand had adopted its own data capture solution and was sending its own data into its own data products. CJA was evaluated as: can we bring all of this data together so we can report on business performance holistically but still have that deep-dive capability available if you want to report on a specific brand or a specific point of sale? And CJA was chosen primarily for its functionality. The most compelling reason, especially from a leadership perspective, was the ability for us to switch to a first-party data collection system. Fundamentally, that reduces our vendor lock-in, because we no longer have to rely on ÃÛ¶¹ÊÓƵ for our implementation and our data capture. We own that completely, and we use CJA as a front-end analysis UI on the data we're capturing ourselves. But we did see additional functionality that we thought was very valuable.
Another one, as Trevor touched on, is the ability to integrate non-clickstream data. Currently at EG we bring in commerce and transactional data, including cancellations, experimentation, app installs, and marketing attribution, and that allows us to have a fuller view of the customer experience in this tool compared to our legacy tooling. Amazing. Jake, I think, is working through a quick technical issue, so I'm going to stay with you for a second, Erika, and ask a follow-up question. The reasons you gave, were those intuitive to the teams you needed to rally around this idea? How much selling did you have to do, and what roadblocks did you have to overcome internally? Yeah, I don't think it's particularly intuitive to some degree until we communicated it out. Most users are not happy about a migration when they're perfectly happy with the tooling they already had, and we had multiple tools and were really operating in a fractured landscape. But once we showed the value proposition of that shift to CJA compared to even ÃÛ¶¹ÊÓƵ Analytics, really the ability to bring in these other datasets, the fact that we have more control over the data in this tool because we can back out data, we can backfill data, and we have extensive customization capabilities when it comes to persistence and metric attribution, once we communicated that, it became very clear. But you can't just drop CJA into an organization and expect everyone to say, sure, sounds great. You definitely do have to sell it internally, but if you package it well, people understand fairly quickly. Okay, I think that's a key point. Thank you. Jake, you're back. Please introduce yourself, and then help us understand why your organization evolved to CJA, keeping in mind that you've now also seen how this works in some other organizations. Speak to what you've learned about the why behind making that full shift onto CJA, since that's the focus of this topic. Yeah, absolutely. Thanks, Ben. Just as Erika started answering, my internet went out, so I'm on a hotspot right now; this is fun. So yes, I'm Jake Winter, principal lead with Adswerve. As Ben mentioned, I'll also be speaking to a prior life with Best Buy, where I transitioned them from ÃÛ¶¹ÊÓƵ Analytics to CJA, but I'll sprinkle in a bit of the experience I've had with the over a dozen clients I've been helping at Adswerve over the last six months since moving there. To go back to the question, why did we want to evolve to CJA? I was part of that process at Best Buy. One of the catalysts was that we were heavy users of the data feeds. We brought them into Hadoop and combined them with other offline data to really get at some of those customer-level insights Ben was talking about. But a problem was that those feeds are really fragile. They require really complex transforms; if you've ever worked with the post_product_list column, you know exactly what I'm talking about, the nightmare of actually parsing that out. Processing was also unpredictable: someone would make a change in the admin console and break something downstream in a Power BI report. And lastly, it was really difficult to use.
People didn't know what eVar21 was or what prop11 was on the side that was using the data warehouse, and as much as we tried, there ended up being this small group of people that held most of the domain knowledge around the data, and it wasn't being democratized the way we wanted it to be. So we started evaluating what we wanted in a solution. One of the key things was human-readable schemas; we wanted to get away from props and eVars. We wanted better ownership of our data, like Erika was talking about. One aspect was privacy: we wanted control over where the data was going to be sent based on preferences. We wanted more data completeness: we were encountering a lot of the challenges with ITP and ad blockers, and if we captured data with our own endpoint, we could get more complete data. And we wanted control: we didn't want to be subject to server call overages. We had just gotten through COVID, and we wanted to be able to control what data we were going to put where, which quite honestly fit well with CJA. Around the same time we were making this evaluation, CJA was starting to become a thing. Trevor and team were starting to roll it out and we were early adopters; we started beta testing with it, and it fit a lot of what we were trying to achieve with our own data platform. A similar follow-up to what I asked Erika: it sounds like there was general organizational understanding that this addressed a lot of your questions. Where did you have doubters, and how did you deal with that? Honestly, it came a little bit from both sides. We had a technology team that was building a data platform and wanted that to be the centralized place where people get insights, and we had our ÃÛ¶¹ÊÓƵ Analytics digital analytics team that wanted to keep using ÃÛ¶¹ÊÓƵ Analytics. We had to tailor the messaging: this doesn't prevent you from doing what you did in ÃÛ¶¹ÊÓƵ Analytics, you'll still be able to do what you did, and it doesn't replace what's happening in the data platform; you can still do all of the Power BI and Tableau use cases you had before. This is going to be something to augment and supplement, a way to explore, do analysis, and find insights rapidly without needing SQL. So we had to walk that line the entire time. Okay, cool, very helpful. Trevor, I don't have a question for you, but I welcome your contributions if you're hearing anything that resonates with you from the time you've spent with CJA customers. Yeah, it's funny, we did our own transition as well. It was not at the scale that Jake and Erika managed, but for a lot of our own internal tracking we had to sell people too. Ben was one of those people I had to sell on adopting CJA, years and years ago when we first did it, along with everyone in our product organization, our marketing organization, and the sales folks who log in. In our environment, CJA provides a lot more flexibility just for our own internal usage too, so we had to go adopt that ourselves. We valued the cross-device view CJA gave us and the ability to link in some of the sales pipeline information we deal with, and all of that is highly valuable for us.
So we ended up making the transition ourselves too, and it's been fun, on the ÃÛ¶¹ÊÓƵ side, to be using our own product here. Yeah, well, I'm glad you overcame my objections, apparently. I don't remember objecting, but I can be ornery, so I don't doubt that it happened. I want to turn to a question from the audience that Robert asked, and I'll start with Jake and then Erika, and anyone else who has thoughts. Particularly as you're bringing in offline or non-digital data, call center, point of sale, whatever it is: were there any concerns with data proliferation, copying data into AEP to power CJA? And, the more important question, since I'm sure there were, how did you overcome those objections? That's something a lot of the people on this call will have to do as they evolve to CJA. Yes, absolutely, those questions came up. One of the things we've talked about, and this applies to some of my Adswerve experience too, is that quite honestly you don't need all of the data for CJA, so don't recreate your data warehouse. Don't bring everything over; bring what's going to be most valuable for building out that customer journey. My classic example is that if you're bringing POS data over, you don't need to recreate a net sales metric and have cash transactions in there; they provide no value in the journey of a customer when you're trying to combine it with digital. So set the expectation that you don't need a full dataset, bring what's valuable, and also get buy-in from the people who work with those datasets, because there are likely analytics teams already working with them. If you can get them excited about using CJA to do some of the analysis that would have been harder to do in their own platforms, they can generate excitement, get involved with the migration over to CJA, and become some of those early adopters and advocates. You're sparking an idea in me; I'll also answer this as someone who's been around the landscape for a little while himself: don't bite off everything all at once. It's not just about constraining the dataset, which I think you're correctly advocating for; if you've got six different datasets you eventually want to bring into CJA, don't insist on having all six from the get-go. Start small, start with the easiest, start with your closest partner organizationally, someone who understands the value of having that data integrated in an environment where it can be explored easily, and then build out from there as you demonstrate the value. Hopefully I didn't steal your answer, Erika. Any thoughts from you? You may be in a slightly different situation, being a mostly digital business at EG, but were there any concerns you had to overcome in terms of bringing that data into AEP, and how did you do that? Yeah, I would definitely second what both you and Jake said. We've said no to stakeholders that have approached us about certain datasets they wanted to bring into CJA once they were familiar with it and wanted to join to the other datasets we have available.
We do a cost analysis on the size of that data and how much value it's actually going to bring by making it available, and we will say no to certain datasets; part of that analysis is how many people are actually going to look at that data in CJA. As for the concern about copying data into CJA, it's something we want to be aware of, because the more data products you have available, the more likely you are to have variation. So every time we bring an enrichment dataset in, we build dashboards to monitor the source of truth against what CJA is showing. You can either use the CJA API to build those dashboards or use the built-in Tableau connectors that CJA has available, but that's something we do to make sure we're aligned across all of our different data products. Awesome, thank you. One of the questions in the chat was: do you treat CJA as the source of truth? Erika, I'd be curious if you feel the same, but we generally didn't. We treated our data warehouse, which is hosted in the cloud, as the source of truth. Maybe for some of the digital metrics, what would have been visits or conversion rate, the source-of-truth metric would typically come from CJA, but the rest of them, like Erika is saying, primarily live in the data warehouse.
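As a rough illustration of the parity monitoring Erika and Jake describe above, here is a minimal Python sketch that compares daily totals for one metric between a source-of-truth warehouse and CJA and flags days that fall outside an agreed discrepancy threshold. All numbers, names, and the 2% threshold are hypothetical; in practice the CJA side of the comparison might come from the CJA reporting API or a BI connector export rather than a hard-coded dictionary.

```python
# Hypothetical sketch of a daily parity check between a source-of-truth
# warehouse and CJA. The metric values and the 2% threshold are invented.

THRESHOLD = 0.02  # maximum relative difference before triggering an investigation

warehouse_orders = {"2024-05-01": 10240, "2024-05-02": 9873, "2024-05-03": 11005}
cja_orders = {"2024-05-01": 10198, "2024-05-02": 9410, "2024-05-03": 11002}

def parity_report(source, cja, threshold):
    """Return one status line per day, flagging days outside the threshold."""
    lines = []
    for day, src_value in sorted(source.items()):
        cja_value = cja.get(day, 0)
        diff = abs(src_value - cja_value) / src_value if src_value else 1.0
        status = "OK" if diff <= threshold else "INVESTIGATE"
        lines.append(f"{day}: warehouse={src_value} cja={cja_value} diff={diff:.2%} {status}")
    return lines

for line in parity_report(warehouse_orders, cja_orders, THRESHOLD):
    print(line)
```

The same idea scales to a dashboard: schedule the comparison, store the daily results, and alert only when a metric drifts outside the agreed range.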
Awesome. Okay, we're going to move on to the next question, and we'll start with you this time, Jake, and then Erika. When you were embarking on this project, how did you define success? What did success for this evolution from ÃÛ¶¹ÊÓƵ Analytics to CJA look like? Yeah, we took an approach of setting OKRs. We had a data product team that supported our entire pipeline; if it wasn't clear earlier, we built our own data collection tool, so we had our own libraries, our own processing, and we captured the data in the cloud. We used Google Cloud Platform, but any cloud would work, and then we sent that data over to ÃÛ¶¹ÊÓƵ Experience Platform. That speaks to the flexibility of how you can get data in; at Adswerve we don't typically use that pattern, we use the Web SDK and a lot of the ÃÛ¶¹ÊÓƵ tools to get data in because that's convenient and fast, but this was the approach Best Buy needed for the reasons I stated earlier. The OKRs we created were broken down into phases: people, events, payloads is how I break it down. First, as we built out this dataset, we focused on the IDs we were capturing, making sure they aligned with what we had been seeing previously in ÃÛ¶¹ÊÓƵ Analytics. They're not going to be the same, especially since we took a first-party device ID approach when we moved to CJA, in other words we created our own visitor ID, so we knew they wouldn't align exactly, but we at least wanted to see that they trended together. So first, make sure the IDs are good. Then we had events, which is the breadth of data we were collecting. We measured that relative to both our mobile app and web and reported a percentage to our stakeholders of how well we were progressing against having the full breadth of events we previously had in ÃÛ¶¹ÊÓƵ Analytics. The last OKR was related to depth: within the actual payloads we were putting into CJA, did they have the same depth of information that ÃÛ¶¹ÊÓƵ Analytics had? Did they have the same eVars, but in a schema instead? We measured that for both web and mobile app, providing regular updates to stakeholders. Finally, once we had a dataset we were confident in and started rolling it out to users, we had adoption metrics, and we focused primarily on active users. If you look at your ÃÛ¶¹ÊÓƵ Analytics user list you can see a lot of old activity that doesn't really speak to who's actually using the tool now, so we focused on monthly active users for both CJA and ÃÛ¶¹ÊÓƵ Analytics, and as we started to shift use cases over, we measured how many we were getting into CJA and how the number of ÃÛ¶¹ÊÓƵ Analytics users was shrinking. Was there a threshold you were looking for there? I guess it was 100 percent. There's actually a question in the chat that asks this better than I am: have you already left ÃÛ¶¹ÊÓƵ Analytics completely, and how did you transition users to the new tool? We have other questions later that get to that, but I should have clarified in the introduction that both Expedia and Best Buy, and I think the customers Jake is helping as well, are fully on CJA; their users are all logging in to CJA and only CJA. That's part of what we're trying to help many of you do with this webinar and the content we'll create from it. But Jake, was it 100 percent for you? Was that the threshold, 100 percent switched over? Yeah, absolutely, 100 percent. And quite honestly, for all of these metrics we actually ended up exceeding 100 percent: we captured more data because we were using our own pipeline, we had better schemas that could represent our experience, and we had more users because we got a few more people interested and excited. Like I said, we had decision science teams that had never used the tool before now in it. Awesome. Erika, how about you? How did you think about success at Expedia? Yeah, success for us was very much tied to the data landscape we were already in, so adoption metrics were a massive part of that, but really it was being able to deprecate the legacy tools we had, so we didn't have a bunch of contracts running in parallel for tools serving the same function at EG. We staggered what success looked like, targeting first getting all of our Mixpanel users over, which was a smaller subset of total users, about 400 to 500 internal users that we pushed into CJA first. We took our learnings from that experience, then moved to our second milestone of success, which was moving off of ÃÛ¶¹ÊÓƵ Analytics; roughly 4,000 employees had access to that, but roughly a thousand were active users. Measuring how many of those people moved over, and making sure we hit our deprecation deadlines, was our key measure of success in the move to CJA.
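To make the adoption OKR concrete, here is a small hypothetical Python example of the kind of monthly-active-user tracking Jake and Erika describe, watching the share of active users shift from ÃÛ¶¹ÊÓƵ Analytics to CJA over time. The counts are made up; real numbers would come from each tool's usage logs or admin reporting.

```python
# Invented MAU figures illustrating an adoption OKR: the goal is to watch the
# CJA share climb toward 100% as use cases move over and the legacy tool shrinks.

monthly_active_users = {
    "2023-01": {"aa": 950, "cja": 120},
    "2023-04": {"aa": 610, "cja": 480},
    "2023-07": {"aa": 140, "cja": 880},
}

for month, counts in monthly_active_users.items():
    total = counts["aa"] + counts["cja"]
    cja_share = counts["cja"] / total
    print(f"{month}: CJA share of active users = {cja_share:.0%} "
          f"(ÃÛ¶¹ÊÓƵ Analytics={counts['aa']}, CJA={counts['cja']})")
```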
Excellent. I want to make sure we talk about timelines and resources and how you did that, because the first time I met Erika, she was training hundreds of users on CJA, and seeing the way Expedia thought about that, the way Erika organized it, was really impressive. We're going to get to that in just a few minutes. In terms of defining success, Trevor, what have you seen work for customers? Anything Jake or Erika said that stuck out to you? Yeah, I think the common thread I hear among the customers who are very successful with this is that they tend to take an iterative, agile approach to their adoption. Most of the time it starts with a core team; your company probably has some sort of center of excellence. There's some familiarity that has to happen: there are new baselines of metrics, and there are improvements, like getting a person-level view instead of the device-level view you used to have. Start with that core team and understand, okay, this is what our new conversion rate looks like; it's not visitors anymore, it's what our actual customer count looks like; here's how our flows look. Familiarize yourself with that new baseline so the CoE can then go arm the rest of the business: hey, we're going to give you an improved view that might be different from what you've had before, but it's better for these reasons. So usually they'll start with that CoE and then bring in core teams from around the organization, typically the teams that are more data savvy; sometimes that's product management teams, other times it's more marketing-related teams or others who are very familiar with the data and eager to see that view. Every organization is a little different, but the common thread is that agility and the ability to iteratively expand, team by team, into different parts of the company. Yep, awesome. Okay, I'm going to move us on to the next question. There's plenty of great stuff coming in on chat, by the way; we could have made this a three-hour webinar, because you all are asking amazing questions, and maybe we'll do a blog post where Jake and Erika can collaborate on answering some of the questions we don't get to, so please keep asking them even if we don't get a chance to answer them here. Next: bucketing the tasks required. We're going to get to timelines and resources in a little bit, and Jake, you sort of addressed this already, talking about identity and events and then your analytics users, so let's start with Erika on this one. Once you had that mindset shift of, all right, we're going to standardize on CJA, how did Expedia think about and organize the tasks you needed to execute? Yeah, specific to CJA, one of the buckets we had was definitely configuration, and that's not just the CJA components: it's ÃÛ¶¹ÊÓƵ Experience Platform and building the pipelines to send data, and building resiliency into those pipelines so we could always make sure we're hitting our SLAs around latency, for example. That was an ongoing process at the very beginning. Then, with that data in place, we had to configure it accurately in our data view, making sure that our persistence settings, our attribution, and the binding of dimensions all made sense and reflected data similarly to our legacy tools.
That was also a lengthy process; we went through a lot of bugs and worked through them over time. So that bucket is parity between tools: making sure the data is landing within a reasonable time frame and that all of the components available in our legacy tools were recreated in CJA. Another bucket was data quality. Not every organization will go through this, but we were technically also migrating to what was fundamentally a new dataset, so we needed to look at a year's worth of data across our major metrics and make sure that what our legacy tool reported and what CJA reported were close. It didn't have to be exact, but we did set a specific discrepancy range we needed to be within; any metric outside of that triggered a data investigation to understand why there was a discrepancy and whether we could solve it or note it in our historical notes. A third bucket I've touched on is training and enablement. This is something that actually helped us have, for the most part, positive sentiment about how we handled this migration: we were proactive in reaching out to teams and overcommunicating, way ahead of time, that this migration was happening. We offered customized trainings for heavy users of the tool, meeting them where they were when it came to their use cases for CJA, and we partnered with our learning and development team to hold in-person, hands-on workshops where we went through practice problems using CJA. The last one, which I've already touched on, is communication. With any migration, overcommunicating is best. Not everyone reads all of their email or all of their Slack, so we tried to reach our users where they were by sending these communications constantly, and we would also go to other teams' town halls. We would reach out to, say, our product analytics team and say, we want to come to your town hall to talk about this upcoming migration and what you need to do to be prepared. That is awesome. That makes a ton of sense and hits on a lot of the challenges I see people facing. You talked about data quality and data timeliness; what was that shift like? A theme that's emerging here, if the people watching haven't caught on to it, is validation: making sure people are not going to be shocked by the data they see, and that it's going to make sense. Sure, there might be more data, like Jake mentioned a few minutes ago, but for the metrics they care about it should make sense. And yes, we do have a project migration tool that will bring your Workspace projects, your segments, and your metrics into CJA and let you map them to the new dimensions and metrics as they exist in CJA, so you can keep using all of those things. But getting back to data timeliness: what did you find there, how did you deal with the challenges you faced, and are you now in a spot where you feel like you're where you need to be? Yeah, I definitely think we've come a long way since we were originally setting up CJA. We regularly hit our SLAs around latency, but one of the things we did have to communicate, given how we decided to implement CJA at EG, was that there was going to be an increase in latency compared to ÃÛ¶¹ÊÓƵ Analytics, which was relatively real time, because data has to take a lot more steps to actually land in this tool with how we decided to configure it. Latency did increase to a couple of hours, and that's something we were very upfront about: here are all the great things about this move to CJA, but unfortunately data latency is going to increase; we're aware that's potentially a negative, and here's everything else that we think makes it worthwhile. One of the important things we have done is build auto-reconciliation jobs into our pipelines. We send data hourly in batches, we're not streaming data in, so if there are any misalignments due to upstream latency, our data engineers check for that every couple of hours and we resend data back into AEP to make sure we're aligned with the actual dataset we're pulling from in our data platform.
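For readers who want to picture the auto-reconciliation Erika mentions, here is a hedged Python sketch, under the assumption that each hourly batch has a known row count on both sides. The counts and the resend_batch function are placeholders, not Expedia's actual pipeline code.

```python
# Hypothetical hourly reconciliation: compare rows sent with rows that landed,
# and queue any short batches for a resend into AEP.

def resend_batch(batch_id):
    # Placeholder: a real job would re-submit the batch through the ingestion pipeline.
    print(f"re-sending batch {batch_id}")

sent_counts = {"2024-05-01T08": 52310, "2024-05-01T09": 49887, "2024-05-01T10": 51002}
landed_counts = {"2024-05-01T08": 52310, "2024-05-01T09": 47153, "2024-05-01T10": 51002}

for batch_id, sent in sorted(sent_counts.items()):
    landed = landed_counts.get(batch_id, 0)
    if landed < sent:
        print(f"{batch_id}: only {landed} of {sent} rows landed")
        resend_batch(batch_id)
    else:
        print(f"{batch_id}: reconciled ({sent} rows)")
```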
Real quick, I haven't forgotten about you, Jake, but real quick, from the chat: the latency you're talking about was introduced, I believe, by the way you chose to implement CJA, and it isn't something that would be universal to everybody. Is that right? Yeah, that's right. And to take that a little further, Best Buy made the same decisions; we chose to do more with the data before we brought it over to AEP, so we had similar latency considerations. That said, all of our Adswerve clients that are using the ÃÛ¶¹ÊÓƵ tools to capture their data, the Web SDK and that full stack, see about the same latency as we see in ÃÛ¶¹ÊÓƵ Analytics. Awesome, thank you. What about you, Jake? Particularly as you're consulting with Adswerve clients, how do you think about bucketing the tasks, the steps that are required, at a macro level? I don't know that I have much additional there; it's more or less getting the dataset straight, communications, making sure users are aware of the changes to the baselines of some of those KPIs, and then gradually moving over the use cases. Excellent. Trevor, anything to add from your observations? I don't think I have anything as astute as Jake or Erika, other than to say that the companies I observe being most successful have a plan: they have tasks and they organize them. Companies that think they're just going to dive in, turn this on, and have everything work automatically, without prepping anyone or thinking ahead of time about what their goals are, often have a harder time navigating it. So definitely have a plan, have some idea ahead of time of the things you want to accomplish and the buckets of tasks you'll accomplish them with, whatever those are for your company. Absolutely. Okay, this might be my favorite question of the ones we've got for the group: timelines and resources, the rubber meeting the road. What was your goal in terms of, we want to have our organization on CJA, using it more or less exclusively, by X? Was it working backward from a date? Trevor just talked about having a plan; what did that timeline look like? How long did it take you?
Was it longer or shorter than you expected? And what resources did you have to bring in or leverage to make that timeline work? Jake, we'll start with you this time. Yeah. I'll first answer for some of the clients I work with at Adswerve, because it's a different type of evolution when you're using ÃÛ¶¹ÊÓƵ's tools versus collecting your own data and using your own pipelines, as Erika and I have talked about. Generally we can do a switchover in about six months or so: we can put together what the data is going to look like coming in through the Web SDK, start to migrate workspaces over, and start to do that training. For large organizations it sometimes reaches more like a year. For Best Buy specifically, to go into some detail of what we had to go through, we had to rebuild an entire clickstream: we had to rebuild libraries for both mobile app and web to capture this data, and we built our own endpoint, so that took a lot longer. Simultaneously, we were really early adopters of CJA, and it didn't necessarily have the features we needed when we initially did beta testing. I say this to provide some comfort: we went through identifying the must-have features, ruthless prioritization of which features we had to have. Things like merchandising, which is absolutely necessary for ecommerce, and things like deduplication had to be in place, and the team here obviously delivered those, because we were able to make the cutover. So we had probably a full year and a half of just noodling with CJA, testing it, making sure it could meet our needs. From there it was probably another year of getting our data into a place where we could actually execute the cutover, then about three to six months of overlapping tools where we had both ÃÛ¶¹ÊÓƵ Analytics and CJA, and the final three months were more or less final touches to make sure CJA could do everything ÃÛ¶¹ÊÓƵ Analytics did. As far as resources and teams, we had a full data product team of probably 10 to 15 engineers building that out. If you want to go that route, building out your own data collection, it's a very different route, and I'm actually helping a few clients going that way, but you will have to have resources allocated and dedicated to it. As far as the analytics and product teams, they generally spend maybe 10 to 25% of their time on moving over reports, doing training, getting used to the new tool, things like that. That's great. I remember sitting with you on several calls going through lists of things you really needed at Best Buy. I think we've checked all those boxes; I just want to make sure our audience recognizes that we've done a lot to ease that, and I hope you're seeing that with the customers you're now helping on this journey, that the period of waiting for the tool to be able to do everything is over, and it hopefully is doing everything and then some. But I don't want to put words in your mouth. Absolutely. Erika, how about you? What did timelines look like for you? Expedia was also a very early adopter of CJA, which puts you further down this road. How was it different for you than for Jake?
I think there are a lot of similarities. Our total timeline for the migration to CJA took roughly two years, but once again, we were in a very fractured data landscape, so we moved users over to the tool piecemeal. One year was roughly Mixpanel; the next year was ÃÛ¶¹ÊÓƵ Analytics, which was a much larger share of the organization. As I touched on, for each of these it was getting our engineering and our pipelines into a reliable state, configuring CJA for that set of users, and doing that data quality analysis piece. We also allowed a lot of runtime for users: we would get close to a point of configuration, not be completely done, and then start training right away, encouraging people to start using the tool even if we didn't feel fully ready, so they could uncover additional issues that maybe we hadn't found yet. That could be something with our data or something we needed to escalate to ÃÛ¶¹ÊÓƵ, because we were early adopters and there were some functionalities we needed built at the time. We allowed roughly six to eight months of runtime where users had access to both tools, so they could get comfortable with moving over to CJA and expose issues they might have had, giving us time to fix them before we completely revoked the legacy tool. For resources, it's similar to Jake: we have a pod of engineers responsible for the dataset we send into CJA, the data capture. Then we have a set of data engineers whose responsibility is managing the pipelines into CJA, separate from the data capture engineers. Our data product team, where I sit, of course, is responsible for the CJA configuration, the training and enablement, and the data quality and validation. So that's where we ended up: product managers were heavily involved in the alignment process between legacy tools and CJA, and our program team really helped with communications to our stakeholders and with reporting to our leadership teams on how the migration was progressing. For those resources you mentioned, it sounds like some were new to the ÃÛ¶¹ÊÓƵ ecosystem with this move to CJA, but correct me if I'm wrong on that. Was there any selling you had to do to get those teams to contribute to the effort? How did that go? We were lucky enough that the data product team at EG is well versed in the ÃÛ¶¹ÊÓƵ landscape; I think everyone on the team at the time had multiple years of experience with ÃÛ¶¹ÊÓƵ Analytics, so we were already very familiar with the tooling. But it was new primarily to engineers, so we offered training and enablement sessions for them as well. At this point they're some of our heaviest users of CJA, because they'll use it to debug the data they're producing and making available, and they actually gave us our highest NPS scores across the business. We treated the engineers working with us the same as we would stakeholders: here's the value of CJA, here's our vision for this tool moving forward. Awesome. We've got about 10 minutes left, and I want to go to a really open-ended question for everyone.
This is one that all three of you can speak to with whatever comes to mind: practical tips for moving to CJA. Anything we haven't covered that you feel was helpful for you, or Jake, that you're advising or seeing with your clients at Adswerve. Jake, why don't we start with you? Yeah. The first one, and I certainly encounter this in my current role, is getting an idea of what your person ID will be. It's what brings your data together in CJA, and it's one of the first decisions you make when you're setting up your first CJA connection. Think about what IDs you have in your ecosystem, what can potentially unite them, and how you're going to bring that data together with that ID. That's really important.
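As a toy illustration of the person-ID question Jake raises, the following Python sketch collapses device-level events onto a single person using a login crosswalk. The IDs and crosswalk are invented, and CJA's own identity stitching is configured in the connection and in AEP rather than written as code like this; the point is only to show why choosing the uniting ID matters.

```python
# Invented example: three events from two devices resolve to one person
# once a durable customer ID is available to unite them.

events = [
    {"event_id": 1, "device_id": "c-111", "channel": "web"},
    {"event_id": 2, "device_id": "c-222", "channel": "app"},
    {"event_id": 3, "device_id": "c-111", "channel": "web"},
]

# Crosswalk from device-level IDs to a durable customer ID (e.g. from logins).
crosswalk = {"c-111": "cust-9001", "c-222": "cust-9001"}

for event in events:
    # Fall back to the device ID when no durable ID is known for that device.
    event["person_id"] = crosswalk.get(event["device_id"], event["device_id"])

people = {e["person_id"] for e in events}
print(f"{len(events)} events collapse to {len(people)} person(s): {sorted(people)}")
```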
Another one, which I mentioned earlier, is setting expectations: data is processed differently. Attribution and persistence are now applied when you pull the data, so the metrics will look different. Really measure those baselines and, again, overcommunicate why shifts are happening, and have a good go-to explanation. Maybe cross-device sessions are getting collapsed, so you're seeing fewer visits. Maybe you have better bot filtering now because you can take advantage of some of the data cleansing available, or, like we had, maybe you collect more data. Regardless, you really have to communicate that the data is going to change; you're not going to match exactly what you had in ÃÛ¶¹ÊÓƵ Analytics. And I mentioned this earlier too: expand your user base, bring in new people, get them excited, because there's potentially new data coming in. You don't want the perception that you're taking their data from them or that you're going to do the analysis that they do; you want them bought in and excited. Yeah, this is a theme Trevor and I have run up against many times. This is not about taking something off of someone's plate; it's about freeing them up to do more of what only they can do, rather than things that are completely possible in a tool like CJA. Erika, what about you? Practical tips, anything that comes to mind? I'll of course echo everything Jake said; all of those tips resonate with our experience. Some others, within configuration: one thing that's been helpful for us is to keep a copy of our production data view, which we call our QA data view. It's not a separate dataset, it's just a copy of our production data view. Any time we want to change the persistence or the binding of a dimension, or change attribution, we apply it in that QA data view first to see the impact across key business metrics, because there are times we've made changes to settings and they've had unintended consequences on a larger portion of data than we were expecting. Those settings are extremely powerful, but they can have unintended consequences, so make sure you understand them. Another benefit of CJA, in my opinion, is the ability to easily run scream tests without changing your data at all. When we want to exclude high-volume events that we don't think have a lot of value in CJA, we'll apply a quick filter to our production data view excluding them and basically run a scream test for two weeks: does anyone complain about the absence of these events? If no one does, we can exclude them from our pipeline. So it's much easier to do that kind of testing for our users. But I would say always test any configuration change you make, because we've run into cases where we deployed something and then got a ton of questions like, what happened, our historical trends have completely changed. So testing is very important. That's a good thing to highlight about how different CJA is: you make a change in ÃÛ¶¹ÊÓƵ Analytics and it applies from that point forward, but you make a change in CJA and it applies to the whole history. You restate the whole thing, which is really powerful and amazing, you can do attribution things you couldn't do before across the full historical dataset, but it can also be dangerous. Which actually leads to an unplanned question I wanted to throw in, around cool things you've been able to do in CJA with your organization fully onboarded, or even before it was. When someone asks, what is a cool, powerful use case or moment of value that comes to mind? Jake, start with you, maybe from your time at Best Buy, or something you've seen, shared anonymously, where a customer had a light-bulb moment. It's hard not to talk about derived fields. I describe it like a DeLorean: you can go back in time, make changes, and then immediately come back and see how they affected the outcomes in the data. Having that ability in derived fields to edit data, update it, process it differently, and get that immediate feedback is really cool. Yeah, re-sequencing is another that comes to mind: if you're bringing in other channels of data, not needing those to arrive in order the way they would in ÃÛ¶¹ÊÓƵ Analytics, where we tried to shoehorn it in. That's a great one. Erika, does anything else come to mind for you? Yeah, one of the cool things we've been able to do is get a better picture of cancellation data, which is vitally important when we're talking about lodging, air bookings, and activity bookings. Even if a cancellation happens offline six months after the booking, we can stitch it to that original session and understand that the order was eventually canceled, adding that to the historical data so we have a better understanding of what actually ended up happening with a specific booking or order. The other one is a little broader: we've seen so many new functions start using CJA since we rolled this out, teams that are not the most data savvy, and by that I mean they're not writing intense SQL queries all the time, now able to go in and self-serve more than ever before.
So it's been really cool to see new functions, like our media insights teams, using the tool and coming to us with questions. Awesome. I'm going to ask one more question from the chat, specifically for Erika, although Jake, I'm interested in your take as well. How many existing reports or dashboards needed to be updated? Was that a fairly simple task for most, or did it become a resource challenge to update or rebuild them? We mostly put the onus on end users to recreate their reports, because we positioned it as a great learning opportunity for getting familiar with Customer Journey Analytics. EG is in a situation every company may not experience: we did not move our ÃÛ¶¹ÊÓƵ Analytics data over to CJA. I know there are automated tools you can leverage if you do implement it that way, but those were not around when we were migrating reports, so for us it was a manual process. I manage a team, and some of their tasks were looking at important dashboards and recreating those, as training opportunities for them. We also held sessions for people who had their own reports; a lot of the time they were analytics individuals, so we would expect them to have the capability to recreate those reports. One of the learning materials we created to assist with that was an entire data mapping document. We took every prop, eVar, and metric in ÃÛ¶¹ÊÓƵ Analytics and built a searchable table that said, okay, you used this eVar in ÃÛ¶¹ÊÓƵ Analytics, you need to use this dimension in CJA, to facilitate that report building. But it will come down to what your resourcing looks like across the business and how you're implementing CJA.
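A data mapping document like the one Erika describes can be as simple as a lookup table. The sketch below uses made-up example mappings, not Expedia's actual components, to show the idea of pointing report builders from a legacy ÃÛ¶¹ÊÓƵ Analytics component to its CJA dimension or metric.

```python
# Made-up mapping entries from legacy ÃÛ¶¹ÊÓƵ Analytics components to CJA names.
aa_to_cja = {
    "eVar21": "Internal Search Term (dimension)",
    "prop11": "Page Name (dimension)",
    "event5": "Bookings (metric)",
}

def lookup(component):
    return aa_to_cja.get(component, "no mapping documented yet")

for legacy in ("eVar21", "prop11", "event99"):
    print(f"{legacy} -> {lookup(legacy)}")
```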
Awesome. Trevor, any quick practical tips on moving to CJA? Maybe just one brief one. A lot of folks ask me, well, we use Data Feeds extensively; what should we do now in the world of CJA? Unfortunately, that's actually kind of complicated and specific to your circumstance, but in general, a lot of what I've noticed is that companies tend to use CJA itself for the things they used to rely on Data Feeds for, whether that's cross-channel analysis or merging with other data. CJA is just an easier way to do that than trying to recreate sessions and such from a data feed file. On top of that, CJA has a lot of new features that help facilitate data integration with common BI flows you might already have, the kind you might be using Data Feeds for. We have our full table export capability, which in my opinion is ten times better than the Data Warehouse product we used to have in ÃÛ¶¹ÊÓƵ Analytics; you have the ability to export the raw data from AEP; and next month you'll be able to connect directly to common BI visualization tools like Tableau or Power BI. A lot of what we're focusing on on the product side for CJA is helping our customers better integrate CJA into their existing flows, meeting our customers where they are; to echo something Erika said earlier, that's something we're trying to do with the product as well. So when you start thinking about how CJA intersects with the various other data flows and reports in your organization, be thinking about those use cases and how CJA can help improve and give additional insight to existing users, existing workflows, and existing dashboards at your company. That's a tip I would give. Awesome. I'll share one that Erika didn't mention: when I watched Erika train her organization on CJA, I was blown away by the organization and the thought that had been put into a wiki space with frequently asked questions. The image that sticks in my head is a calendar of trainings where people could click and, I think, sign up; if not, that would be a great thing to add. Just having a place people can go to find resources and get answers to questions, in that same theme of overcommunicating, was a really good practical thing that you all did, Erika, and thank you for letting me sit in on that a few years ago. We're going to do one more question, not rushing, but at a good pace. Last question: if an executive asked you how CJA provided value to your organization, how would you describe the business impact? I think we started with Jake last time, so Erika, how would you describe the business impact of CJA to a senior leader? Sure. I think the most important thing is democratization of data. We position CJA within our suite of data products as having the lowest barrier to entry: it's a no-code solution for analyzing this data, so with a little bit of training, anyone, whether they're in a data-focused role or not, can leverage this tool to start pulling insights. Ideally you want almost everyone in your business making data-driven decisions, and this is something that can empower individuals who don't know how to write SQL to start making data-backed decisions for the business. Excellent. Thank you. Jake? I'd say, similarly, the focus on rapid insight generation, being able to explore data and find those insights. As we implemented our cloud data platform, there was a lot of hesitance and worry about how much a query was going to cost and how much data people were processing within the platform; they see that right in front of them. A lot of that melts away when you give them CJA. They can explore, look at billions of rows of data, do whatever they want, and find those insights without having to worry about what they're going to incur. Excellent. Hopefully that's helpful for everybody. Let's move to our key takeaways as we wrap things up. I think we've hit these three points thoroughly; there were probably 44 different takeaways we could have put here, because Erika, Jake, and Trevor have done an amazing job sharing their wisdom and experience with us. First, you do need to understand the data. You have a lot more flexibility and control of the data, and as Jake focused on particularly early in the session, understanding how to model data with field groups in XDM, understanding identity, and understanding the scope of events available to you is a big area of focus. Erika, I loved how you hit on training and communication.
Overcommunicate the how and the why: this is going to give us a better view of the customer end to end and allow us a lot more control over our data, the way it's displayed, and the way we work with it. And number three, figure out a minimum viable version of CJA and start getting data in, but don't try to boil the ocean; don't try to get all of your datasets in at the same time before starting to put people in CJA. Again, we could have had about 40 different things on this slide, but those are three of my favorite takeaways from this discussion. We've tried to answer as many questions as we could. I know we're a little bit over time, so we will take the questions you've asked and, for those we can, document answers in blog form or on Experience League as best we can. Maybe we need a version two of this webinar, so we'll look at possibly doing something else down the road and exploring some of the topics you brought up in the chat. For now, we've got some additional resources; these are also reflected in the resources pod Emily mentioned at the beginning. Erika has spoken about Expedia's experience with CJA, so you can watch that. Jake gave a great Summit session on data doomsday and not being scared of the future of data, so you can watch that as well. And there's a blog post from Jake on what to expect as you're migrating to CJA. With that, we're going to say goodbye. I want to thank our presenters again, Jake, Erika, and Trevor, and I'm going to hand it back to Emily for some final housekeeping. Thank you all for joining. Emily? Thank you, Ben, and thank you for being our host. It was an excellent and very engaging session. Erika, Jake, we certainly appreciate your expertise and thank you for joining us. Trevor, same to you. This concludes today's session. To all of you in the audience, we appreciate your participation and thank you for your thoughtful time and attention. On behalf of ÃÛ¶¹ÊÓƵ, we wish you a great rest of your day.

Key Takeaways

  • Gain a deep understanding of the data, including modeling data with field groups and understanding identity and event scope.
  • Overcommunicate the purpose and benefits of Customer Journey Analytics (CJA) to stakeholders and provide training to empower individuals to make data-driven decisions.
  • Start with a minimum viable version and focus on a specific use case or dataset rather than trying to migrate everything at once.
  • Customer Journey Analytics offers more flexibility and control over data, allowing for rapid insight generation and the ability to explore and analyze data without worrying about cost or data processing limitations.
  • Democratize data by providing a no-code solution for analyzing data, empowering individuals across the organization to make data-backed decisions.
  • Consider the challenges faced by the organization, such as the need for data flexibility, integrating customer touchpoint data, and turning insights into actions.
  • Define clear goals for the transition to Customer Journey Analytics, such as adoption metrics and deprecating legacy tools.
  • Tasks for the transition include configuration, data quality, training, and communication.
  • Plan and have a clear timeline for a successful transition to Customer Journey Analytics.