2.5.5 Forward events to AWS Kinesis & AWS S3

IMPORTANT
Completion of this exercise is optional, and a cost is involved to use AWS Kinesis. While AWS provides a free tier that lets you test and configure many services at no cost, AWS Kinesis isn’t part of it, so implementing and testing this exercise will incur charges.

Good to know

  • Adobe Experience Platform supports various Amazon services as destinations.
  • Kinesis and S3 are both profile export destinations and can be used as part of Adobe Experience Platform’s Real-Time CDP.
  • You can easily feed high-value segment events and associated profile attributes into your systems of choice.

In this exercise, you’ll learn how to set up your own Amazon Kinesis stream to stream event data from the Adobe Experience Platform Edge ecosystem to a cloud storage destination, such as Amazon S3. This is useful if you’d like to collect experience events from web and mobile properties and push them into your data lake for analysis and operational reporting. Data lakes generally ingest data in batches through large daily file imports, and they don’t expose a public HTTP endpoint that could be used in conjunction with event forwarding.

Supporting the above use cases implies that streamed data needs to be buffered or queued before being written to a file. Care has to be taken not to open a file for write access from multiple processes. Delegating this task to a dedicated system is ideal: it scales well while ensuring a great level of service. This is where Kinesis comes to the rescue.
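
To make the buffering idea concrete, here’s a toy sketch in plain Python (no AWS involved) of the pattern that Kinesis and Firehose implement at scale: producers enqueue events, and a single writer drains the queue and writes them to a file in batches.

    import json
    import queue

    events = queue.Queue()

    # Producers (e.g. web and mobile hits) enqueue events
    # instead of each writing to a file directly.
    for i in range(1000):
        events.put({"eventId": i})

    # A single writer drains the queue and writes one batch file, so no
    # two processes ever hold the same file open for write access.
    batch = []
    while not events.empty():
        batch.append(events.get())

    with open("batch-0001.json", "w") as f:
        f.write("\n".join(json.dumps(e) for e in batch))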

Amazon Kinesis Data Streams focuses on ingesting and storing data streams. Kinesis Data Firehose focuses on delivering data streams to select destinations, such as S3 buckets.

As part of this exercise, you’ll…

  • Perform a basic setup of a Kinesis data stream
  • Create a Firehose delivery stream and use an S3 bucket as its destination
  • Configure Amazon API Gateway as a REST API endpoint to receive your event data
  • Forward raw event data from Adobe’s Edge to your Kinesis stream

Configure your AWS S3 bucket

Go to console.aws.amazon.com and sign in with your Amazon account.

After logging in, you’ll be redirected to the AWS Management Console.

In the Find Services menu, search for s3. Click the first search result: S3 - Scalable Storage in the Cloud.

You’ll then see the Amazon S3 homepage. Click Create Bucket.

In the Create bucket screen, configure the following:

  • Name: use the name eventforwarding---aepUserLdap--.

Leave all the other default settings as they are. Scroll down and click Create bucket.

You’ll then see your bucket being created and will be redirected to the Amazon S3 homepage.
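
If you prefer scripting this step, a minimal boto3 sketch follows; the placeholder bucket name and region are assumptions to replace with your own values.

    import boto3

    BUCKET = "eventforwarding-<aepUserLdap>"  # assumed placeholder name
    REGION = "us-east-1"                      # assumed region

    s3 = boto3.client("s3", region_name=REGION)

    # us-east-1 is the only region that must omit a LocationConstraint.
    if REGION == "us-east-1":
        s3.create_bucket(Bucket=BUCKET)
    else:
        s3.create_bucket(
            Bucket=BUCKET,
            CreateBucketConfiguration={"LocationConstraint": REGION},
        )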

Configure your AWS Kinesis Data Stream

In the Find Services menu, search for kinesis. Click the first search result: Kinesis - Work with Real-Time Streaming Data.

Select Kinesis Data Streams. Click Create data stream.

For the Data stream name, use --aepUserLdap---datastream.

There’s no need to change any of the other settings. Scroll down and click Create data stream.

You’ll then see this. Once your data stream is successfully created, you can move forward to the next exercise.
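
If you’d rather script this step, here’s a minimal boto3 sketch; the placeholder stream name and region are assumptions to replace with your own values, and the on-demand capacity mode matches the console default.

    import boto3

    STREAM = "<aepUserLdap>-datastream"  # assumed placeholder name
    kinesis = boto3.client("kinesis", region_name="us-east-1")

    # On-demand capacity mode matches the console default.
    kinesis.create_stream(
        StreamName=STREAM,
        StreamModeDetails={"StreamMode": "ON_DEMAND"},
    )

    # Block until the stream becomes ACTIVE.
    kinesis.get_waiter("stream_exists").wait(StreamName=STREAM)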

Configure your AWS Firehose Delivery Stream

In the Find Services menu, search for kinesis. Click Kinesis Data Firehose.

Click Create Firehose stream.

For Source, select Amazon Kinesis Data Streams. For Destination, select Amazon S3. Click Browse to select your data stream.

Select your data stream. Click Choose.

You’ll then see this. Note the Firehose stream name, as you’ll need it later.

Scroll down until you see Destination Settings. Click Browse to select your S3 bucket.

Select your S3 bucket and click Choose.

You’ll then see something like this. Update the following settings:

  • New line delimiter: set to Enabled
  • Dynamic partitioning: set to Not enabled

Scroll down a bit more and click Create Firehose stream.

After a couple of minutes, your Firehose stream will be created and Active.
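
For reference, the equivalent Firehose stream could be created with boto3 along these lines. This is a hedged sketch: the account ID, role ARN, and names below are placeholders, and unlike the console, the API requires you to supply an existing IAM role that can read the data stream and write to the bucket.

    import boto3

    REGION = "us-east-1"      # assumed region
    ACCOUNT = "123456789012"  # placeholder account id
    STREAM_ARN = f"arn:aws:kinesis:{REGION}:{ACCOUNT}:stream/<aepUserLdap>-datastream"
    ROLE_ARN = f"arn:aws:iam::{ACCOUNT}:role/firehose_delivery_role"  # hypothetical role

    firehose = boto3.client("firehose", region_name=REGION)
    firehose.create_delivery_stream(
        DeliveryStreamName="<aepUserLdap>-firehose",
        DeliveryStreamType="KinesisStreamAsSource",
        KinesisStreamSourceConfiguration={
            "KinesisStreamARN": STREAM_ARN,
            "RoleARN": ROLE_ARN,
        },
        ExtendedS3DestinationConfiguration={
            "RoleARN": ROLE_ARN,
            "BucketARN": "arn:aws:s3:::eventforwarding-<aepUserLdap>",
            # Mirror the console defaults; the new line delimiter toggled in
            # the console maps to an extra processing configuration here.
            "BufferingHints": {"IntervalInSeconds": 300, "SizeInMBs": 5},
        },
    )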

Create IAM user

Go to the AWS IAM service. In the left menu, click Users. You’ll then see the Users screen. Click Create user.

Next, configure your user:

  • User Name: use --aepUserLdap--_kinesis_forwarding

Click Next.

You’ll then see this permissions screen. Click Attach policies directly.

Enter the search term kinesisfirehose to see all related policies. Select the policy AmazonKinesisFirehoseFullAccess. Scroll down and click Next.

Review your configuration. Click Create User.

You’ll then see this. Click View User.

Click Add permissions and click Create inline policy.

You’ll then see this. Select the service Kinesis.

Go to Write and check the checkbox for PutRecord.

Scroll down to Resources and select All. Click Next.

Name your policy like this: Kinesis_PutRecord and click Create policy.
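
For reference, the inline policy you just built in the visual editor boils down to a small JSON document. Here’s a sketch of attaching it directly with boto3, using the user and policy names from this exercise:

    import json
    import boto3

    iam = boto3.client("iam")

    # Same permissions as the visual editor selection:
    # allow kinesis:PutRecord on all resources.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {"Effect": "Allow", "Action": "kinesis:PutRecord", "Resource": "*"}
        ],
    }

    iam.put_user_policy(
        UserName="<aepUserLdap>_kinesis_forwarding",
        PolicyName="Kinesis_PutRecord",
        PolicyDocument=json.dumps(policy),
    )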

You’ll then see this. Click Security credentials.

Click Create access key.

Select Application running outside AWS. Scroll down and click Next.

Click Create access key.

You’ll then see this. Click Show to see your Secret access key:

Your Secret access key is now being shown.

IMPORTANT
Store your credentials in a text file on your computer.
  • Access key ID: …
  • Secret access key: …
Once you click Done, you’ll never see your credentials again!

Click Done.

You’ve now successfully created an IAM user with the proper permissions, which you’ll need to specify when configuring the AWS extension in your Event Forwarding property.
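
Before configuring the extension, you can sanity-check the new credentials with a quick boto3 smoke test. The key values, region, and stream name below are placeholders for the values you saved and created earlier:

    import json
    import boto3

    session = boto3.Session(
        aws_access_key_id="AKIA...",   # placeholder: your access key ID
        aws_secret_access_key="...",   # placeholder: your secret access key
        region_name="us-east-1",
    )

    kinesis = session.client("kinesis")
    resp = kinesis.put_record(
        StreamName="<aepUserLdap>-datastream",
        Data=json.dumps({"smokeTest": True}).encode("utf-8"),
        # Same partition key the Event Forwarding action will use later.
        PartitionKey="0",
    )
    print("Record accepted by shard:", resp["ShardId"])

If put_record succeeds, the IAM user and its inline policy are working.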

Update your Event Forwarding property: Extension

With your IAM user and its credentials in place, you can now set up the AWS extension in your Event Forwarding property.

Go to Adobe Experience Platform Data Collection, go to Event Forwarding, and open your Event Forwarding property.

Next, go to Extensions and open the Catalog tab. Click the AWS extension and click Install.

Enter the IAM user credentials that you generated in the previous exercise. Click Save.

Next, you need to configure a rule that starts forwarding event data to Kinesis.

Update your Event Forwarding property: Rule

In the left menu, go to Rules. Click to open the rule All Pages which you created in one of the previous exercises.

You’ll then see this. Click the + icon to add a new action.

You’ll then see this. Make the following selections:

  • Select the Extension: AWS
  • Select the Action Type: Send Data to Kinesis Data Stream
  • Name: AWS - Send Data to Kinesis Data Stream

You should now see this:

Next, configure the following:

  • Stream Name: --aepUserLdap---datastream
  • AWS Region: check your region in your AWS Data Stream setup
  • Partition Key: 0

You can see your AWS Region here:

You should now have this. Next, click the data element icon for the Data field.

Select XDM Event and click Select.

You’ll then see this. Click Keep Changes.

You’ll then see this. Click Save.

Go to Publishing Flow to publish your changes.
Open your Development library by clicking Main.

Click the Add All Changed Resources button, after which you’ll see your Rule and Data Element changes appear in this library. Next, click Save & Build for Development. Your changes are now being deployed.

After a couple of minutes, you’ll see that the deployment is done and ready to be tested.

Test your configuration

Go to the demo system portal. After logging in with your Adobe ID, you’ll see this. Click the 3 dots (…) on your website project and then click Run to open it.

You’ll then see your demo website open up. Select the URL and copy it to your clipboard.

Open a new incognito browser window.

Paste the URL of your demo website, which you copied in the previous step. You’ll then be asked to log in using your Adobe ID.

Select your account type and complete the login process.

You’ll then see your website loaded in an incognito browser window. For every exercise, you’ll need to use a fresh, incognito browser window to load your demo website URL.

Switch your view to AWS. By opening your data stream and going into the Monitoring tab, you’ll now see incoming traffic.
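
Besides the Monitoring tab, you can pull a few records straight off the stream to inspect what Event Forwarding is sending. A sketch, assuming the single-shard setup and the names used in this exercise:

    import boto3

    STREAM = "<aepUserLdap>-datastream"  # assumed name
    kinesis = boto3.client("kinesis", region_name="us-east-1")

    # Assume a single shard and read from the oldest available record.
    shard_id = kinesis.describe_stream(StreamName=STREAM)[
        "StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]

    for record in kinesis.get_records(ShardIterator=iterator, Limit=5)["Records"]:
        print(record["Data"][:200])  # first bytes of each forwarded XDM payload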

When you then open your Data Firehose stream and go into the Monitoring tab, you’ll also see incoming traffic.

Finally, when you have a look at your S3 bucket, you’ll now notice files being created there as a consequence of your data ingestion.

When you download such a file and open it using a text editor, you’ll see that it contains the XDM payload from the events that were forwarded.
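
Instead of downloading files by hand, you can also list and inspect them with boto3; the bucket name is the same assumed placeholder as before:

    import boto3

    BUCKET = "eventforwarding-<aepUserLdap>"  # assumed bucket name
    s3 = boto3.client("s3")

    # Firehose writes objects under a yyyy/mm/dd/hh key prefix by default.
    objects = s3.list_objects_v2(Bucket=BUCKET).get("Contents", [])
    for obj in objects[:5]:
        print(obj["Key"], obj["Size"])

    if objects:
        body = s3.get_object(Bucket=BUCKET, Key=objects[0]["Key"])["Body"].read()
        # Start of the newline-delimited XDM events in the first file.
        print(body.decode("utf-8")[:500])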

IMPORTANT
Once your setup is working as expected, don’t forget to delete your AWS Kinesis Data Stream and Firehose stream to avoid being charged!

Next Step: Summary and benefits

Go Back to Module 2.5

Go Back to All Modules
