2.6 Stream data from Apache Kafka into Adobe Experience Platform
In this module, you’ll learn how to set up your own Apache Kafka cluster, define topics, producers, and consumers, and stream data into Adobe Experience Platform using the Adobe Experience Platform Sink Connector through Kafka Connect.
Learning Objectives
- Perform a basic setup of a local Kafka cluster
- Create a Kafka topic, use a Kafka producer and a Kafka consumer
- Configure Kafka Connect and the Adobe Experience Platform Sink Connector
- Manually produce events and see those events get ingested into Adobe Experience Platform
- Use an existing Twitter producer library with Kafka Connect to stream Twitter data into Adobe Experience Platform
Prerequisites
- Java JDK 23 or later needs to be installed on your computer; you can download the JDK here:
- Access to Adobe Experience Platform
Exercises
2.6.1 Introduction to Apache Kafka
In this exercise, you’ll learn the basics of Apache Kafka: brokers, topics, producers, and consumers.
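To make the producer concept concrete before you touch the cluster, here is a minimal sketch of a Java producer using the standard Kafka client library. The broker address, topic name (`aep-demo`), and record contents are illustrative assumptions, not values prescribed by the exercise.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class HelloKafkaProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumption: a single local broker listening on the default port
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish a single record to the hypothetical topic "aep-demo"
            producer.send(new ProducerRecord<>("aep-demo", "key-1", "hello from Kafka"));
            producer.flush();
        }
    }
}
```

A consumer works the same way in reverse: it subscribes to one or more topics and polls the broker for new records.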
2.6.2 Install and configure your Kafka cluster
In this exercise, you’ll download, install, and configure a basic Apache Kafka cluster.
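Once the cluster is running, you can verify it programmatically. The sketch below uses Kafka’s AdminClient to create a topic and list the topics the broker knows about; the broker address and topic name are the same illustrative assumptions as above.

```java
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Assumption: single-node local cluster on the default port
        props.put("bootstrap.servers", "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // One partition and replication factor 1 are enough for a local sandbox
            NewTopic topic = new NewTopic("aep-demo", 1, (short) 1);
            admin.createTopics(List.of(topic)).all().get();
            System.out.println("Topics on this cluster: " + admin.listTopics().names().get());
        }
    }
}
```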
2.6.3 Configure HTTP API Streaming endpoint in Adobe Experience Platform
In this exercise, you’ll configure an HTTP API Source Connector in Adobe Experience Platform.
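For reference, streaming ingestion into Adobe Experience Platform ultimately comes down to POSTing JSON to the inlet URL that this source connection generates. The sketch below shows such a call with Java’s built-in HttpClient; the inlet URL, schema reference, and payload shape are placeholders, and the exact structure depends on the schema and dataset you configure in this exercise.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SendToStreamingInlet {
    public static void main(String[] args) throws Exception {
        // Placeholder: the real inlet URL is shown after you create the HTTP API source connection
        String inletUrl = "https://dcs.adobedc.net/collection/YOUR_CONNECTION_ID";

        // Illustrative payload only; the real body must match your XDM schema
        String body = """
            {
              "header": {
                "schemaRef": {
                  "id": "YOUR_SCHEMA_ID",
                  "contentType": "application/vnd.adobe.xed-full+json;version=1"
                }
              },
              "body": {
                "xdmEntity": { "personID": "12345" }
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder(URI.create(inletUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```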
2.6.4 Install and configure Kafka Connect and the Adobe Experience Platform Sink Connector
In this exercise, you’ll use Kafka Connect to install and configure the Adobe Experience Platform Sink Connector, and you’ll manually send events into Adobe Experience Platform.
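Kafka Connect exposes a REST API (port 8083 by default) through which connectors are registered. The sketch below registers the Adobe Experience Platform Sink Connector by POSTing a JSON configuration; the connector name, class, configuration keys, and values shown are assumptions based on the connector’s public documentation, so use the values given in the exercise instructions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterAepSinkConnector {
    public static void main(String[] args) throws Exception {
        // Kafka Connect's REST API listens on port 8083 by default
        String connectUrl = "http://localhost:8083/connectors";

        // Assumed connector class and keys; replace the placeholders with your own values
        String connectorConfig = """
            {
              "name": "aep-sink-connector",
              "config": {
                "connector.class": "com.adobe.platform.streaming.sink.impl.AEPSinkConnector",
                "topics": "aep-demo",
                "tasks.max": "1",
                "aep.endpoint": "https://dcs.adobedc.net/collection/YOUR_CONNECTION_ID"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder(URI.create(connectUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(connectorConfig))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

Once the connector is running, every record produced to the configured topic is forwarded to the streaming inlet and ingested into Adobe Experience Platform.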
The module concludes with a summary and an overview of the benefits.