2.6.4 Install and configure Kafka Connect and the Adobe Experience Platform Sink Connector
Download the Adobe Experience Platform Sink Connector
Download the latest official release of the Adobe Experience Platform Sink Connector, the file streaming-connect-sink-0.0.27-java-11.jar.
Place the downloaded file, streaming-connect-sink-0.0.27-java-11.jar, on your desktop.
Configure Kafka Connect
Go to the folder on your desktop named Kafka_AEP and navigate to the folder kafka_2.13-3.9.0/config.
In that folder, open the file connect-distributed.properties using any Text Editor.
In your Text Editor, go to lines 34 and 35 and make sure the fields key.converter.schemas.enable and value.converter.schemas.enable are set to false:
key.converter.schemas.enable=false
value.converter.schemas.enable=false
Save your changes to this file.
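If you want to double-check the change from the command line, a quick grep run from the kafka_2.13-3.9.0 folder should show both properties set to false (this is an optional check, not part of the exercise):
grep -n "schemas.enable" config/connect-distributed.properties
# Expected output (line numbers may vary between Kafka releases):
# 34:key.converter.schemas.enable=false
# 35:value.converter.schemas.enable=false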
Next, go back to the folder kafka_2.13-3.9.0 and manually create a new folder named connectors.
Right-click the new folder and click New Terminal at Folder.
You'll then see this. Enter the command pwd to retrieve the full path of that folder. Select the full path and copy it to your clipboard.
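If you prefer the Terminal for this step, the folder creation and the pwd lookup look like this (the path in the example output is only an illustration; yours will contain your own user name):
# From a Terminal opened at the kafka_2.13-3.9.0 folder:
mkdir -p connectors
cd connectors
pwd
# Example output: /Users/<your-user>/Desktop/Kafka_AEP/kafka_2.13-3.9.0/connectors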
Go back to your Text Editor, to the file connect-distributed.properties, and scroll down to the last line (line 89 in the screenshot). Uncomment the line that starts with # plugin.path= (remove the #) and paste in the full path to the folder named connectors. The result should look similar to this:
plugin.path=/Users/woutervangeluwe/Desktop/Kafka_AEP/kafka_2.13-3.9.0/connectors
Save your changes to the file connect-distributed.properties and close your Text Editor.
Next, copy the latest official release of the Adobe Experience Platform Sink Connector that you downloaded into the folder named connectors. The file you downloaded earlier is named streaming-connect-sink-0.0.27-java-11.jar; you can simply move it into the connectors folder.
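From a Terminal, moving the jar would look something like this (a sketch; adjust the paths to match where you actually saved the file and where the Kafka folder lives):
# Move the sink connector jar from the desktop into the connectors folder
mv ~/Desktop/streaming-connect-sink-0.0.27-java-11.jar ~/Desktop/Kafka_AEP/kafka_2.13-3.9.0/connectors/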
Next, open a new Terminal window at the level of the kafka_2.13-3.9.0 folder. Right-click that folder and click New Terminal at Folder.
In the Terminal window, paste this command: bin/connect-distributed.sh config/connect-distributed.properties
and press Enter. This command starts Kafka Connect and loads the library of the Adobe Experience Platform Sink Connector.
After a couple of seconds, you'll see something like this:
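Kafka Connect also exposes a REST API, by default on port 8083. If you want to verify from the command line that the worker is up and that it loaded the sink connector from the connectors folder, you can query the connector-plugins endpoint (a quick, optional check):
# List the connector plugins this Kafka Connect worker has loaded
curl http://localhost:8083/connector-plugins
# The Adobe Experience Platform Sink Connector should appear in the returned JSON array.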
Create your Adobe Experience Platform Sink Connector using Postman
You can now interact with Kafka Connect using Postman. To do so, download this Postman Collection and uncompress it onto your desktop. You'll then have a file called Kafka_AEP.postman_collection.json.
You need to import this file into Postman. To do so, open Postman, click Import, drag and drop the file Kafka_AEP.postman_collection.json into the popup, and click Import.
You'll then find this collection in the left menu of Postman. Click the first request, GET Available Kafka Connect connectors, to open it.
You'll then see this. Click the blue Send button, after which you should see an empty response []. The response is empty because no Kafka Connect connectors are currently defined.
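Behind the scenes this Postman request simply calls the Kafka Connect REST API; the same check can be done from a Terminal:
# List the currently defined connectors; the response is [] as long as none exist
curl http://localhost:8083/connectors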
To create a connector, click to open the second request in the Kafka collection, POST Create AEP Sink Connector, and go to Body. You'll then see this. On line 11, where it says "aep.endpoint": "", you need to paste in the HTTP API Streaming endpoint URL that you received at the end of one of the previous exercises. The HTTP API Streaming endpoint URL looks like this: https://dcs.adobedc.net/collection/63751d0f299eeb7aa48a2f22acb284ed64de575f8640986d8e5a935741be9067.
After pasting it, the body of your request should look like this. Click the blue Send button to create your connector. You'll get an immediate response confirming the creation of your connector.
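For reference, the Postman request is a POST to the /connectors endpoint of Kafka Connect with a name and a config block. The sketch below is only illustrative: the connector name, class, and configuration keys are assumptions based on a typical setup, so use the exact body shipped in the Postman collection for the real values:
# Illustrative only; the name, connector class, and config keys are assumptions.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "aep-sink-connector",
    "config": {
      "connector.class": "com.adobe.platform.streaming.sink.impl.AEPSinkConnector",
      "topics": "aep",
      "tasks.max": "1",
      "aep.endpoint": "https://dcs.adobedc.net/collection/<your-endpoint-id>"
    }
  }'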
Click the first request, GET Available Kafka Connect connectors, to open it again and click the blue Send button again. You'll now see that a Kafka Connect connector exists.
Next, open the third request in the Kafka collection, GET Check Kafka Connect Connector Status. Click the blue Send button; you'll then get a response like the one below, stating that the connector is running.
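This status request maps to the standard Kafka Connect /status endpoint; the equivalent from a Terminal would be (replace <connector-name> with the name used in the POST request body):
# Check the state of the connector and its tasks
curl http://localhost:8083/connectors/<connector-name>/status
# A healthy connector reports "state": "RUNNING" for both the connector and its task.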
Produce an experience event
Open a new Terminal window by right-clicking your folder kafka_2.13-3.9.0 and clicking New Terminal at Folder.
Enter the following command:
bin/kafka-console-producer.sh --broker-list 127.0.0.1:9092 --topic aep
You'll then see this. Every new line followed by pressing the Enter button will result in a new message being sent into the topic aep.
You can now send a message, which will be consumed by the Adobe Experience Platform Sink Connector and ingested into Adobe Experience Platform in real time.
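Note that --broker-list is the older name for this option; recent Kafka releases prefer --bootstrap-server, so if the producer command complains about the flag, this equivalent form should work:
bin/kafka-console-producer.sh --bootstrap-server 127.0.0.1:9092 --topic aep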
Take the below sample experience event payload and copy it into a Text Editor.
{
  "header": {
    "datasetId": "61fe23fd242870194a6d779c",
    "imsOrgId": "--aepImsOrgID--",
    "source": {
      "name": "Launch"
    },
    "schemaRef": {
      "id": "https://ns.adobe.com/experienceplatform/schemas/b0190276c6e1e1e99cf56c99f4c07a6e517bf02091dcec90",
      "contentType": "application/vnd.adobe.xed-full+json;version=1"
    }
  },
  "body": {
    "xdmMeta": {
      "schemaRef": {
        "id": "https://ns.adobe.com/experienceplatform/schemas/b0190276c6e1e1e99cf56c99f4c07a6e517bf02091dcec90",
        "contentType": "application/vnd.adobe.xed-full+json;version=1"
      }
    },
    "xdmEntity": {
      "eventType": "callCenterInteractionKafka",
      "_id": "",
      "timestamp": "2024-11-25T09:54:12.232Z",
      "_experienceplatform": {
        "identification": {
          "core": {
            "phoneNumber": ""
          }
        },
        "interactionDetails": {
          "core": {
            "callCenterAgent": {
              "callID": "Support Contact - 3767767",
              "callTopic": "contract",
              "callFeeling": "negative"
            }
          }
        }
      }
    }
  }
}
You'll then see this. You need to manually update 3 fields:
- _id: please set it to a random id, something like --aepUserLdap--1234
- timestamp: update the timestamp to the current date and time
- phoneNumber: enter the phoneNumber of the account that was created earlier on the demo website. You can find it on the Profile Viewer panel under Identities.
You also need to check and maybe update these fields:
- datasetId: you need to copy the Dataset ID for the dataset Demo System - Event Dataset for Call Center (Global v1.1)
- imsOrgId: your IMS Org ID is --aepImsOrgId--
You should then have something like this:
Next, copy your full experience event to your clipboard. The whitespace of your JSON payload needs to be stripped; you can use an online whitespace-removal tool to do that.
Paste your experience event into the editor and click Remove white space.
Next, select all of the output text and copy it to your clipboard.
Go back to your Terminal window.
Paste the new payload without whitespace into the Terminal window and press Enter.
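As an alternative to the online tool, if you have jq installed you can strip the whitespace and produce the message in one step. This is just a sketch and assumes you saved the payload to a file named event.json on your desktop and run the command from the kafka_2.13-3.9.0 folder:
# Compact the JSON to a single line and pipe it straight into the aep topic
jq -c . ~/Desktop/event.json | bin/kafka-console-producer.sh --bootstrap-server 127.0.0.1:9092 --topic aep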
Next, go back to your demo website and refresh the page. You should now see an experience event on your profile, under Experience Events, just like the one below:
- Event Type Label: Call Center Interactions
- Event Type Filter: callCenterInteractionKafka
- Title: --aepTenantId--.interactionDetails.core.callCenterAgent.callID
You have finished this exercise.
Next Step: Summary and benefits